Dystopian or useful? Amazon Ring doorbells will now be able to recognize your guests through a new AI facial recognition feature, the company said Tuesday. The controversial feature, dubbed “Familiar Faces,” was announced earlier this September and is now available to Ring device owners in the United States.
Amazon says the feature lets your doorbell recognize the people who regularly come to your door by creating a list of up to 50 faces. These can include family members, friends and neighbors, delivery drivers, domestic staff and more. After you tag someone in the Ring app, the device will recognize them as they approach the Ring camera.
Then, instead of alerting you that “a person is at your door,” you’ll get a personalized notification, like “Mom at the front door,” the company explains in its launch announcement.
The feature has already received pushback from consumer protection organizations, such as the EFF, as well as from a US senator.
Amazon Ring owners can use the feature to cut down on notifications they don't want to see, such as alerts triggered by their own comings and goings, the company says. And they can set these notifications on a per-person basis.
The feature is not enabled by default. Instead, users will have to enable it in their app settings.
Faces can be named in the app directly from the Event History section or from the new Familiar Faces library. Once tagged, a person will be named in all notifications, the app's timeline, and Event History. Tags can be edited at any time, and there are tools to merge duplicates or delete faces.
Amazon claims that facial data is encrypted and never shared with others. Additionally, it says that unnamed faces are automatically removed after 30 days.
Privacy concerns about AI facial recognition
Despite Amazon’s privacy assurances, the addition of the feature raises concerns.
The company has a history of working with law enforcement and once gave police and fire departments the ability to request people’s doorbell video from the Ring Neighbors app by asking Amazon directly. More recently, Amazon partnered with Flock, the maker of AI surveillance cameras used by police, federal law enforcement and ICE.
Ring’s own security efforts have failed in the past.
Ring had to pay a $5.8 million fine in 2023 after the US Federal Trade Commission found that Ring employees and contractors had broad and unrestricted access to customer videos for years. The Neighbors app also exposed users’ home addresses and exact locations, and users’ Ring passwords have been circulating on the dark web for years.
Given Amazon’s willingness to cooperate with law enforcement agencies and digital surveillance providers, combined with its poor security record, we recommend that Ring owners, at the very least, be cautious about tagging anyone with their real name. Better yet, keep the feature off and just check the camera yourself to see who’s there. Not everything needs an AI upgrade.
As a result of privacy concerns, Amazon’s Ring has already faced calls from U.S. Sen. Ed Markey (D-Mass.) to abandon the feature, and is facing backlash from consumer protection groups such as the EFF. Privacy laws prevent Amazon from releasing the feature in Illinois, Texas, and Portland, Oregon, the EFF has also noted.
In response to questions posed by the organization, Amazon said users’ biometric data would be processed in the cloud and claimed it does not use the data to train artificial intelligence models. It also claimed that it is not technically capable of identifying all the locations where a particular person has appeared, even if law enforcement requested that data.
However, it’s unclear why that wouldn’t be possible, given the feature’s similarity to “Search Party,” which taps into a neighborhood’s network of Ring cameras to find lost dogs and cats.
EFF Staff Attorney F. Mario Trujillo said, “Knocking on a door, or even walking in front of it, shouldn’t require giving up your privacy. With this feature in place, it’s more important than ever for government privacy regulators to investigate, protect people’s privacy, and enforce biometric privacy laws.”
Updated after publication with EFF comment.
