Amazon Ring Doorbell Camera to Build Watchlist of “Suspicious” Neighbors for Police
Amazon is working with law enforcement on a system that will identify people who are considered “suspicious.”
(TMU) — Ring, the Amazon-owned home security company, is planning to build a database of neighborhood watchlists using facial recognition technology.
Documents obtained by the Intercept revealed that the company is working with law enforcement on a system that will identify people who are considered “suspicious” and let Ring owners know when these individuals are near their homes, using facial recognition software built into the security system’s cameras.
The software will also give the Ring owner the ability to notify police or call in the suspicious activity on their own.
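As described, the flow would run roughly as follows: the camera recognizes a face, checks it against a shared watchlist, pushes an alert to the owner, and lets the owner escalate the report to police. The sketch below is a speculative outline of that decision flow; every name and value in it is an assumption made for illustration, not something taken from Ring’s software or the leaked documents.

```python
# Speculative outline of the alert flow described above. Every name and value
# here is an illustrative assumption; none of it comes from Ring's actual code.

from dataclasses import dataclass
from typing import Optional

@dataclass
class WatchlistHit:
    person_id: str      # identifier on a shared neighborhood watchlist
    confidence: float   # how strongly the camera's face match scored

def handle_camera_event(hit: Optional[WatchlistHit], owner_escalates: bool) -> str:
    """Decide what happens after a captured face is checked against the watchlist."""
    if hit is None:
        return "no action"  # face not on any watchlist
    alert = f"notify owner: watchlist match {hit.person_id} ({hit.confidence:.2f})"
    if owner_escalates:
        return alert + "; report forwarded to police"
    return alert

# Example: the owner chooses to forward a high-confidence match to police.
print(handle_camera_event(WatchlistHit("entry-007", 0.91), owner_escalates=True))
```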
According to the documents, the watchlists would be connected to Ring’s Neighbors app, where owners of the system communicate with their neighbors about packages being stolen from doorsteps and other potential security breaches. While this may sound innocent, or even helpful, critics worry that the technology could empower the kind of neighborhood snitches who call the cops on anyone they find “suspicious,” typically based on their own prejudices.
In fact, a Ring employee, speaking to the Intercept under the condition of anonymity, admitted that “all it is is people reporting people in hoodies.”
While these plans are explicit in the documents, Ring spokesperson Yassi Shahmiri insisted that “the features described are not in development or in use and Ring does not use facial recognition technology.”
However, in response to a formal inquiry from Massachusetts Senator Edward Markey, Amazon was later forced to admit that the facial recognition system is currently a “contemplated but unreleased feature” for Ring.
Mohammad Tajsar, an attorney with the American Civil Liberties Union of Southern California, said that “‘watchlisting’ capabilities on Ring devices encourages the creation of a digital redline in local neighborhoods, where cops in tandem with skeptical homeowners let machines create lists of undesirables unworthy of entrance into well-to-do areas.”
Many of the questionable features proposed in the documents involve the identification of “suspicious” individuals, but the standards used to determine who is and is not suspicious are unclear. However, if artificial intelligence is applied to information crowdsourced from neighbors, there is a high likelihood that the inherent biases of both the algorithm and the neighborhood busybodies feeding it will compound into an overall bias in the system.
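As a toy illustration, and not something drawn from the documents, the sketch below shows how a skew in who gets reported flows straight into a learned “suspicion” score: two groups behave identically, but one is reported three times as often, and a score fit on the reports simply reproduces that gap. The group names and reporting rates are invented for the example.

```python
# Toy illustration (not from the article) of reporting bias feeding a learned
# "suspicion" score: identical behavior, unequal report rates, unequal scores.

import random

random.seed(42)
EVENTS_PER_GROUP = 10_000                           # identical underlying behavior
REPORT_RATE = {"group_a": 0.15, "group_b": 0.05}    # assumed neighbor reporting bias

reports = {
    group: sum(random.random() < rate for _ in range(EVENTS_PER_GROUP))
    for group, rate in REPORT_RATE.items()
}

# A naive score learned from reports per event inherits the 3x reporting bias.
suspicion = {group: count / EVENTS_PER_GROUP for group, count in reports.items()}
print(suspicion)   # roughly {'group_a': 0.15, 'group_b': 0.05}
```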
An article published last year in Nature explores the ethical framework behind technology like self-driving cars. The article notes that the ethics of self-driving cars are often framed around the trolley problem, an ethical lifeboat scenario that would be extremely unlikely in the real world. According to that framing, the lives of old people are valued less than those of younger generations, and the life of an athlete is likewise valued more than that of a “large” woman or a homeless person. Other studies have shown racial and gender bias “accidentally” coded into facial recognition systems.
A self-driving car has a choice about who dies in a fatal crash. Here are the ethical considerations https://t.co/ZcEgDQfxhh #automation pic.twitter.com/XzLWQWDzcr
— World Economic Forum (@wef) November 3, 2018
The documents went into detail about a variety of features currently being “contemplated” by the company, including phone notifications about “suspicious” individuals who may have been spotted in the area, which would even allow the Ring user to alert their neighbors. Another feature identified in the documents is something called “proactive suspect matching,” and while the documents are unclear about how this would function, it appears to be a program that would cross-reference the faces of people walking past a Ring user’s house against a police database of potential suspects.
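The documents do not explain the mechanics, but in principle this kind of matching amounts to comparing a face embedding from a camera frame against a stored list of embeddings and flagging anything above a similarity threshold. The sketch below illustrates that general technique; the function names, the 128-dimensional embeddings, and the 0.6 threshold are all assumptions for the example, not details from Ring or any police system.

```python
# Hypothetical sketch of face-embedding matching against a watchlist. The names,
# embedding size, and 0.6 threshold are illustrative assumptions only.

import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face-embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_against_watchlist(face_embedding: np.ndarray,
                            watchlist: dict[str, np.ndarray],
                            threshold: float = 0.6) -> list[str]:
    """Return the watchlist entries whose stored embedding resembles the
    captured face closely enough to trigger an alert."""
    return [
        entry_id
        for entry_id, stored in watchlist.items()
        if cosine_similarity(face_embedding, stored) >= threshold
    ]

# Example: a captured frame closely resembles one of two stored entries.
rng = np.random.default_rng(0)
watchlist = {"entry-001": rng.normal(size=128), "entry-002": rng.normal(size=128)}
captured = watchlist["entry-001"] + rng.normal(scale=0.05, size=128)  # near-duplicate
print(match_against_watchlist(captured, watchlist))  # ['entry-001'] on this seed
```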
“Ring appears to be contemplating a future where police departments can commandeer the technology of private consumers to match ‘suspect’ profiles of individuals captured by private cameras with those cops have identified as suspect—in fact, exponentially expanding their surveillance capabilities without spending a dime,” Tajsar said.
These features are not unprecedented for Ring or Amazon. Earlier this year, Motherboard reported that Ring was encouraging its users to snitch on their neighbors in exchange for discounts and free products.
Stop letting Ring and Amazon dunk on your privacy.
Amazon’s Ring has set up more than 500 partnerships with law enforcement agencies to convince communities to spy on themselves through doorbell cameras and its social app, Neighbors. The company is moving recklessly fast with little regard for the long-term risks of this mass surveillance technology. Join EFF in challenging Ring spokesman Shaq to rethink Ring’s privacy-invasive partnerships with law enforcement. #NothingButDragnet https://eff.org/ring
Posted by Electronic Frontier Foundation (EFF) on Wednesday, November 20, 2019
By John Vibes | Creative Commons | TheMindUnleashed.com