Ring, Amazon's home surveillance camera division, has crafted plans to use facial recognition software and its ever-expanding network of home security cameras to create AI-enabled neighborhood "watch lists," according to internal documents reviewed by The Intercept.
According to the documents, the watch lists would be connected to Ring's Neighbors app, where owners of the system communicate with their neighbors about packages stolen from doorsteps and other potential security breaches. While this may sound innocent, or even helpful, critics worry that the technology could empower the kind of neighborhood snitches who call the cops on anyone they find "suspicious," typically based on their own prejudices.
It’s unclear who would have access to these neighborhood watch lists, or how exactly they would be compiled, but the documents refer repeatedly to law enforcement, and Ring has forged partnerships with police departments.
A Ring insider, speaking on condition of anonymity, was quoted as saying:
"all it is is people reporting people in hoodies."
Many of the questionable features proposed in the documents involve identifying "suspicious" individuals, but the standards used to determine who counts as suspicious are unclear. If artificial intelligence is combined with information crowdsourced from neighbors, the biases of both the algorithm and the neighborhood busybodies feeding it are likely to compound into an overall bias in the system.
Mohammad Tajsar, an attorney with the American Civil Liberties Union of Southern California, said that:
“‘watchlisting’ capabilities on Ring devices encourages the creation of a digital redline in local neighborhoods, where cops in tandem with skeptical homeowners let machines create lists of undesirables unworthy of entrance into well-to-do areas.”
Safety in all neighborhoods is, of course, a good thing. But ending up on "the list" because someone did not like the look of you is not. Try telling an AI system that it has made a mistake.
Photo by Matthew Henry