AMAZON RING Doorbell To Build Database Using FACIAL Recognition Technology – More Surveillance!

Ring, Amazon’s surveillance camera division, has crafted plans to use facial recognition software and its ever-expanding network of home security cameras to create AI-enabled neighborhood “watch lists,” according to internal documents reviewed by The Intercept.

According to the documents, the watch lists would be connected to Ring’s Neighbors app, where owners of the system communicate with their neighbors about packages being stolen from doorsteps and other potential security breaches. While this may sound innocent, or even helpful, critics worry that the technology could empower the kind of neighborhood snitches who call the cops on anyone they find “suspicious,” typically based on their own prejudices.

It’s unclear who would have access to these neighborhood watch lists, or how exactly they would be compiled, but the documents refer repeatedly to law enforcement, and Ring has forged partnerships with police departments.

A Ring insider, quoted on condition of anonymity, said:

“all it is is people reporting people in hoodies.”

Many of the questionable features proposed in the documents involve the identification of “suspicious” individuals, but the standards used to determine who is and is not suspicious are unclear. If artificial intelligence is combined with information crowdsourced from neighbors, there is a high likelihood that the biases of both the algorithm and the neighborhood busybodies feeding it will be built into the resulting system.

Mohammad Tajsar, an attorney with the American Civil Liberties Union of Southern California, said:

“‘watchlisting’ capabilities on Ring devices encourages the creation of a digital redline in local neighborhoods, where cops in tandem with skeptical homeowners let machines create lists of undesirables unworthy of entrance into well-to-do areas.”

Safety in all neighborhoods is, of course, a good thing. However, if you are unfortunate enough to end up on “the list” because someone did not like the look of you, that is not a good thing. Try telling an A.I. system that it has made a mistake.

Photo by Matthew Henry
