Errattic



This site does not claim credit for images, videos, or music, except where noted. Let us know if we have used your media and would like it removed.


©2019 Errattic.com

Restricted to Adults



CHINA HAS CREATED A RACIST A.I. TO TRACK MUSLIMS 

 

The Chinese government is using facial-recognition software to “track and control” a predominantly Muslim minority group, according to a disturbing new report from The New York Times. The government has reportedly integrated artificial intelligence into its security cameras to identify Uighurs, and it appears to be using that information to monitor the persecuted group. The report, based on the accounts of whistleblowers familiar with the systems and a review of databases used by the government and law enforcement, suggests the authoritarian country has opened up a new frontier in the use of A.I. for racist social control—and raises the discomfiting possibility that other governments could adopt similar practices.

Two people familiar with the matter told the Times that police in the Chinese city of Sanmenxia screened whether residents were Uighurs 500,000 times in a single month. Documents provided to the paper reportedly show that demand for the technology is ballooning: more than 20 departments in 16 provinces sought access to the camera system, in one case specifying that it “should support facial recognition to identify Uighur/non-Uighur attributes.” This, experts say, is more than enough to raise red flags. “I don’t think it’s overblown to treat this as an existential threat to democracy,” Jonathan Frankle, an A.I. researcher at the Massachusetts Institute of Technology, told the Times. “Once a country adopts a model in this heavy authoritarian mode, it’s using data to enforce thought and rules in a much more deep-seated fashion than might have been achievable 70 years ago in the Soviet Union. To that extent, this is an urgent crisis we are slowly sleepwalking our way into.”

Vanity Fair

Tags: All Rights, Discrimination, Environment, Profiling, Safety, Security, Tech, World

Filed under: Gay+

15-Apr-2019

