The Uighur Muslims of China Are Living in a Police State – and We're Next
Facial Recognition Software (FRS), a biometric application that can uniquely identify a person by analyzing the patterns of that person's facial contours, is becoming both more advanced and more widespread. It was initially intended as a means of identifying criminals and serving other security purposes, but people nowadays use it for simple conveniences: unlocking their phones, skipping lines at airports, and much more. However, we may be growing too comfortable with FRS, and without restrictions it could put our right to privacy in jeopardy.
On the other side of the world, China has developed its facial recognition systems to such a degree that they have become the world's most advanced. At the same time, however, it has used them to police and intimidate its citizens, exerting nearly full control over their private lives. Uighur Muslims in particular are heavily discriminated against by these new, advanced security systems. When Uighur Muslims want to visit their local mosques, they must register with proper identification and pray under heavy surveillance. According to one source, an entire area of a town dominated by Muslims was demolished and rebuilt in a fashion that would allow for more pervasive surveillance.
Clearly, China has chosen its path with these more advanced security systems: to target and harass its citizens. Uighur Muslims today live in constant fear that they may be discriminated against, or accused of crimes they never committed, by police, the Chinese government, or even their next-door neighbors. Americans may think, "That would never happen to us. We live in a free country." They are mistaken.
Many are unaware of it, but there are multiple public databases in the West that contain millions of pictures of people's faces, collected without their consent. One of the largest, MS Celeb, was developed by none other than Microsoft and contains over ten million unique faces. Another, called Brainwash, contained over 10,000 pictures of people at a San Francisco cafe and was released to the public by researchers at Stanford. The researchers behind the database gave no indication that the people in the cafe had consented to having their images used for research. That may not seem very concerning, but it gets worse: according to research papers, the database was given to Chinese AI companies.
If one instance of a foreign country using our faces to further its invasive security measures were not unsettling enough, consider that MS Celeb was also released internationally and used by foreign governments and militaries. In addition, America's own police databases feature the faces of nearly half of all Americans and are being used to discriminate against the innocent. One man accused of stealing socks from Target was denied due process because authorities used FRS to single him out, even though the software returned over 200 other matches. In another case, the NYPD searched for a thief using FRS but could not produce results; detectives instead ran searches for people who looked like Woody Harrelson, because they thought the thief resembled him. American facial recognition systems also discriminate against Black people: the algorithms misidentify them more often, in effect treating Black faces as more alike than white faces. All of these factors can lead, and have led, to wrongful convictions, and they violate some of our most fundamental rights: privacy, freedom from unreasonable search, and due process.
Make no mistake: facial recognition software is an impressive testament to society's technological advancement and, as a tool, can make our lives easier as well as help single out criminals. However, we as Americans should be wary of authorities wielding its power unchecked. They can easily abuse it to discriminate against whomever they want, even if their targets are innocent. If the government does not impose restrictions on its FRS security systems, then we will one day find ourselves in nearly the same predicament as the Uighur Muslims.