Practice Op-Ed – Christopharo

AI: Artificial Intelligence or Abrasive Invasion?

Facial recognition software is unlocking our phones, using our faces without our consent, improving security, predicting whether we will commit a crime, finding missing persons, and assuming our sexual orientation. If you have nothing to hide, then facial recognition may seem convenient, but it is something we should all fear. It is a technology that grows faster every day and exploits our privacy more with every passing second. Facial recognition software is an intrusive technology that disregards our human rights and can be used to perpetuate bias.

Companies and “researchers” are gathering large stockpiles of people’s faces without their knowledge or consent. Images are being accumulated from your phone, your social media, and your favorite coffee shop’s security cameras. The supposed reason behind these large face datasets is to improve facial recognition systems. However, companies are not keeping your face to themselves; they are sharing and selling your likeness to other private companies and even governments. Images of real people’s faces are then being misused around the world to build morally questionable technology. These practices are intrusive, and these companies have no respect for your privacy or mine.

China is working on predictive analytics to help authorities stop suspects before a crime is committed. Predictive intelligence will notify police of potential criminals based solely on behavioral patterns. Imagine going to the store to buy a hammer to replace air-conditioning ducts; nothing very suspicious. But then you remember you also need duct tape, trash bags to clean up the mess, and rope to hold things in place temporarily. When you get home, the police are already there waiting to ask you some pointed questions about your choice of purchases. Not only is this an invasion of privacy, it is also unlawful and unfounded.

Facial recognition can be used to assume your sexual orientation, and such AI algorithms will supposedly soon be able to measure people’s intelligence and political positioning from their facial images alone. Technology startups are even boasting about their claims that they can profile people’s character from their facial images. The claim that sexual orientation tends to go along with a characteristic facial structure is a horrific thing to celebrate. Even if these claims were true, using facial recognition software to determine sexual orientation is highly unethical.
