Practice Op-ed – Lelebxby

Our justice system and other authorities claim facial recognition is the best way to protect and serve the community, insisting that with this software one can precisely and accurately pinpoint the true culprit in a police line-up. In actuality, however, the technology unjustifiably stereotypes and misidentifies innocent people based on little more than coincidence and misguided information.

At Stanford University, researcher Michal Kosinski created an A.I. that claims to identify sexual orientation. He began the project after hearing that companies abroad had started facial recognition programs of their own to help their governments predict who might commit a crime next. After being fed dozens of images of heterosexual and homosexual individuals, the software was said to have “…81 percent accuracy…” Facial recognition like Kosinski’s exposes the privacy of one’s sexuality, regardless of whether someone is part of the LGBT community or not. This “passion project” of Kosinski’s is only the tip of the iceberg when it comes to using such dubious software for the wrong reasons.

Sexual orientation runs much deeper than one’s physical appearance. The slant of one’s eyes or whether someone is wearing makeup has no say in the matter. One’s sexuality is theirs and theirs alone and shouldn’t be in danger of being exposed by an A.I. Beyond sexuality, facial recognition shouldn’t be playing god or stereotyping anything about a person, especially how likely they are to commit a crime or who they happen to be attracted to.


1 Response to Practice Op-ed – Lelebxby

  1. davidbdale says:

    I wish we’d had time to more fully develop this draft into a finished product, Lele. It gets a good start on an important topic and argues its point with passion (but without enough evidence to fully distinguish it).

