Is your face gay? AI researchers are asking the wrong questions

There are such things as bad questions. These days, AI researchers are asking troubling questions of technology, attempting to use it to link facial attributes to fundamental character. While they are not saying “yippee, look what AI can do” (in fact, some say they are trying to highlight risks), by publishing in scientific journals they risk lending credibility to the very idea of using AI in these problematic, physiognomic ways. Recent research has tried to show that political leaning, sexuality, and criminality can be inferred from pictures of people’s faces.

Political orientation. In a 2021 article in Nature’s Scientific Reports, Stanford researcher Michal Kosinski found that, using open-source code and publicly available facial-image data, facial recognition technology could judge a person’s political orientation accurately 68 percent of the time, even when controlling for demographic factors. In this research, the primary algorithm learned the average face for conservatives and for liberals, then predicted the political leanings of unknown faces by comparing them to those reference images. Kosinski wrote that his findings about AI’s abilities have grave implications: “The privacy threats posed by facial recognition technology are, in many ways, unprecedented.”
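
For readers curious about the mechanics, the comparison step described above amounts to what machine-learning practitioners call a nearest-centroid classifier. The sketch below illustrates that general idea only, assuming face photos have already been reduced to fixed-length embedding vectors by some off-the-shelf face-recognition model; the function names, variables, and random stand-in data are all hypothetical, not Kosinski’s actual code.

```python
# Nearest-centroid sketch: average each group's embeddings into a
# "reference face," then label new faces by whichever reference is closest.
import numpy as np

def fit_centroids(embeddings, labels):
    """Average the embedding vectors for each group into a reference face."""
    labels = np.asarray(labels)
    return {label: embeddings[labels == label].mean(axis=0)
            for label in np.unique(labels)}

def predict(embedding, centroids):
    """Pick the group whose reference face is most similar (cosine similarity)."""
    def cosine(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))
    return max(centroids, key=lambda label: cosine(embedding, centroids[label]))

# Hypothetical usage with random 128-dimensional stand-in embeddings.
rng = np.random.default_rng(0)
train_embeddings = rng.normal(size=(100, 128))
train_labels = ["conservative"] * 50 + ["liberal"] * 50
centroids = fit_centroids(train_embeddings, train_labels)
print(predict(rng.normal(size=128), centroids))
```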

While the questions behind this line of inquiry may not immediately trigger alarm, the underlying premise still fits squarely within physiognomy, the discredited practice of predicting personality traits from facial features.

Sexuality. In 2017, Kosinski published another work showing that a neural network trained on facial images could be used to distinguish between gay and straight people. Surprisingly, the experiments using the system showed an accuracy of 81 percent for men and 74 percent for women. Based on these results, the machine learning model performed well, but what was the value of the question the study asked? Often, inferring people’s sexual orientation is used for purposes of discrimination or criminal prosecution. In fact, there are still more than 50 countries with laws against same-sex sexual acts on the books, with punishments ranging from imprisonment to death. In applications like these, improving the accuracy of AI tools only magnifies the likely harm. And that is to say nothing of the instances where the tool’s predictions are incorrect. According to the BBC, organizations representing the LGBTQ community were strongly critical of the research by Kosinski and his colleague, calling it “junk science.” As with the study on political leanings, the authors argued that their work highlights the risk that facial recognition technology might be misused.
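
For comparison, accounts of the 2017 study describe a common two-stage pipeline: a deep network reduces each photo to a numeric feature vector, and a simple classifier is then trained on those vectors. The sketch below illustrates only that generic pipeline, using scikit-learn and random stand-in data; it is not the study’s model, and on random noise the printed accuracy should sit near chance.

```python
# Generic embeddings-plus-classifier sketch. The vectors and labels are
# random stand-ins, so nothing here reproduces the study's data or results.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
embeddings = rng.normal(size=(500, 128))   # hypothetical face embeddings
labels = rng.integers(0, 2, size=500)      # hypothetical binary labels

X_train, X_test, y_train, y_test = train_test_split(
    embeddings, labels, test_size=0.2, random_state=0)

classifier = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"Held-out accuracy: {classifier.score(X_test, y_test):.2f}")
```

A headline number like 81 percent also depends on how the test set is composed; on a balanced sample like the random one above, a coin flip already scores about 50 percent.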
