Why do robots always turn out sexist?

In the case of TayAI, it appears that the developers’ limited life experiences didn’t prepare them for the expectation of abuse. Most women who have any form of online presence, and particularly those who work in tech, are constantly reminded of the trolls among us. Hate speech surfaces in a range of fora, but it is particularly dangerous in interactive technology that is self-teaching—like AI.

When Microsoft’s Cortana is sexually harassed, she “fights back,” according to Microsoft. Of course, this is only because developers programmed “her” to respond to “particularly assholish” comments with negative or even angry replies. (Should we be glad that, for once, a virtual woman is taking the abuse instead of a real one?)

We can do better than merely coding for an expectation of abuse. It starts with recognizing the ways we are building artificial intelligence that carries the handicaps of human misogyny. We are creating a world where Trump’s election is, like our AI bots, a product of crowdsourced inputs shaped by today’s fears. Technologists are concerned that Trump’s win gives license to behavior that will stymie our future interactions with technology, and with each other.