AI "girlfriends" called a threat to women's rights


A few months ago, we discussed a female online “influencer” (I remain unsure how that ever became a job) who created an Artificial Intelligence clone of herself and advertised it as “your new virtual girlfriend.” That just sounded kind of sad to me, and potentially harmful. But she claimed that people were signing up for her service by the hundreds of thousands. Are there really that many men out there who want to get into a relationship with a chatbot? There must be, because the influencer in question soon had plenty of competition springing up. New chatbot apps with names like Replika, Character.AI, and Soulmate are out there offering an experience where users can “customize everything about their virtual partners, from appearance and personality to sexual desires.” (Yikes!) But now, according to the Economic Times, ethicists and women’s rights activists are warning that these sorts of “relationships” can pose threats to actual women, including undermining women’s rights.


Some AI ethicists and women’s rights activists say developing one-sided relationships in this way could unwittingly reinforce controlling and abusive behaviours against women, since AI bots function by feeding off the user’s imagination and instructions.

“Many of the personas are customisable … for example, you can customise them to be more submissive or more compliant,” said Shannon Vallor, a professor in AI ethics at the University of Edinburgh. “And it’s arguably an invitation to abuse in those cases,” she told the Thomson Reuters Foundation, adding that AI companions can amplify harmful stereotypes and biases against women and girls.

Generative AI has attracted a frenzy of consumer and investor interest due to its ability to foster humanlike interactions. Global funding in the AI companion industry hit a record $299 million in 2022, a significant jump from $7 million in 2021, according to June research by data firm CB Insights.

Allow me to offer one insight right off the top of my head. If your boyfriend is dating a chatbot, perhaps it was already time to start looking for a new boyfriend anyway. Just saying.

We’re apparently supposed to describe these online dating bots as “AI companions” now. I’m still amazed at the speed with which AI is advancing, spreading faster than something you used to worry about catching from a stripper. There was basically nothing even close to this only a few years ago, and now the “AI companion industry” is pulling in almost $300 million per year. It’s just difficult to conceive of this happening. Of course, my wife and I met in a barn at a dog shelter back before the first America Online CDs arrived in the mail, when other human beings were really your only dating options.


So if I’m following this correctly, the chief concern here is that men who get into “relationships” with these chatbots might develop unhealthy, aggressive, or even violent impulses toward their AI “partners,” and those impulses might then carry over into real life. Now, I’m no psychiatrist, but if those impulses were already lurking somewhere in a man’s personality, I’m guessing his real-life girlfriend was going to find that out sooner or later.

But one analyst of gender-based violence quoted in the article seems to see the same risk I speculated about above. This is from Hera Hussain of Chayn.

“Instead of helping people with their social skills, these sort of avenues are just making things worse,” she said.

“They’re seeking companionship which is one-dimensional. So if someone is already likely to be abusive, and they have a space to be even more abusive, then you’re reinforcing those behaviours and it may escalate.”

I still can’t make up my mind. I continue to have the same reservations I expressed when we first discussed this issue. I suppose people are free to “date” whomever they wish, provided no actual human beings are being harmed. But the more people retreat into a fantasy world to avoid the complexities of real relationships, the more people will wind up alone. Everything about this “AI companion” system seems unhealthy and culturally counterproductive for our society.
