So, are you feeling lonely? According to recent polling, nearly one in five Americans are. Or perhaps you’re just having trouble finding a date, though in this instance we’d only be talking about guys and lesbians. Well, fear not. Another social media “influencer” I’d never heard of named Caryn Marjorie is ready to ride to the rescue for you. She created an Artificial Intelligence clone of herself and put it online. Now you’ll always have a pretty girl to talk to who will be willing to listen to your problems and lift your spirits. And, amazingly, millions of people have signed up. (NBC News)
Caryn Marjorie wanted to talk to as many of her followers as she could — so she made an AI clone of herself.
The Snapchat influencer, who has 1.8 million subscribers, launched an AI-powered, voice-based chatbot that she hopes will “cure loneliness.”
Called CarynAI, the chatbot is described on its website as a “virtual girlfriend.” It allows Marjorie’s fans to “enjoy private, personalized conversations” with an AI version of the influencer, the chatbot’s website states.
Why did she do this? I’ll let her explain it herself in her own words. This is from her Twitter account.
CarynAI is the first step in the right direction to cure loneliness. Men are told to suppress their emotions, hide their masculinity, and to not talk about issues they are having. I vow to fix this with CarynAI. I have worked with the world’s leading psychologists to seamlessly…
— Caryn Marjorie (@cutiecaryn) May 12, 2023
I was going to go test this out for our readers and share the results, but I’m not going to enter the amount of personal data required to gain access or pay whatever she might be charging. (I wasn’t able to confirm if there was a fee.)
There weren’t many details provided in the FAQ section of the clone’s website. But in case you think I’m painting an unfairly salacious picture of this project, it says, “Your Virtual Girlfriend” right at the top of the page. So what are we to assume that means exactly? There’s a difference between “curing loneliness” and some of the other activities people might expect to engage in with their “girlfriend.” (A request for comment was not immediately returned today.)
How “real” would this “girlfriend” experience be? When a Wall Street Journal columnist recently created a clone of herself, the result was pretty good and even seemed realistic enough to fool someone for a short while. But any conversation that became complex quickly made the fakery fairly obvious.
On the one hand, this looks like one of the least dangerous applications of AI to show up recently, at least in terms of robots taking over the world. But at the same time, perhaps it’s just me, but this looks kind of sad and potentially dangerous in another way. Are people going to get so caught up in a “relationship” with this clone that they begin cutting themselves off from the real world? And when the realization sets in that the relationship is never going to be “consummated” (to put it politely), might that person’s loneliness and even depression grow worse?
The real world can be frightening, particularly these days. And not all relationships work out and result in a long and happy marriage and family. But I’m still enough of a sap to believe that there is someone out there for everyone. However, you’re never going to find them if you don’t leave the house. (Or at least seek out real people online initially instead of a chatbot.) If too many people begin resorting to something like this, AI may still bring down civilization without needing to build a single robot.