We were warned that this technology was on the way several years ago, but now it has arrived and it sounds every bit as creepy as I initially imagined it to be. There are now more than half a dozen tech companies offering people the opportunity to basically "upload their consciousness" into an Artificial Intelligence algorithm before they die so that their loved ones can continue their relationship with them. These programs are being depicted as "griefbots" or the even more sinister-sounding "deathbots." That's because the programs are billed as a way to help people cope with the loss of a loved one and work through the grieving process. But do they really do that? Or is this going to turn into yet another form of digital addiction that prevents people from moving on with their lives and engaging with the real world? (Associated Press)
When Michael Bommer found out that he was terminally ill with colon cancer, he spent a lot of time with his wife, Anett, talking about what would happen after his death.
She told him one of the things she’d miss most is being able to ask him questions whenever she wants because he is so well read and always shares his wisdom, Bommer recalled during a recent interview with The Associated Press at his home in a leafy Berlin suburb.
That conversation sparked an idea for Bommer: recreate his voice using artificial intelligence so that it could survive him after his death.
Mr. Bommer worked with a company called Eternos. Together, they created “a comprehensive, interactive AI version” of the client. Mrs. Bommer will now be able to have conversations with her deceased husband and ask him questions. The bot will answer in his voice, crafting responses drawn from the material he uploaded. The CEO of Eternos said that she will be able to “engage with his life experiences and insights.”
However, based on my admittedly limited understanding of AI, that's not really true at all. The bot will only ever be able to respond using snippets that already exist in the Michael Bommer data library. It will never be able to consider profound questions that the couple hadn't previously discussed and offer an original answer. What if she asks, "Are you okay? What's it like on the other side?" The bot won't be able to answer because Michael Bommer has no idea what his world will be like after he dies. The AI will probably have to draw on other datasets from philosophers and religious scholars and just make something up. That seems like cheating.
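To make the objection concrete, here is a toy sketch of how a retrieval-style bot of this sort might work. This is purely my own illustration, not Eternos's actual system: it assumes replies can only be matched from a fixed library of things the person actually said, which is the limitation described above.

```python
def best_match(question, library):
    """Return the stored reply whose prompt shares the most words with
    the question, or None when nothing in the library overlaps at all."""
    q_words = set(question.lower().split())
    best, best_score = None, 0
    for prompt, reply in library.items():
        # Crude similarity: count shared words between question and prompt.
        score = len(q_words & set(prompt.lower().split()))
        if score > best_score:
            best, best_score = reply, score
    return best

# Hypothetical snippets "uploaded" while the person was alive.
library = {
    "what book should I read next": "Try the Feynman lectures again.",
    "how do I fix the garden fence": "Brace the post before you nail it.",
}

print(best_match("which book should I read", library))  # a stored answer
print(best_match("are you okay", library))              # None: never discussed
```

The second query illustrates the point: a question the person never addressed in life has no match in the library, so the system either stays silent or, in a real product, falls back on generic material that was never his.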
If you want to check out an even creepier offering, there is a company out there called Seance AI. They offer "fictionalized seances" with disembodied spirits. For an extra ten dollars, you can upload recordings of your friends' and relatives' voices for the bot to imitate. The company claims that they offer users the opportunity to "commune with lost friends and family via AI." But that's even more nonsensical. You're not talking to the dead. You're talking to a computer algorithm that has been programmed to deceive you.
There are deeper problems with this technology in my opinion. As I suggested above, grieving people can find themselves in very vulnerable positions. Tools like these only serve to keep the memories of interacting with the departed fresh and sharp rather than letting them slowly fade, as normally happens during the grieving process. Being able to summon up the dead and interact with them whenever you want will only delay that process and could become a form of addiction. I'm not a psychotherapist, but I would be willing to wager that if you asked one, they would share similar concerns. While considering this article, I described the process to my wife and asked her if she would be interested in having me undertake the procedure. She was simply horrified at the idea. But then again, she's always been a sensible woman. (With the possible exception of marrying me, of course.)