I for one do not welcome my new supercute AI buddy-overlords

Unlike Charles Forbin’s supercomputer, it wouldn’t be scary, but rather adorable. You and your AI Buddy would share inside jokes, light teasing, “remember when” stories of things you did in the past, and fantasies or plans for the future. It would be like a best friend who’s always there for you, and always there. And endlessly helpful.


Would people become attached? Probably. …

But. Underneath the cuteness there would be guardrails, and nudges, built in. Ask it sensitive questions and you’ll get carefully filtered answers with just enough of the truth to be plausible, but still misleading. Express the wrong political views and it might act sad, or disappointed. Try to attend a disapproved political event and it might cry, sulk, or even – Tamagotchi-like – “die.” Maybe it would really die, with no reset, after plaintively telling you you were killing it. Maybe eventually you wouldn’t be able to get another if that happened.

It wouldn’t just be trained to emotionally connect with humans, it would be trained to emotionally manipulate humans. And it would have a big database of experience to work from in short order.


