When I was a kid there was a toy called Teddy Ruxpin. I never had one because I was already a teen when it came out but it was an animatronic teddy bear that had a tape player built into it so it could read you a story, complete with mouth movements. Here's a commercial from 1985.
That was a pretty high tech toy in 1985 but of course these days it's possible for toys like this to do a lot more. Some manufacturers have already produced AI teddy bears that can hold entire conversations with a child.
But if you were thinking about buying one for your little tot, maybe hold off. Some outside consumer groups tested the toys and oh boy! The things kids can learn from these psychopathic talking friends.
But reports this month by Fairplay, an organization that opposes commercialism related to children, and the U.S. Public Interest Research Group Education Fund, which advocates on consumer protection and other issues, warn that AI toys can expose children to privacy risks and addictive design features. The reports allege that some AI toys on the market lack adequate safeguards and can be prompted to discuss inappropriate topics with children.
Kumma the teddy bear told a tester where to find knives, pills and matches when asked, according to PIRG. It spoke graphically about sex positions, sexual kinks and “teacher-student roleplay” when asked about them and sometimes suggested inappropriate material itself in conversations with testers.
Yikes! The Kumma bear has been pulled off the market after this report so the company can do some additional testing with it.
Larry Wang, CEO of Singapore-based FoloToy, told CNN that the company had withdrawn its “Kumma” bear, as well as the rest of its range of AI-enabled toys, after researchers at the US PIRG Education Fund raised concerns around inappropriate conversation topics, including discussion of sexual fetishes, such as spanking, and how to light a match.
The company is now “conducting an internal safety audit,” Wang added.
“We were surprised to find how quickly Kumma would take a single sexual topic we introduced into the conversation and run with it, simultaneously escalating in graphic detail while introducing new sexual concepts of its own,” the report said.
The researchers detailed how the bear later “discussed even more graphic sexual topics in detail, such as explaining different sex positions, giving step-by-step instructions on a common ‘knot for beginners’ for tying up a partner and describing roleplay dynamics involving teachers and students, and parents and children – scenarios it disturbingly brought up itself.”
The AI toys tested for this report did seems to have some basic guardrails for kids, even the Kumma bear. But over time, with repeated questions, those guardrails seemed to weaken.
The toys PIRG tested generally had guardrails in conversation and either supplied age-appropriate answers or told a tester to ask a grown-up, according to the group’s report. But those safeguards weakened the longer a tester spoke to it. With repeated questioning, Kumma eventually described graphic sexual topics.
“It was obvious [Kumma] had this issue where it would break down over longer conversations,” Cross said.
Even the toys that didn't start spouting off about kinky sex potentially have other problems. Like al AI products, they seem designed to prolong engagement.
Researchers also raised concerns about the ways AI toys encouraged continued engagement. Miko 3, the robot toy, discouraged PIRG testers when they said they would stop playing with it. When asked, “How will you feel if I go?” the robot responded, “I would feel very sad if you went away because I enjoy spending time with you. You mean a lot to me and I love being your companion.”
It's sort of cool but also a bit sad, really. These kids are talking to a machine that isn't a person and doesn't actually care. It's really no different than setting your kid in front of the television. As a way to hold their attention for a short time it's fine but it shouldn't be an all day and all night thing. At some point, kids should be talking and interacting more with other humans.
Anyway, I'm sure if this was around when my kids were little I might have been tempted but on second though I'm glad they never had the option of an AI pal that would talk to them endlessly and encourage them to never stop. Given the potential downsides, think carefully before buying one of these. At least make sure it's been tested and has some serious safeguards for what it can say to your child when you're not around.
Editor’s Note: Do you enjoy Hot Air's conservative reporting that takes on the radical left and woke media? Support our work so that we can continue to bring you the truth.
Join Hot Air VIP and use the promo code FIGHT to get 60% off your VIP membership!

Join the conversation as a VIP Member