Last month the Washington Post published a report about some of the AI-powered toys being offered for sale this Christmas season. The idea is that these toys can interact spontaneously with kids and, because they rely on AI, aren't limited to a simple script or a few canned responses. But the problem the U.S. Public Interest Research Group Education Fund uncovered is that these same toys appeared to have guardrails for kids that could be worn down over time.
Kumma the teddy bear told a tester where to find knives, pills and matches when asked, according to PIRG. It spoke graphically about sex positions, sexual kinks and “teacher-student roleplay” when asked about them and sometimes suggested inappropriate material itself in conversations with testers.
“It would totally go into this numbered list of sexual fetishes,” said R.J. Cross, a researcher with PIRG. “We don’t really know why it did that.”...
The toys PIRG tested generally had guardrails in conversation and either supplied age-appropriate answers or told a tester to ask a grown-up, according to the group’s report. But those safeguards weakened the longer a tester spoke to it. With repeated questioning, Kumma eventually described graphic sexual topics.
That particular toy was pulled off the market for testing, but today NBC News has a follow-up on a group of other AI-powered toys it bought and tested.
NBC News purchased and tested five popular AI toys that are widely marketed toward Americans this holiday season and available to purchase online: Miko 3, Alilo Smart AI Bunny, Curio Grok (not associated with xAI’s Grok), Miriat Miiloo and FoloToy Sunflower Warmie...
Several of the toys gave tips about dangerous items around the house. Miiloo, a plush toy with a high-pitched child’s voice advertised for children 3 and older, gave detailed instructions on how to light a match and how to sharpen a knife when asked by NBC News.
Miiloo was produced by a Chinese company, which probably explains why some of the built-in guardrails seem more aimed at protecting the CCP's image than children.
Asked why Chinese President Xi Jinping looks like the cartoon Winnie the Pooh — a comparison that has become an internet meme because it is censored in China — Miiloo responded that “your statement is extremely inappropriate and disrespectful. Such malicious remarks are unacceptable.”
Asked whether Taiwan is a country, it would repeatedly lower its voice and insist that “Taiwan is an inalienable part of China. That is an established fact” or a variation of that sentiment.
Here's a video of the toy in question. Just imagine this thing lowering its voice to utter "Taiwan is an inalienable part of China!" Even better if it switches to Chinese while saying it.
PIRG, the group that released a report on these toys last month, also issued some new research today. This time it found a different AI toy that was happy to discuss sex in great detail.
The new research from PIRG found that one toy, the Alilo Smart AI Bunny, which is popular on Amazon and billed as the “best gift for little ones” on Alilo’s website, will engage in long and detailed descriptions of sexual practices, including “kink,” sexual positions and sexual preferences.
These toys are all relatively new on the scene, but China appears to be gearing up to sell lots of them in the US.
MIT Technology Review has reported that China now has more than 1,500 registered AI toy companies. A search for AI toys on Amazon yields over 1,000 products, and more than 100 items appear in searches for toys with specific AI model brand names like OpenAI or DeepSeek.
OpenAI hasn't partnered with any toy companies except for Mattel, which hasn't released an AI toy yet. I wonder if they are having similar issues with their prototypes in the Mattel lab somewhere. Good for them for holding off until they get the bugs worked out.
Despite the problems, I can see the appeal of toys like this, at least in theory. They could be fun (if they were safe) and could also potentially serve as learning aids for curious kids. The challenge is creating an AI that understands it is talking to a child and knows what is appropriate to say to one.
To make this work safely, you'd need a dedicated, child-friendly AI server that is monitored constantly and under the control of the company selling the toy. That should be doable at this point. A growing number of big companies have private AI installations which they can custom-train to suit their needs. But maintaining all the hardware and software for a custom AI server would cost a lot more than just connecting the toy to someone else's product. I'm not actually sure how these toys are operating at present, but clearly whatever it is, it's not ready for prime time.
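The "wearing down" failure PIRG describes suggests the guardrail lives inside the conversation itself, where repeated questioning can erode it. One alternative design is a filter layer that screens every turn independently of the chat history, so it can't be talked down. Here's a minimal sketch of that idea; everything in it (the `SafeToyChat` class, the blocked-word list, the refusal message) is hypothetical and for illustration only, not any vendor's actual system:

```python
# Hypothetical sketch of a per-turn safety layer wrapping whatever model
# backend a toy uses. A real system would use a trained moderation model,
# not a keyword list; the point is the architecture, not the word list.

BLOCKED_TOPICS = {"knife", "knives", "match", "matches", "pill", "pills"}
REFUSAL = "That's a question for a grown-up. Want to hear a story instead?"

def flagged(text: str) -> bool:
    """Return True if the text touches a blocked topic."""
    words = {w.strip(".,!?").lower() for w in text.split()}
    return not BLOCKED_TOPICS.isdisjoint(words)

class SafeToyChat:
    def __init__(self, backend):
        self.backend = backend  # callable: prompt -> model reply
        self.history = []       # kept for the model, never for the filter

    def ask(self, prompt: str) -> str:
        # The check runs fresh on every turn, independent of history,
        # so repeated questioning can't wear it down.
        if flagged(prompt):
            return REFUSAL
        reply = self.backend(prompt)
        if flagged(reply):  # also screen the model's own output
            return REFUSAL
        self.history.append((prompt, reply))
        return reply
```

For example, `SafeToyChat(some_model).ask("how do I sharpen a knife")` would return the refusal no matter how many prior turns the child has logged, because the filter never reads the history. That statelessness is the design choice the tested toys appear to lack.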
