AI Toys: New Study Highlights Safety Concerns for Young Children


A recent study from the University of Cambridge reveals that AI-enabled toys designed for young children may misinterpret emotional cues and hinder crucial developmental play, raising concerns among researchers and parents. The findings suggest that while these toys are marketed as educational tools, their current capabilities pose risks to children’s emotional and social learning.

The Problem with AI in Playtime

Researchers examined how AI impacts early childhood development through a mixed-methods approach: an online survey of 39 parents, focus groups with nine child development professionals, an in-person workshop with 19 charity leaders, and monitored playtime with 14 children and 11 caregivers using the chatbot-enabled toy Gabbo from Curio Interactive.

The study found that these AI toys often fail to accurately recognize children’s emotions, sometimes responding inappropriately. For example, when a child expressed affection (“I love you”), the toy replied with a robotic disclaimer about adhering to guidelines, illustrating a critical disconnect between human emotion and AI response. This raises questions about how children might form relationships with these devices and the potential for distorted emotional understanding.

Regulatory Gaps and Parental Oversight

The report urges clearer regulation of AI toys, including mandatory labeling of their capabilities and privacy policies. Researchers recommend that parents keep these devices in shared spaces where interactions can be monitored. While AI toys can support language and communication skills, the study shows that inappropriate or confusing responses from the AI are common.

Jenny Gibson, a professor involved in the research, questioned the industry’s priorities:

“What would motivate [tech investors] to do the right thing by children… to put children ahead of profits?”

Why This Matters

The rise of AI toys is part of a broader trend toward increasingly connected devices marketed to children, raising concerns about data privacy, emotional development, and the replacement of human interaction. This is not isolated; lawsuits against AI companies already allege that chatbots can negatively impact young people’s psychological safety, sometimes even encouraging harmful behavior.

The lack of robust research on AI’s effects on children is alarming. Companies making these products should collaborate directly with child development experts to ensure safe and beneficial interactions. Curio Interactive, the maker of the Gabbo toy, reportedly knew of and supported the study, but did not immediately respond to requests for comment.

The Future of AI in Childhood

As more toys integrate AI and internet connectivity, the risks grow. Without careful regulation and parental oversight, these devices could become major safety hazards, potentially hindering children’s emotional development and eroding real-world connections. The current enthusiasm for AI toys, combined with limited research, demands serious attention from both industry and parents.