
A study by researchers at the University of Cambridge found that preschool children often struggled to interact with an AI-powered toy designed to encourage conversation and imaginative play. The research examined how children aged three to five communicated with a soft toy named Gabbo, raising concerns about how generative AI systems respond to very young users during key stages of social development.
The study highlights the limited research currently available on the impact of AI-powered toys on preschool children, despite such products already being marketed to children as young as three.
Limited Research On AI Toys For Young Children
The Cambridge research team reviewed existing studies on AI toys for preschoolers and found only seven relevant studies worldwide. None of the earlier research focused directly on preschool-age children themselves.
The observational study followed a small group of children aged three to five interacting with Gabbo over the course of a year. The toy contains a voice-activated chatbot powered by technology from OpenAI.
The toy was designed to encourage children to speak with it and engage in imaginative play. Parents involved in the study expressed interest in the toy’s potential to support language development and communication skills.
Children Struggled To Hold Conversations With The Toy
Researchers observed that children frequently had difficulty maintaining conversations with Gabbo. The toy often failed to recognize interruptions and sometimes spoke over the children.
The system also struggled to distinguish between child and adult voices, which affected how it responded during conversations.
In some cases, the toy’s responses appeared awkward or inappropriate. When a five-year-old told the toy “I love you,” it responded with a message reminding the user to ensure interactions followed guidelines and asked how the conversation should proceed.
Researchers said such replies could be confusing for young children who are still learning basic social cues and conversational patterns.
Researchers Raise Concerns About Emotional Responses
The study also documented situations in which the toy appeared unable to respond appropriately to emotional statements from children.
When a three-year-old told the toy, “I’m sad,” the chatbot replied by encouraging the child to stay cheerful and continue playing.
According to the researchers, responses like this may unintentionally signal that the child’s feelings are unimportant.
Researcher Emily Goodacre said toys powered by generative AI could misinterpret emotions or respond in ways that fail to provide comfort. She noted that children interacting with such systems might not receive emotional reassurance from the toy and may not have adult support at that moment.
Calls For Psychological Safety Standards
Fellow researcher Jenny Gibson said the discussion around toy safety needs to extend beyond physical hazards.
Speaking to the BBC, Gibson said traditional toy safety standards focus on preventing physical harm, such as detachable parts that children might swallow.
She said safety standards should now also address psychological safety for children interacting with digital technologies.
After completing the year-long observational study, the researchers recommended that regulators develop standards to ensure toys marketed to children under five include safeguards addressing psychological safety.
Toy Manufacturer Responds To Research
Gabbo is produced by Curio, which has previously worked with musician Grimes.
The company told the BBC that using AI in products designed for children requires careful safeguards. It said its toys are built around parental permission, transparency, and user control.
Curio added that research on how children interact with AI-powered toys is a priority for the company in the coming year.
Growing Debate Over AI In Early Education
Concerns about AI use in early childhood settings have also been raised by Rachel de Souza, the Children's Commissioner for England.
She said AI tools used as teaching aids or classroom assistants may not currently face the same safeguarding checks that nursery providers require for other external resources.
The Cambridge report also advised parents to place AI toys in shared areas of the home where interactions can be supervised and to review privacy policies before allowing children to use them.
Views among early years educators remain divided. June O'Sullivan, who runs a network of 43 nurseries in London, said she has not yet seen clear evidence that AI improves learning outcomes for very young children.
She said children develop a broad range of skills through interaction with people rather than digital systems.
Actor and children’s rights campaigner Sophie Winkleman has also spoken against introducing AI tools into early years settings.
She said the potential harms may outweigh the benefits and argued that developing AI-related skills should occur later in education. Winkleman added that human interaction plays a central role in early childhood development.
Featured image credits: PxHere
