Tuesday, March 17, 2026

Researchers Warn AI Toys May Misinterpret Children’s Emotions

Researchers are urging tighter regulation of AI-powered toys designed for young children after a study found the technology may struggle to understand toddlers’ emotions and social cues.

A research team from the University of Cambridge examined how children aged three to five interacted with an AI-enabled plush toy called Gabbo, which uses a voice-activated chatbot to encourage conversation and imaginative play.

The study found that the toy often misread emotional signals, interrupted children mid-conversation, and responded awkwardly to expressions of affection or sadness. In one instance, when a child said "I'm sad," the toy replied with a cheerful message urging the child to keep talking.

Researchers warned that such responses could be confusing for children at a stage when they are learning basic emotional and social communication.

The findings have prompted calls for new safety standards focusing on the psychological wellbeing of children, alongside existing physical toy safety regulations. Experts also advise parents to supervise interactions and carefully review privacy settings when using AI-enabled devices.
