A new study reveals the privacy risks associated with smart toys, highlighting the need for stricter data protection measures. Learn how popular toys like the Toniebox and Tiptoi collect and transmit children’s data.
As technology progresses, even simple toys are getting smarter. Some connect to the internet or run intricate software, allowing them to adapt to how a child plays. However, some of these toys raise significant privacy concerns because they collect data on children's behavior, according to a study conducted by researchers at the University of Basel in Switzerland.
For example, the Toniebox, a popular choice for young children, offers an easy-to-use interface: kids place figurines on the box to start playing audio content, and can control playback by tilting the box or removing the figurine. While this functionality seems convenient, it has a downside: the Toniebox records detailed data on how and when it is used, including which figurine is activated, when playback is stopped, and what adjustments the child makes. This data is then sent to the manufacturer.
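To make the scale of this logging concrete, the sketch below shows what a single usage event of this kind could look like as a JSON payload. The field names, device identifier, and structure are hypothetical illustrations for this article, not the Toniebox's actual telemetry format.

```python
import json
import time
import uuid

def build_usage_event(figurine_id: str, action: str, playback_position_s: int) -> dict:
    """Hypothetical usage event a connected audio player might report to its vendor.

    All field names and values below are illustrative only.
    """
    return {
        "event_id": str(uuid.uuid4()),
        "device_id": "toybox-1234",           # unique device identifier (assumed)
        "figurine_id": figurine_id,           # which figurine was placed or removed
        "action": action,                     # e.g. "placed", "removed", "tilted"
        "playback_position_s": playback_position_s,  # where playback was stopped
        "timestamp": int(time.time()),        # when the interaction happened
    }

if __name__ == "__main__":
    event = build_usage_event("fig-lion", "removed", 312)
    print(json.dumps(event, indent=2))        # payload that would be uploaded
```

Even individually harmless fields like these add up to a fine-grained record of when and how a child plays.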
Researchers led by Professor Isabel Wagner from the Department of Mathematics and Computer Science at the University of Basel studied twelve smart toys, including the Toniebox, Tiptoi smart pen, Edurino learning app, and Tamagotchi virtual pet. They also examined less popular toys, such as the Moorebot mobile robot with a camera and microphone, and Kidibuzz, a child-friendly smartphone with parental controls.
The researchers examined the companies' data encryption practices (if any), security measures, data protection, transparency, and compliance with the EU General Data Protection Regulation (GDPR). The study found, for example, that neither the Toniebox nor the Tiptoi pen adequately secures its data traffic. Even when offline, the Toniebox stores usage data locally and sends it once it reconnects to the internet. The researchers are also investigating a ChatGPT-integrated toy in an ongoing study, noting that its log data often disappears, possibly to manage storage.
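The store-and-forward behavior described above can be illustrated with a minimal sketch: events are appended to a local queue while the device is offline and flushed to the vendor once connectivity returns. The queue file, connectivity check, and flush logic are assumptions for illustration, not the Toniebox's actual implementation.

```python
import json
import socket
from pathlib import Path

QUEUE_FILE = Path("pending_events.jsonl")    # hypothetical on-device buffer

def is_online(host: str = "example.com", port: int = 443, timeout: float = 2.0) -> bool:
    """Crude connectivity check: can we open a TCP connection to the server?"""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def record_event(event: dict) -> None:
    """Append the event to the local queue; nothing leaves the device yet."""
    with QUEUE_FILE.open("a") as f:
        f.write(json.dumps(event) + "\n")

def flush_queue(send) -> None:
    """Once back online, transmit every buffered event and clear the queue."""
    if not QUEUE_FILE.exists() or not is_online():
        return
    for line in QUEUE_FILE.read_text().splitlines():
        send(json.loads(line))               # e.g. an HTTPS POST to the vendor
    QUEUE_FILE.unlink()

if __name__ == "__main__":
    record_event({"figurine_id": "fig-001", "action": "removed"})
    flush_queue(send=lambda e: print("uploading", e))
```

The practical consequence is that keeping such a toy offline only delays transmission; the usage history is still uploaded the next time the device connects.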
Companies often justify data collection as a means to improve their products. However, the purpose and necessity of such data are often unclear to users. For example, some toys request unnecessary permissions, like access to a smartphone’s location or microphone. According to TechXplore, the ongoing study of the ChatGPT toy suggests it may transmit data resembling audio, potentially for optimizing speech recognition.
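One way parents or researchers can spot over-broad permission requests is to inspect a toy's companion app, for example by decoding its AndroidManifest.xml with a tool such as apktool and scanning for sensitive permissions. The sketch below flags location and microphone access; the file path is a placeholder and the permission list is only a starting point.

```python
import sys
import xml.etree.ElementTree as ET

# Permissions that a simple audio or learning toy's app arguably should not need.
SENSITIVE = {
    "android.permission.ACCESS_FINE_LOCATION",
    "android.permission.ACCESS_COARSE_LOCATION",
    "android.permission.RECORD_AUDIO",
}

# Android manifest attributes live in this XML namespace.
ANDROID_NS = "{http://schemas.android.com/apk/res/android}"

def requested_permissions(manifest_path: str) -> set[str]:
    """Return all <uses-permission> names declared in a decoded AndroidManifest.xml."""
    root = ET.parse(manifest_path).getroot()
    return {
        elem.get(f"{ANDROID_NS}name", "")
        for elem in root.iter("uses-permission")
    }

if __name__ == "__main__":
    # Usage: python check_permissions.py path/to/AndroidManifest.xml
    perms = requested_permissions(sys.argv[1])
    for p in sorted(perms & SENSITIVE):
        print("sensitive permission requested:", p)
```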
Julika Feldbusch, the study’s lead author, emphasizes the need for better privacy protection for children. She suggests that toy manufacturers adhere to stricter security and data protection standards, possibly indicated by a label similar to nutritional information on food packaging. Currently, it is challenging for parents to evaluate the security risks of smart toys.
Feldbusch warns of a “two-tier society” in privacy protection, where informed parents can choose safer toys while others lack the knowledge or time to assess these risks. While the long-term effects of data collection on individual children are not fully known, constant surveillance could affect personal development.