Videos show how to tie a noose, play with matches, and taste battery acid. There are cartoons dubbed over with obscenities, an introduction to wine tasting, and beer commercials.
Consumer groups have dug up these adult-themed videos buried among the cartoons, silly songs and science explainers in the YouTube Kids mobile app.
Concerned that the videos could be dangerous or disturbing to children, the groups have updated a complaint asking the Federal Trade Commission to investigate the app for unfair and deceptive business practices. The original complaint, filed in April, targeted Google-owned YouTube for mixing ads with content. The groups also object to the inclusion of unboxing videos, which show people opening new toys or gadgets.
The videos have not led to a rash of noose-tying, wine-swilling, profanity-spewing toddlers. A handful of parents did leave negative reviews of the YouTube Kids app after their children found inappropriate videos. However, the most egregious examples were first compiled by adults, specifically by a company that makes a competing product called KidKam.
“It’s clear that Google is engaged in deceptive marketing when it tells parents that YouTube Kids is a safe place to explore,” said Josh Golin of the Campaign for a Commercial-Free Childhood (CCFC), one of the watchdog groups behind the complaint.
“Particularly concerning are videos that present adult content in cartoons or using familiar children’s characters, such as an expletive-laced parody of the film Casino using Bert and Ernie,” Golin said.
Other videos the groups turned up include a talk show host discussing marijuana, a TED Talk on suicide, Sarah Jessica Parker in a revealing skirt, and a dance lesson that includes a crotch grab.
Google released the Android and iOS app in February. It has a simplified design with big, kid-friendly buttons and no comments. Popular children’s channels and content are highlighted on the main screen; finding edgier clips requires searching within the app for terms that aren’t blocked.
Instead of handpicking which videos appear in the app, YouTube relies on a combination of automated filtering and user flagging to remove inappropriate content.
“We work to make the videos in YouTube Kids as family-friendly as possible and take feedback very seriously,” said a Google spokesperson in a statement. “We appreciate people drawing problematic content to our attention, and make it possible for anyone to flag a video. Flagged videos are manually reviewed 24/7 and any videos that don’t belong in the app are removed.”
Google blocks inappropriate videos once they are flagged, but the CCFC wants YouTube to switch to a pre-screened approach, in which every video is approved before it can appear in the app. That model is typical for TV channels but would be a huge undertaking for YouTube, which draws on a vast library of videos.
Google says concerned parents can turn off the search feature in the app’s settings.