Google says it will add new parental controls to its YouTube Kids app, after inappropriate videos were repeatedly discovered on the service.
One of the new options prevents channels that have not been vetted by human moderators from appearing on the app.
Parents will be able to choose between human-curated playlists and letting YouTube’s algorithms decide what children get to watch in the app.
But one expert said the changes were “still nowhere near good enough”.
Children’s presenter and parent Ed Petrie, who has hosted programmes for Nickelodeon and the BBC, asked: “Why are these features only an option?
“Nickelodeon shows don’t have an option for your kids to stumble across an animation of SpongeBob SquarePants having his liver removed.
“YouTube just can’t get their heads around the fact that when you’re expressly providing content for kids, there is an ethical need for an actual human being viewing it with their eyes and ears before it gets inside a child’s brain.”
Children’s charity the NSPCC said the stricter controls were “encouraging” but “long overdue”.
“Parents should have the confidence that a platform designed for children only shows appropriate content, and that videos which some children might find distressing or upsetting do not slip through the net,” said a spokesman.
YouTube currently uses algorithms to decide which videos can appear on YouTube Kids.
Any video uploaded to the regular version of YouTube can theoretically appear on YouTube Kids if the company’s algorithms judge it to be suitable. The company says its machine learning processes can take several days to evaluate a video.
However, inappropriate videos have repeatedly appeared on YouTube Kids. One, found by the BBC’s Newsround programme, showed characters from children’s cartoon Paw Patrol on a burning plane.
On Wednesday, YouTube announced plans to add three new settings to its Kids app.
These will let parents:
- choose “trusted collections” that their children are allowed to watch, from brands such as Sesame Street
- hand-pick every individual video and channel they are happy for their children to see, if they wish to do so
- stop the app offering any videos from channels that have not been approved by a human moderator
Parents will be prompted to switch the settings on or off when setting up the app, but they will remain optional and will not all be released at the same time.
One parent has gone as far as setting up his own video app after becoming concerned that his children had been exposed to inappropriate content online.
Hugo Ribeiro, founder of video app kiddZtube, said: “YouTube is abusing an asset they have and not thinking enough about safety.”
Rather than relying on algorithms, kiddZtube uses human moderators – four schoolteachers – to curate playlists and approve every video that appears in the app.
“We believe it’s better to have fewer videos, with higher quality,” he told the BBC.
The teachers also write quiz questions to accompany each video, to make watching a more interactive experience.
However, unlike YouTube Kids, which is free to download, kiddZtube costs £4.99.
“It’s a different business model,” he said.
“YouTube does not want an app that is more narrow in its scope. There are costs to that.
“But in terms of our children, we don’t want them to pay that cost.”
In addition to the new settings in YouTube Kids, which will be made available in the coming months, YouTube will also let parents block videos they do not want children to see.
“We’ve never stopped listening to feedback and we’re continuing to improve the app,” said YouTube’s Malik Ducard.