Beser said that the current version of the app will still be available for those who are content to use it, but that Google was continuing to "fine tune" its filters for the more open selection of content, encouraging parents to block and flag content they do not see as appropriate.
YouTube Kids now relies on machine-learning algorithms to screen content.
"Even though YouTube's terms of service say [the site] is for ages 13 and up", says Golin, "the platform is loaded with cartoons, nursery rhyme videos, toy commercials and all sorts of content for young children".
In controls rolling out later this year, parents will be able to manually approve individual videos or channels that their children can access through the app, giving them the ability to pre-vet and handpick a collection of videos to ensure they are appropriate. "One area of focus has been building new features that give parents even more control around the content available in the YouTube Kids app so they can make the right choice for their unique family and for each child within their family", the statement read.
The Google team today announced three new features for the popular YouTube Kids app that it says will give parents even more control over the content that can be accessed through the kid-friendly app. YouTube Kids had previously relied primarily on parents to report troubling videos, a practice that was criticized because children are often the only ones watching the content on tablets or phones.
However, inappropriate videos have repeatedly appeared on YouTube Kids. The company says its machine learning processes can take several days to evaluate a video.
And some consumer groups have charged that YouTube's main app has continued to host plenty of children's programming, in violation of child privacy laws.
About 1.6 million videos were removed after users, activist organisations or governments flagged them. Inappropriate clips that slipped through have included Spider-Man urinating on Frozen's Elsa, Peppa Pig drinking bleach, and Mickey Mouse being run over by a car. Enabling the parent-approved content setting will also disable any video recommendations.
Jennifer Harris, the mother of a middle schooler, told CNBC that she used to make her son show her every video he wanted to watch on YouTube Kids before it started.