YouTube has announced a clampdown on disturbing and inappropriate children’s videos, following accusations that the site enabled “infrastructural violence” through the long-run effects of its content recommendation system.
The new policy, announced on Thursday evening, will see age restrictions applied to content featuring “inappropriate use of family entertainment characters”, such as unofficial videos depicting Peppa Pig “basically tortured” at the dentist. The company already had a policy that rendered such videos ineligible for advertising revenue, in the hope that doing so would reduce the motivation to create them in the first place.
“Earlier this year, we updated our policies to make content featuring inappropriate use of family entertainment characters ineligible for monetisation,” said Juniper Downs, YouTube’s director of policy. “We’re in the process of implementing a new policy that age restricts this content in the YouTube main app when flagged. Age-restricted content is automatically not allowed in YouTube Kids. The YouTube team is made up of parents who are committed to improving our apps and getting this right.”
Age-restricted videos can’t be seen by users who aren’t logged in, or by logged-in users who have entered their age as below 18 on the site or the app. More importantly, they also don’t show up on YouTube Kids, a separate app aimed at parents who want to let their children under 13 use the site unsupervised.
But for the age restrictions to apply, the content first has to be flagged for review by an ordinary user. It is then reviewed by what YouTube says is one of thousands of moderators working around the world.