- YouTube said it plans to start putting age restrictions on videos involving children’s characters that have been flagged for having inappropriate content.
- Age-restricted videos are automatically blocked from showing up in its YouTube Kids app, the company said.
- The policy change follows revelations that YouTube is pointing kids to thousands of disturbing videos involving children’s characters, and that some of those videos are making their way into YouTube Kids.
Following revelations that YouTube is serving up thousands of inappropriate and disturbing videos to kids, the company said it will step up its efforts to prevent children from seeing such content.
YouTube will restrict access to videos involving children’s characters that are flagged for having content that’s inappropriate for kids, the company said in a statement. Users won’t be able to watch such videos unless they are logged into the site and are older than 17.
Such age-restricted videos are already automatically blocked from showing up in YouTube Kids, the company’s app designed for children under 13, YouTube said. So once flagged videos are age-restricted, they shouldn’t show up in YouTube Kids.
“The YouTube team is made up of parents who are committed to improving our apps and getting this right,” the company said in a statement.
The Verge previously reported the move.
YouTube plans to start age-restricting such flagged videos in the coming weeks. The company relies on unpaid volunteers and general users to alert it to videos that violate its guidelines, and it has a team of moderators who review such flagged videos. If those moderators determine that flagged videos involving children’s characters are actually inappropriate for kids, they will place the age restrictions on them.
The company is in the process of training its moderators on the new policy.
The changes follow the publication of a popular Medium article and a New York Times story about the thousands of disturbing videos on YouTube that target young viewers. While the videos frequently depict popular children’s characters, they’re typically knockoffs made by obscure or anonymous producers, rather than by the recognized studios that own the rights to the characters. Although the inappropriate, knockoff videos depict those characters in lewd, violent, or disturbing scenarios, YouTube often lists them alongside benign official videos from the characters’ owners.
The videos are primarily found on YouTube’s main site and can generally be viewed by anyone visiting it. But The Times reported that some of them have shown up even in YouTube Kids.
The move to restrict access to the flagged videos was not a direct response to the recent press reports but has been in the works for some time, YouTube told The Verge.
The policy update is YouTube’s second move this year to discourage the proliferation of disturbing videos featuring family-friendly characters. In August, the company updated its advertising policy to bar such videos from including ads.