YouTube is rolling out more artificial intelligence-powered technology to catch videos that may require age restrictions, meaning more viewers will be asked to sign in to their accounts and verify their age before watching.
Just as YouTube used machine learning techniques in 2018 to better catch violent extremism and more of the platform’s most severe content, and in 2019 to find videos that included hateful conduct, it will now use the same approach to automatically flag videos it deems not age-appropriate. As a result, YouTube expects far more videos to pop up with age-gated restrictions.
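YouTube hasn’t published details of its classifier, so any concrete example is guesswork, but the general pattern described here, acting automatically on high-confidence predictions while routing borderline cases to human reviewers, can be sketched in a few lines. Everything below (the function, score source, and thresholds) is hypothetical:

```python
# Hypothetical sketch of threshold-based auto-flagging -- not YouTube's
# actual system; the scores and thresholds here are invented.

def route_video(score: float,
                restrict_threshold: float = 0.90,
                review_threshold: float = 0.60) -> str:
    """Decide what to do with a video, given a classifier's estimated
    probability that its content is not age-appropriate."""
    if score >= restrict_threshold:
        return "age_restrict"   # high confidence: apply the age gate automatically
    if score >= review_threshold:
        return "human_review"   # borderline: queue for trust and safety reviewers
    return "no_action"          # low score: leave the video as-is

# Example scores, as they might come out of a content classifier.
for video_id, score in [("abc123", 0.97), ("def456", 0.72), ("ghi789", 0.10)]:
    print(video_id, "->", route_video(score))
```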
The company is preparing for there to be some mistakes in labeling, as is the case with any rollout of AI moderation tech. And as part of the changes, people watching YouTube videos embedded on third-party sites will be redirected to YouTube to sign in and verify their age.
One of the biggest questions facing creators in YouTube’s Partner Program (those who are able to monetize their videos) is whether these moderation measures will affect their moneymaking potential. YouTube’s team doesn’t believe so, because most of the videos it anticipates will receive automatic age restrictions likely also violate the company’s advertiser-friendly guidelines. Basically, those videos would already have limited or no ads, according to YouTube.
That doesn’t mean mistakes won’t happen; they will, as countless wrongfully applied labels, takedowns, and copyright strike controversies have illustrated in the past. But YouTube is bulking up its appeals team to handle disputes as they come in. Another concern among creators is that age-restricted videos won’t appear on the homepage. They are less likely to, but an age restriction doesn’t automatically bar a video from the homepage, according to YouTube.
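Creators who want to check whether a given upload has been age-restricted don’t have to wait for a dashboard notice: the public YouTube Data API v3 exposes the restriction as a `ytAgeRestricted` content rating on the video resource. A minimal sketch, assuming the `requests` library and placeholder values for the API key and video ID:

```python
import requests

API_KEY = "YOUR_API_KEY"  # placeholder: a YouTube Data API v3 key
VIDEO_ID = "VIDEO_ID"     # placeholder: the video to check

def is_age_restricted(video_id: str, api_key: str) -> bool:
    """Return True if the video carries YouTube's age-restricted rating."""
    resp = requests.get(
        "https://www.googleapis.com/youtube/v3/videos",
        params={"part": "contentDetails", "id": video_id, "key": api_key},
        timeout=10,
    )
    resp.raise_for_status()
    items = resp.json().get("items", [])
    if not items:
        raise ValueError(f"No video found for id {video_id!r}")
    # contentRating is an empty dict for unrestricted videos.
    rating = items[0]["contentDetails"].get("contentRating", {})
    return rating.get("ytRating") == "ytAgeRestricted"

if __name__ == "__main__":
    print("Age-restricted:", is_age_restricted(VIDEO_ID, API_KEY))
```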
The rollout comes as YouTube tries to address global criticism from concerned parents’ groups and advocacy boards over the website being unsafe for children. YouTube’s team routinely says YouTube isn’t meant for anyone under the age of 13 due to federal privacy protections, and the company points to YouTube Kids as the supposedly safer alternative. Yet that doesn’t stop young children from using the app at home or elsewhere, and some of the most popular channels are built around content made specifically for kids. Right now, YouTube’s trust and safety team applies restrictions when it comes across questionable videos during reviews: if a video is deemed inappropriate for people under 18, it gets an age gate.
“Because our use of technology will result in more videos being age-restricted, our policy team took this opportunity to revisit where we draw the line for age-restricted content,” a new blog post from YouTube reads. “After consulting with experts and comparing ourselves against other global content rating frameworks, only minor adjustments were necessary.”
YouTube’s post also notes that the new rules may involve a few additional steps for people in European Union countries. In line with upcoming regulations like the EU’s Audiovisual Media Services Directive (AVMSD), some European users may be asked for additional evidence of their age. Effectively, if the systems can’t verify that someone is over the age of 18, they may be asked to “provide a valid ID or credit card to verify their age,” according to the post. It’s a one-time process, and YouTube is supposed to delete the information after it’s sent. The process was built to adhere to Google’s privacy and security principles, YouTube says.
People may see these changes immediately, but rollouts often take time to become noticeable. Still, if you don’t want to be stopped by sign-in prompts, prepare to stay logged in to your YouTube account, because it seems like plenty more age gates will be popping up.