Critics say that Meta has caused serious psychological harm to younger users by exposing them to toxic and inappropriate content. Eating disorders, depression, and even suicide have all been blamed on the company’s algorithms, and calls to reform those algorithms have lately grown so loud that they’ve apparently become impossible to ignore.
The social media giant announced new changes Tuesday to its content filtering system, designed to steer younger users away from harmful and disturbing content. The company says it hopes these changes will give teens “safe, age-appropriate experiences” whenever they visit Facebook and Instagram.
What do those changes actually look like? Meta says that it has instituted new guardrails for accounts that self-identify as under the age of 18. The system will now limit younger users’ ability to encounter certain types of content—like posts about eating disorders and self-harm—even if those posts are shared by someone they follow. According to a recent blog post from Meta, the company will…
…start to remove this type of content from teens’ experiences on Instagram and Facebook, as well as other types of age-inappropriate content. We already aim not to recommend this type of content to teens in places like Reels and Explore, and with these changes, we’ll no longer show it to teens in Feed and Stories, even if it’s shared by someone they follow.
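For a sense of what a rule like this boils down to, here’s a minimal sketch in Python of an age-gated feed filter. To be clear, this is our own hypothetical illustration, not Meta’s code; the category names and age cutoffs are assumptions drawn from the announcement.

```python
# Hypothetical sketch of an age-gated feed filter -- not Meta's actual code.
# Category names and age cutoffs are assumptions based on the announcement.
from dataclasses import dataclass, field

RESTRICTED_FOR_TEENS = {"self_harm", "eating_disorders"}  # hidden from under-18s
RESTRICTED_UNDER_16 = {"sexual_content"}                  # hidden from under-16s

@dataclass
class Post:
    author: str
    categories: set = field(default_factory=set)

def show_in_feed(post: Post, self_reported_age: int, follows_author: bool) -> bool:
    """Decide whether a post may appear in a user's Feed or Stories.

    Per the announcement, the restriction applies even when the viewer
    follows the author, so `follows_author` never overrides the rule.
    """
    if self_reported_age < 18 and post.categories & RESTRICTED_FOR_TEENS:
        return False
    if self_reported_age < 16 and post.categories & RESTRICTED_UNDER_16:
        return False
    return True

# A followed account's self-harm post is still hidden from a 15-year-old:
print(show_in_feed(Post("friend", {"self_harm"}), 15, follows_author=True))  # False
```

Note the key detail in Meta’s wording: following the poster no longer gets the content through, which is why the follow relationship plays no part in the decision above.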
The company says it will also make it more difficult for younger users to find disturbing content via Search and Explore, and that it won’t show sexual content to accounts that identify as under 16. If teens do search for disturbing topics, like suicide or self-harm, they’ll be referred to “expert resources for help.” All of these changes are expected to go into effect over the next few weeks.
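The search-side behavior works the same way in spirit: flag the query, skip the results, and surface help instead. Here’s a rough sketch, again purely hypothetical; the term list and the resource pointer are our own placeholders, not anything Meta has published.

```python
# Hypothetical sketch of the search redirect -- the term list and help
# pointer are placeholders, not Meta's actual implementation.
SENSITIVE_TERMS = ("suicide", "self-harm", "self harm")

def handle_teen_search(query: str, self_reported_age: int) -> dict:
    """Return either normal results or a pointer to expert resources."""
    q = query.lower()
    if self_reported_age < 18 and any(term in q for term in SENSITIVE_TERMS):
        # No results are shown; the user is pointed to expert resources instead.
        return {"results": [], "redirect": "expert_resources_for_help"}
    return {"results": [f"result for {query!r}"]}  # stand-in for a real search backend

print(handle_teen_search("self-harm", 15))
# {'results': [], 'redirect': 'expert_resources_for_help'}
```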
Why is Meta doing all this now? Well, it might have something to do with the fact that the company is currently being sued by more than 40 state governments. The massive joint lawsuit, filed in October by dozens of attorneys general, accuses the social media giant of using “dopamine-manipulating features” to hook young users on its sites and cultivate “addiction to boost corporate profits.” Meta now appears to be covering its ass with a too-little-too-late policy shift that should have happened years ago.
While Meta’s changes are better than nothing, it’s unclear how the platform plans to deal with teens who, you know, lie about their age when they sign up. We all know that teens pretend to be older than they are for a variety of reasons, namely to get their hands on drugs and alcohol. Given that lying about your age on an internet platform is significantly easier than securing a fake ID, it’s hard to see why Meta thinks young users won’t be able to outsmart its less-than-optimal age-verification systems. Gizmodo reached out to Meta for more information and will update this story if the company responds.
In the past, Meta has made modest efforts to regulate who can see what on its platforms, although you could argue that its content moderation process is still a total mess. The company claims that it has “developed more than 30 tools and resources to support teens and their parents, and we’ve spent over a decade developing policies and technology to address content that breaks our rules or could be seen as sensitive.”