
For the first time, Facebook is releasing stats concerning how the social media platform moderates what goes on in Facebook Groups.

Alongside the new numbers, the company has also announced new policies governing how it will deal with the conspiracy theories and hate speech that often flourish inside these Groups.

First, the stats. Over the past year, according to Facebook, about 12 million pieces of content that violated the platform’s hate speech policies were removed from Facebook Groups. An additional 1.5 million pieces of content that fell under its organized hate policies were also removed from Facebook Groups. 

According to Facebook, the vast majority of this content — 87 percent of hate speech and 91 percent of organized hate content — was removed proactively. Basically, this means that Facebook’s content moderation AI took these violating posts down before any user even reported them.

If members of a Group repeatedly break Facebook’s rules, the company doesn’t just remove the content; it removes the Group entirely. Facebook says it deleted more than 1 million Groups this year for violating these policies.

The problem with Groups

Groups allow users to congregate with other members around a specific topic. Each has its own feed just made up of posts published in the Group. Groups can be public, meaning anyone can read the posts and join the Group to contribute. Or they can be private, which obscures the Group feed from a user until their membership is approved by a Group administrator. Private groups can even be completely hidden from Facebook users outside the Group, which would make it impossible for non-members to join unless they were specifically invited by other members.

There are many cases of conspiracy theories, misinformation, and hate festering in some of these Groups. Take, for example, the false stories earlier this summer about violent social justice protesters invading small towns across the United States. Just last week, the FBI and local police departments had to tackle misinformation running rampant in Facebook Groups that falsely claimed anti-fascist protesters were starting wildfires on the West Coast.

So, what’s Facebook going to do to deal with this problem? From the sounds of its announcement Thursday, it feels like it’s all but hitting the reset button on Groups.

Facebook’s new policies put a lot of responsibility on Groups’ administrators and moderators. In order to enforce these new policies in Groups… a Group actually has to have an admin and moderator in the first place. People who create a Group can also choose to leave it. That means there are a number of Facebook Groups that don’t have any admins or moderators at all.

Over the next few weeks, Facebook will suggest admin roles to members of those Groups. If no one steps up, it will begin the process of archiving the Group, basically creating a time capsule of it at that moment for members but closing it off to new members and posts. 

In Groups that do have admins and mods, those roles will begin to play a more central part in Facebook’s moderation policies. If a Group member violates Facebook’s Community Standards, their posts will no longer be automatically published for 30 days. Moderators will need to approve all of that user’s posts during that period. If admins and moderators continuously approve violating content, the Group will be banned. Once a Group is banned, its admins and moderators will be barred from creating any new Facebook Groups for 30 days.

In order to combat coronavirus misinformation, Facebook will no longer recommend health-related Groups to members. Users can still join them or find them in Facebook search, but the platform will not promote them.

“It’s crucial that people get their health information from authoritative sources,” says Facebook in its statement. The social media platform has attempted to take action against the COVID-19 misinformation that runs rampant on the site. Considering how much of it originates in Groups, this may be its most critical move yet in dealing with the issue.

Earlier this year, Facebook removed a number of Groups related to QAnon, a far-right conspiracy theory targeting President Donald Trump’s political opponents. It also removed some Groups related to the right-wing militia movement Boogaloo Bois.

Of course, none of these new moves will completely solve the problem. But they show that Facebook is now focusing on a major source of the dangerous conspiracy theories and hate that spread on its platform: Facebook Groups.