Meta Is Making It Harder to Wind Up in Facebook Jail

Photo: Peter Macdiarmid (Getty Images)

Facebook jail is about to get less crowded. Under a new set of policies revealed this Thursday, parent company Meta says it’s now harder for users to wind up with their Facebook accounts suspended for lesser violations of its rules. Those changes come after years of pushback from civil society groups and Meta’s semi-independent Oversight Board, which criticized the company’s “disproportionate and opaque” policies around “strikes” that can result in otherwise benign content being flagged as harmful. Meanwhile, actual, more serious harmful content continues to seep through the moderation cracks.

With so much of the conversation around Meta’s content moderation these days dominated by debates over the platforms’ handling of unhinged politicians and deeply contentious political arguments, it’s easy to overlook the far greater number of everyday users who, rightly or wrongly, find themselves locked up in Facebook Jail.

How Facebook’s jail is changing

Moving forward, Facebook’s penalty system will focus more on giving users context and transparency about why a piece of content violates its rules, rather than immediately handing out a restriction or suspension. Thirty-day restrictions on posting content, one of the more severe penalties, will now generally only kick in after a seventh violating post. The general idea, Meta says, is to reserve account restrictions for “persistent violators” who keep breaking the rules even after being repeatedly admonished. In theory, that should give users a chance to learn from their mistakes and prevent others from getting locked out of their accounts over a misunderstanding.

“Under the new system, we will focus on helping people understand why we have removed their content, which is shown to be more effective at preventing re-offending, rather than so quickly restricting their ability to post,” Facebook Vice President of Content Policy Monika Bickert said.

This softer edge to Facebook’s prosecutorial force only applies to more benign cases. In situations where users post content containing child exploitation imagery, terrorist material, or other more severe violations, Meta says it still maintains a policy of immediate action against those users’ accounts. That can include removing particularly noxious accounts from the platform altogether.

“We’re making this change in part because we know we don’t always get it right,” Bickert added. “So rather than potentially over-penalizing people with a lower number of strikes from low-severity violations and limiting their ability to express themselves, this new approach will lead to faster and more impactful actions for those that continuously violate our policies.”

What exactly is Facebook jail?

Anyone who’s spent a decent chunk of time on Facebook has probably come across users who claim they’ve had their account suspended or blocked for what seems like no justifiable reason. Welcome to Facebook Jail.

There are plenty of cases where users who claim innocence actually did violate one of Facebook’s rules without necessarily knowing it. There are others, though, where Meta’s largely automated moderation system simply gets things wrong and flags users for inaccurate or nonsensical reasons. That over-enforcement leads to a perception among some users that Facebook rules its platform with an iron fist. It’s also partly why a decent chunk of Republican lawmakers remain convinced Mark Zuckerberg is on a personal mission to silence conservative voices. He isn’t.

‘A meme is a meme’

Examples of user confusion and frustration over Facebook’s enforcement run through The Facebook Papers, a series of internal documents shared with Gizmodo by Facebook whistleblower Frances Haugen. The documents show examples of younger users who were annoyed after they were flagged for posting satirical content to morbid meme pages.

“This is what this page is for,” a 17-year-old user from the U.K. wrote. “Even though it [the meme] violated policy, this group is for memes like the one I posted. It wasn’t anything bad.”

“A meme is a meme,” another 16-year-old user from Pakistan wrote.

In other cases, an adult user from Germany expressed frustration over having one of his posts removed without explanation. Other users even apologized to Facebook, saying they weren’t aware they had violated the company’s terms.

With the new, more lax approach, Meta is trying to strike a middle ground. The company claims its internal research shows that 80% of users with a low number of strikes don’t go on to violate the policy again within the next 60 days. That suggests warnings and other light-touch signals work pretty well at preventing repeat offenses among lower-level offenders. The other 20%, the deliberate assholes, then become the focus of account restrictions. The obvious concern is that the policy change could give harmful users more latitude at a time when misinformation, bullying, and general toxicity still pervade social media. Meta seems confident that won’t happen.

“With this update we will still be able to keep our app safe while also allowing people to express themselves,” Bickert said.

‘Room for improvement remains’

Even though Facebook’s changes were driven in part by the Oversight Board’s feedback, the Supreme Court-like entity wasn’t unreserved in its praise. Though the board welcomed Facebook’s attempts at transparency, it went on to criticize the company for focusing only on “less serious violations.” The board said the new rules do little to address transparency questions around more “severe strikes,” which it says can severely impact journalists and activists who have their accounts suspended for unclear reasons.

“Today’s announcement focuses on less serious violations,” the Oversight Board said. “Yet the Board has consistently found that Meta also makes mistakes when it comes to identifying and enforcing more serious violations.”

Meta did not immediately respond to Gizmodo’s request for comment.
