A video laden with falsehoods about Covid-19 emerged on Facebook last week and has now been viewed many millions of times. The company has taken steps to minimize the video’s reach, but its fact-checks, in particular, appear to have been applied with a curious, if not dangerous, reticence. The reason for that reticence should alarm you: It seems that the biggest social network in the world is, at least in part, basing its response to pandemic-related misinformation on a misreading of the academic literature.

At issue is the company’s long-standing wariness of so-called “backfire effects.” That is to say, Facebook worries that the mere act of trying to debunk a bogus claim may only make the lie grow stronger. CEO and founder Mark Zuckerberg expressed this precise concern back in February 2017: “Research shows that some of the most obvious ideas, like showing people an article from the opposite perspective, actually deepen polarization,” he said. The company would later cite the same theory to explain why it had stopped applying “red flag” warnings to fallacious headlines: “Academic research on correcting misinformation,” a Facebook product manager wrote, has shown that such warnings “may actually entrench deeply held beliefs.”

Ethan Porter is an assistant professor at George Washington University. Thomas J. Wood is an assistant professor at Ohio State University.

Facebook’s fear of backfire hasn’t abated in the midst of this pandemic, or the infodemic that came with it. On April 16, the company announced a plan to deal with rampant Covid-19 misinformation: In addition to putting warning labels on some specific content, it would show decidedly non-specific warnings to those who’d interacted with a harmful post and nudge them toward more authoritative sources. The vagueness of these latter warnings, Facebook told the website STAT, was meant to minimize the risk of backfire.

But here’s the thing: Whatever Facebook says (or thinks) about the backfire effect, this phenomenon has not, in fact, been “shown” or demonstrated in any thorough way. Rather, it’s a bogeyman: a zombie theory from the research literature circa 2008 that has been all but abandoned since. More recent studies, encompassing a broad array of issues, find the opposite is true: On almost all possible topics, almost all of the time, the average person (Democrat or Republican, young or old, well-educated or not) responds to facts just the way you’d hope, by becoming more factually accurate.

Yes, it’s possible to find exceptions. If you follow all this research very carefully, you’ll be familiar with the rare occasions when, in experimental settings, corrections have failed. If you have a day job, though, and need a rule of thumb, try this: Debunkings and corrections are effective, full stop. This summary puts you much closer to the academic consensus than does the suggestion that backfire effects are widespread and pose an active threat to online discourse.

We’ve demonstrated this fact about facts many times ourselves. Our peer-reviewed book and multiple academic articles describe dozens of randomized studies we’ve run in which people are exposed to misinformation and fact-checks. The research consistently finds that subjects end up more accurate in their answers to factual questions. We’ve shown that fact-checks are effective against outlandish conspiracy theories as well as more run-of-the-mill Trump misstatements. We even partnered with the authors of the most popular academic article on the backfire effect, in hopes of tracking it down. Again, we came up empty-handed.

All those Snopes.com articles, PolitiFact posts, and CNN fact-checks you’ve read over the years? By and large, they do their job. By our count, across experiments involving more than 10,000 Americans, fact-checks increase the proportion of correct responses in follow-up testing by more than 28 percentage points. But it’s not just us: Other researchers have reached very similar conclusions. If backfire effects exist at all, they’re hard to find. Entrenchment in the face of new information is certainly not a general human tendency, not even when people are presented with corrective facts that cut against their deepest political commitments.
