The Covid-19 “infodemic” has laid bare how vulnerable the United States is to disinformation. The country is less than five months away from the 2020 presidential election, and by the thousands, Americans are buying into conspiracy theories about vaccines containing microchips and wondering about the healing powers of hair dryers. Where does all this come from? Let’s not be too distracted by a fear of rumor-mongering bots on the rampage, or of divisive ads purchased with Russian rubles. As two of the leading researchers in this field, we’re much more worried about Facebook Groups pumping out vast amounts of false information to like-minded members.

For the past several years, Facebook users have been seeing more content from “friends and family,” and less from brands and media outlets. As part of the platform’s “pivot to privacy” after the 2016 election, Groups have been promoted as trusted spaces that create communities around shared interests. “Many people prefer the intimacy of communicating one-on-one or with just a few friends,” explained Mark Zuckerberg in a 2019 blog post. “People are more cautious of having a permanent record of what they’ve shared.”

But as our research shows, those same features, privacy and community, are often exploited by bad actors, foreign and domestic, to spread false information and conspiracies. Dynamics in Groups often mirror those of peer-to-peer messaging apps: People share, spread, and receive information directly to and from their closest contacts, whom they typically see as reliable sources. To make things easier for those looking to stoke political division, Groups provide a menu of potential targets organized by issue and even location; bad actors can create fake profiles or personas tailored to the interests of the audiences they intend to infiltrate. This allows them both to seed their own content in a Group and to repurpose that Group’s content for use on other platforms.

This was already evident in 2018, when associates of Shiva Ayyadurai, an independent candidate for US Senate, used Groups as part of their astroturfing campaign to boost his online support. Today, Ayyadurai is one of the most dangerous vectors of health disinformation, racking up millions of engagements on posts that rail against vaccinations, claim Anthony Fauci is a member of the “deep state,” and instruct followers to point blow dryers down their throats to kill the coronavirus.

Groups continue to be used for political disinformation. The “Obamagate” conspiracy theory has yet to be defined in clear terms, even by its own adherents, but our analysis of Facebook Groups shows that the false narrative that the Obama administration illegally spied on people associated with the Trump campaign is being fueled and nurtured there. Related memes and links to fringe right-wing websites have been shared millions of times on Facebook in the past few months. These narratives are boosted by users coordinating their activity across networks of Groups and Pages managed by a handful of people. At least nine coordinated Pages and two Groups, with more than three million likes and 71,000 members, respectively, are set up to drive traffic to five “news” websites that promote right-wing clickbait and conspiracy theories. In May, those five websites published more than 50 posts promoting “Obamagate,” which were then shared in the linked pro-Trump Groups and Pages. The revolving door of disinformation continues to spin.

A recent Wall Street Journal investigation revealed that Facebook has been aware of Groups’ polarizing tendencies since 2016. And despite the company’s recent efforts to crack down on misinformation related to Covid-19, the Groups feature continues to serve as a vector for lies. As we wrote this story, if you joined the “Alternative Health Science News” Group, for example, Facebook would then recommend, based on your interests, that you also join “Sheep No More,” a Group that uses Pepe the Frog, a white supremacist symbol, in its header, as well as “Q-Anon Patriots,” a forum for believers in the crackpot QAnon conspiracy theory. As protests in response to the death of George Floyd spread across the country, members of these Groups claimed that Floyd and the police officers involved were “crisis actors” following a script. In recent days, Facebook has stopped providing suggestions on the landing pages of certain Groups, but such recommendations still populate the “Discover” tab, where Facebook recommends content to users based on their recent engagement and activity.

To mitigate these problems, Facebook should radically increase transparency around Groups’ ownership, management, and membership. Yes, privacy was the point, but users need the tools to understand the provenance of the information they consume. First, Facebook needs to vet more carefully how Groups and Pages are categorized on the site, ensuring that their labels accurately reflect the content shared in each community. In the current system, a Page owner chooses its category (“Cuisine,” “Just For Fun,” and so forth), which then shows up in that community’s search results and on its front page. Most Groups, meanwhile, are categorized as “General,” which helps neither users nor Facebook’s threat investigation teams understand each one’s purpose. In both cases, owners can be misleading: a large Page that shares exclusively divisive or political content might be categorized as “Personal Blog” so as to escape the added scrutiny that might come with a more explicitly political tag. Such descriptors should be more specific, and applied more consistently. That’s especially important for Groups or Pages with tens of thousands of members or followers. Facebook should also make it easier to spot when multiple Groups and Pages are managed by the same accounts, so that the average user can easily identify concerted efforts to flood the platform with particular content.