A flu vaccine being prepared in Las Rozas, Spain on Oct. 14, 2020.
Photo: Pablo Blazquez Dominguez (Getty Images)

YouTube announced on Wednesday that it is extending its existing rules against lies, propaganda, and conspiracy theories about the coronavirus pandemic to cover misinformation about coronavirus vaccines.

Per Reuters, the video giant now says it will prohibit content about coronavirus vaccines that “[contradicts] expert consensus from local health authorities or the [World Health Organization]”—such as bogus claims the vaccine is a pretext to stick people with tracking chips or that it will kill recipients and/or secretly sterilize them. The company also told Reuters that it would limit the spread of content that borders on violating the rules, though it didn’t elaborate on how it would do that.

Google’s rules already covered topics relating to treatment, prevention, diagnostics, and transmission of the virus, though the previous rules only specifically mentioned vaccines in the context of false claims that one “is available or that there’s a guaranteed cure.”

“A COVID-19 vaccine may be imminent, therefore we’re ensuring we have the right policies in place to be able to remove misinformation related to a COVID-19 vaccine,” YouTube spokesperson Farshad Shadloo told The Verge.

YouTube has historically struggled to prevent the spread of pandemic misinformation, which has racked up countless millions of views on the site throughout 2020.

A study released in September by the Oxford Internet Institute and the Reuters Institute, covering the period from October 2019 through June 2020, found that coronavirus misinformation videos on YouTube had been shared more than 20 million times on Facebook, Twitter, and Reddit. That outranked the combined reach of CNN, ABC News, BBC, Fox News, and Al Jazeera on those sites over the same period (15 million). The researchers were able to identify only 8,105 videos containing “covid-related misinformation” that YouTube removed in that period, which amounted to less than 1% of all coronavirus videos.

Interestingly, the researchers also found strong evidence that the primary driver of viral coronavirus videos on YouTube was Facebook, not subscribers to the YouTube channels themselves. That likely helps such content circumvent YouTube’s community standards enforcement, which relies heavily on user reports; Facebook has implemented some loophole-laden rules on anti-vax content in ads but has no rules against organic, unpaid anti-vax posts. From the study:

Misinformation videos shared on Facebook generated a total of around 11,000 reactions (likes, comments or shares), before being deleted by YouTube… The Oxford researchers also found that out of the 8,105 misinformation videos shared on Facebook between October 2019 and June 2020, only 55 videos had warning labels attached to them by third party fact checkers, less than 1% of all misinformation videos. This failure of fact-checking helped Covid-related misinformation videos spread on Facebook and find a large audience.

Oxford researchers observed that despite YouTube’s investment in containing the spread of misinformation, it still took YouTube on average 41 days to remove Covid-related videos with false information. Misinformation videos were viewed on average 150,000 times, before they were deleted by YouTube.

YouTube has also been a hub for anti-vax content more generally. While research last year (before the pandemic) found such content was on the decline, the anti-vax movement has hardly been forced off the site. A University of Pennsylvania Annenberg Public Policy Center study in February unsurprisingly found that people who relied on traditional media outlets to learn about vaccines were less likely to believe anti-vax claims than those who relied on social media. A recent Pew survey found that some 26% of U.S. adults get news on YouTube, and that the content they consume there is more likely to be laden with conspiracy theories.

Producers and consumers of misinformation are adept at evading crackdowns. According to Wired, YouTube’s internal teams tasked with hunting down and removing videos with false claims about the virus found that its recommendation system, which had been successfully tweaked in 2019 to promote significantly less conspiracy content, was playing a diminishing role in driving traffic to false claims about the coronavirus. Instead, they noticed a major uptick in the number of videos that were uploaded and then quickly promoted off-site via a “mix of organic link-sharing and astroturfed, bot-propelled promotion” on other sites like Facebook and Reddit.

YouTube told the Telegraph in September that the Oxford and Reuters study relied on out-of-date data. A spokesperson told the Guardian on Wednesday that the company has removed more than 200,000 videos since early February, though many of those could have been re-uploads, automatically generated clips, or videos posted in corners of the site where they had little chance of going viral in the first place.

Another recent study, by the Berkman Klein Center for Internet & Society at Harvard, found that social media played a secondary role in the spread of conspiracy theories about voting by mail; the main driver was false claims by Donald Trump and Republican allies that were then amplified by coverage in the traditional media. This appears to match findings by Oxford and Reuters researchers in April, who found that prominent public figures made just 20% of the claims in a sample of 225 statements rated false by fact checkers but generated 69% of the social media engagement.

Platforms including YouTube have had some success limiting the spread of certain misinformation efforts, such as the sequel to the infamous Plandemic video, which racked up more than 8 million views in May (the sequel’s release, however, was announced in advance, giving platforms time to prepare). In September, YouTube moved to delete clips from a Hoover Institution interview with White House coronavirus adviser Dr. Scott Atlas, who has sowed doubt about the effectiveness of social distancing and wink-wink, nudge-nudged the Trump administration toward a dangerous “herd immunity” strategy.

According to the Guardian, YouTube says it will announce more steps it is taking to limit the spread of misinformation about vaccines on its site in the coming weeks.
