With less than 40 days until the U.S. presidential election, YouTube has rolled out a number of new tools to deal with voting misinformation on the platform.
Will these new tools actually fix the problem of fake news on the world’s biggest video website? Let’s start with the good stuff.
In its announcement on “authoritative voting information,” YouTube focuses on updates to its information panels feature. Last year, the company began rolling out information panels — text prompts that automatically appear at the top of the page when a user searches for certain keywords linked to conspiracy theories and misinformation — to provide fact-checks to its users. The feature finally launched in the U.S. in April.
One major note from this new update is that YouTube will now fact-check vote-by-mail misinformation the same way it does conspiracies about the coronavirus or the moon landing. Videos concerning voting by mail will be presented alongside an information panel fact-check provided by the Bipartisan Policy Center, a think tank.
In addition, when someone searches on YouTube for terms like “how to vote” or “how to register to vote,” an information panel will point that user to a Google link with details on how to do just that.
The Google link leads to the search engine’s own version of an information panel, known as a knowledge panel, which includes key voting and voter registration details right on the results page. When I try it while logged into my Google account, the search surfaces key dates and election information for New York state.
According to YouTube, this information comes from “non-partisan, third-party data partners,” such as Democracy Works. The data from these partners is sourced directly from state and county election officials.
This all seems like positive news so far. YouTube has taken a stronger stance against misinformation and conspiracy theories on its platform in recent years. In fact, the company has sometimes gone a bit too far, at one point demonetizing all coronavirus-related content before later reversing the policy.
The company has previously taken election-specific steps, too. Earlier this year, YouTube announced it would ban certain types of fake news content, such as videos falsely claiming that a living government official had died. It also took action against content making false claims about a political candidate’s eligibility, like birtherism videos.
However, there’s one new update in YouTube’s latest announcement that just seems weak.
YouTube’s information panels will now also appear for searches on 2020 presidential and federal congressional candidates. Like the voting information, this seems like a good idea, but in practice it falls flat: the information provided is simply too generic. Take a look at what comes up for President Donald Trump.
Not very informative.
YouTube also provides a Google link, trusting users to do their own research. For the voting topics, at least the Google link surfaces non-partisan voting information right on the search page. For candidates, there’s none of that. Basically, whoever has the best SEO wins the information battle here.
One more example: Take Marjorie Taylor Greene, the Republican nominee for Georgia’s 14th Congressional District. Here’s what YouTube’s information panel says about her.
What the panel leaves out is that she also believes in QAnon, a right-wing conspiracy theory claiming President Trump is at war with a satanic global pedophile ring run by the Democratic Party and Hollywood elites. It also omits that she’s a 9/11 truther and a coronavirus skeptic. And the first thing you see when you click the link to Google for more details isn’t a knowledge panel with any of this information. It’s a link to her own Twitter page, promoting her candidacy.
As we get closer to the U.S. election, misinformation is bound to ramp up. YouTube’s efforts to deal with it seem well-intentioned, but they leave us hoping for more.