In just the first quarter of this year, Meta claims it removed 2.5 million pieces of content tied to “organized hate.” The company, which reported $28 billion in revenue in its most recent quarterly earnings, applauded itself for investing around $5 billion in safety and security over the entirety of last year. As it did during the 2020 elections, Meta says it will prohibit new political ads during the final week leading up to the elections. Meta will also use its home pages to send notifications about voter registration as well as information on how and where to vote.

That all sounds well and good, but recent criticism of Meta’s handling of election information in Brazil calls the effectiveness of those safeguards into question. A new report published by the international NGO Global Witness claims Facebook was unable to detect explicit election-related disinformation. As part of its study, Global Witness submitted 10 Brazilian Portuguese-language ads: five containing blatant election misinformation and five “aiming to delegitimize the electoral process.” Global Witness says Facebook approved all 10 of the ads.

“Facebook knows very well that its platform is used to spread election disinformation and undermine democracy around the world,” Global Witness Senior Advisor Jon Lloyd said in a statement. “Despite Facebook’s self-proclaimed efforts to tackle disinformation—particularly in high stakes elections—we were appalled to see that they accepted every single election disinformation ad we submitted in Brazil.”

Global Witness says Facebook approved ads containing false information about when and where to vote, as well as incorrect information about methods of voting. Global Witness’s findings come on the heels of similar criticism of Facebook’s handling of political content in Myanmar, Ethiopia, and Kenya.

In a statement sent to Gizmodo, a Meta spokesperson did not refute the Global Witness findings but said the company is “deeply committed to protecting election integrity in Brazil and around the world.”

“We have prepared extensively for the 2022 election in Brazil,” the spokesperson said. “We’ve launched tools that promote reliable information and label election-related posts, established a direct channel for the Superior Electoral Court to send us potentially harmful content for review, and continue closely collaborating with Brazilian authorities and researchers. Our efforts in Brazil’s previous election resulted in the removal of 140,000 posts from Facebook and Instagram for violating our election interference policies and 250,000 rejections of unauthorized political ads.”

Facebook, and now by extension Meta, has justifiably eaten a heaping pile of shit over election misinformation since former president Donald Trump’s 2016 victory. Researchers and lawmakers condemned the platform for allegedly letting foreign actors manipulate its newsfeed with torrents of fake or misleading content. In a rare admission of fault, Meta executives have previously acknowledged they could have done more to bolster the platform.

To its credit, Facebook did implement an expansive list of new procedures and policies to try to limit misinformation on the site during the 2020 election, though research shows that didn’t stop potentially false information from exploding in popularity. While some groups extolled Facebook for the extra steps it took in 2020, particularly for removing over 100,000 posts attempting to “obstruct voting” and for its moratorium on political ads in the months following the election, its efforts weren’t universally praised.

A report released in March 2021 by Avaaz determined Facebook could have prevented an estimated 10.1 billion page views on 100 prominent pages known for spreading election misinformation if the company had not waited until October 2020 to adjust its algorithm. Avaaz estimates those pages managed to triple their monthly interactions on the platform between October 2019 and October 2020.

Facebook’s own internal research, revealed in The Facebook Papers, shows a majority of U.S. adults believe Facebook was at least partially to blame for the January 6 Capitol Hill attack. Other Facebook data collected in the immediate aftermath of the attack pinpointed then-President Donald Trump’s own account as principally responsible for a surge in reports of violations of its violence and incitement rules. Elsewhere, the documents reveal Facebook employees were aware of growing fears among U.S. users of being exposed to election-related misinformation.
