Social media platforms brace for chaos during midterm elections

By DAVID KLEPPER | Associated Press

A Facebook search for the words “election fraud” first yields an article claiming that employees at a Pennsylvania children’s museum are brainwashing children into accepting stolen elections.

Facebook’s second suggestion? A link to an article from a site called MAGA Underground stating that the Democrats are planning to manipulate next month’s midterms. “You should still be mad about the fraud that happened in 2020,” the article emphasizes.

With less than three weeks before polls close, misinformation about voting and elections abounds on social media, despite promises from tech companies to tackle a problem blamed for deepening polarization and mistrust.

While platforms such as Twitter, TikTok, Facebook and YouTube say they have expanded their work to detect and stop harmful claims that could suppress the vote or even lead to violent confrontations, a review of some of the sites shows they are still catching up with 2020, when then-President Donald Trump’s lies about the election he lost to Joe Biden sparked the insurrection at the U.S. Capitol.

“You’d think they would have learned by now,” said Heidi Beirich, founder of the Global Project Against Hate and Extremism and member of a group called the Real Facebook Oversight Board that has criticized the platform’s efforts. “This is not their first election. This should have been addressed before Trump lost in 2020. The damage is quite deep right now.”

If these U.S.-based tech giants can’t properly prepare for a U.S. election, Beirich said, how can anyone expect them to handle elections abroad?

Mentions of a “stolen election” and “voter fraud” have skyrocketed in recent months and are now two of the three most popular terms in discussions about this year’s election, according to an analysis of social media, online and broadcast content conducted by the media intelligence firm Zignal Labs on behalf of The Associated Press.

On Twitter, Zignal’s analysis found that tweets reinforcing conspiracy theories about the upcoming election have been reposted many thousands of times, alongside posts reiterating debunked claims about the 2020 election.

Most major platforms have announced steps to curb voting and election misinformation, including labels, warnings, and changes to systems that automatically recommend certain content. Users who consistently break the rules may be banned. Platforms have also partnered with fact-checking organizations and news outlets such as the AP, which is part of Meta’s fact-checking program.

“Our teams continue to closely monitor the midterms and are working to quickly remove content that violates our policies,” YouTube said in a statement. “We remain vigilant before, during and after Election Day.”

Meta, the owner of Facebook and Instagram, announced this week that it had reopened its election command center, which oversees real-time efforts to fight election misinformation. The company rejected criticism that it is not doing enough and denied reports that it had reduced its number of election-focused employees.

“We are investing a significant amount of resources, with work spanning more than 40 teams and hundreds of people,” Meta said in an emailed statement to the AP.

The platform also said that starting this week, anyone who searches Facebook with keywords related to the election, including “election fraud,” will automatically see a pop-up window with links to trusted voting resources.

TikTok set up an election center earlier this year to help voters in the US learn how to register to vote and who’s on their ballot. The information is provided in English, Spanish and over 45 other languages. The platform, now a leading source of information for young voters, also adds labels to misleading content.

“Providing access to authoritative information is an important part of our overall strategy to combat electoral misinformation,” the company said of its efforts to prepare for the midterm elections.

But policies designed to stop harmful election misinformation are not always consistently applied. For example, false claims can often be buried deep in the comments section, where they can nevertheless leave an impression on other users.

A report released last month from New York University accused Meta, Twitter, TikTok and YouTube of amplifying Trump’s false statements about the 2020 election. The study cited inconsistent rules regarding misinformation and poor enforcement.

Concerned about the amount of misinformation about voting and elections, a number of groups have urged tech companies to do more.

“Americans deserve more than lip service and half measures from the platforms,” said Yosef Getachew, director of Common Cause’s media and democracy program. “These platforms are being weaponized by enemies of democracy, both abroad and at home.”

Disinformation about elections is even more prevalent on smaller platforms popular with some conservatives and far-right groups, such as Gab, Gettr and Truth Social, Trump’s own platform. But those sites have small audiences compared with Facebook, YouTube or TikTok.

Beirich’s group, the Real Facebook Oversight Board, drew up a list of seven recommendations for Meta aimed at reducing the spread of misinformation ahead of the election. They include changes to the platform that would promote content from legitimate news outlets over partisan sites that often spread misinformation, as well as more focus on misinformation targeting voters in Spanish and other languages.

Meta told the AP that it has expanded its fact-checking network since 2020 and now has twice as many Spanish-language fact-checkers. The company also launched a Spanish-language fact-checking tip line on WhatsApp, another platform it owns.
