By: Ilma Hasan
February 2, 2024
Meta must address the potential abuse of its platforms by influential users who exploit loopholes during election periods, Pamela San Martín, a member of Meta's Oversight Board, told WIRED in a recent interview.
The Oversight Board, an independent body of experts, reviews and advises on content moderation decisions across Meta's platforms. San Martín pointed to the lessons of events like the January 6, 2021, Capitol riot in Washington, D.C., which was fueled by online vitriol and conspiracy theories spread by the "Stop the Steal" movement.
She said that this year, which will see high-stakes elections in over 50 countries, "Meta has gone through in different countries how their own algorithms, their own newsfeeds, their own recommendation systems, their own political ads can play a part in the protection or the disruption of electoral processes. Meta has to address the different issues that arise in all the different elections—including the U.S., but not only the U.S. (sic)."
San Martín criticized Meta for not adequately addressing the harms of coordinated campaigns, citing the insurrection that followed the 2020 U.S. presidential election and the January 2023 storming of government buildings in Brazil by supporters of former President Jair Bolsonaro, who refused to accept his 2022 election defeat. She told WIRED that the Board has repeatedly urged Meta to address coordinated campaigns as part of its election integrity measures.
Misinformation is widely seen as one of the gravest threats facing the world over the next two years; the World Economic Forum's 2024 Global Risks Report ranks misinformation and disinformation as the most severe short-term global risk. Some of Meta's decisions, such as declining the Oversight Board's recommendation to suspend Cambodian Prime Minister Hun Sen's Facebook account after he posted a video containing violent language, have drawn criticism for potentially encouraging other political leaders to misuse the platform.
"This was a lost opportunity," San Martin told WIRED. "It's hard to imagine a clearer case of a political leader weaponizing social media to amplify threats, to silence his opposition, to intimidate the political opposition. And what we were seeking was for Meta to have clear guidance to set out regarding the process it would adopt to deter public figures from exploiting its platforms to threaten and silence political opposition and incite violence."
Despite some steps, such as barring political advertisers from using its generative AI ad tools, Meta has allowed political ads questioning the legitimacy of the 2020 U.S. election results to run on Facebook and Instagram, the Washington Post reported in November 2023. This, the report said, is part of "several changes the social media company and other platforms have made to loosen constraints on campaign advertising for 2024."
Another concern is Meta's elimination of more than 21,000 jobs since November 2022, which has raised doubts about its content moderation capacity, especially during election periods. Although Meta says it has 40,000 staff working on safety and security, experts argue that the layoffs "threaten democracies," particularly in the global south, where content in regional languages is often overlooked, Newslaundry reported.
"We still have problems with inadequate staffing in the underinvested countries, many of which will have elections this year. We are living through a worldwide democratic backlash. Meta has a heightened responsibility, especially in the global south, where its track record has been poor in living up to these expectations. Meta must have adequate linguistic and cultural knowledge, and the necessary tools and channels to escalate potentially violating content," San Martin told WIRED.
For the 2024 elections, Meta plans to block new political ads during the final week of the U.S. campaign, require advertisers to disclose AI-generated alterations, and combat both foreign and domestic influence operations.