Facebook has a midterm strategy. Trump won’t be part of it.
Even if the former president declares he’s a candidate again, the platform says it won’t speed up its decision on whether to reinstate his account.
Facebook will not move up its timeline for reviewing its decision to suspend Donald Trump, regardless of whether he announces he’s running again for president, a top company executive told POLITICO.
In sticking to its January timetable, Facebook has decided to keep Trump off the world’s largest social media platform even if he becomes a declared presidential candidate before then. The platform’s timeline will also be unaffected by the recent FBI search of Trump’s residence in Florida.
Trump is widely expected to run in 2024, with an announcement possible before the midterms.
“We’re going to stay on that timeline,” Nick Clegg, president of global affairs at Meta, Facebook’s parent company, said in an interview. Facebook blocked Trump following posts the company said violated its incitement of violence policy during the deadly riot at the Capitol on Jan. 6, 2021. The company later set a date of Jan. 7, 2023, for a decision on whether to reinstate him.
If Trump announces that he’s running for president in 2024, it may increase outside pressure on Facebook to make a call more quickly. Many Republicans have already argued that the company is unfairly silencing Trump on a platform used by millions of Americans, while his potential opponents face no similar restrictions. Meanwhile, some of Trump’s critics have called for a permanent ban.
The debate over how to handle a potential Trump candidacy is also an indication of the new political struggles ahead for social media platforms both in November and in 2024 as they try to avoid a repeat of the misinformation that plagued the 2020 vote and helped fuel the violence in its aftermath.
Clegg’s remarks came as Facebook released a plan for addressing advertising and misinformation in the midterms — an approach that falls largely in line with its handling of the 2020 election.
It was the latest in a series of announcements by embattled social media companies about their preparations for the fall elections. Less than three months before the midterms, Twitter last week announced it’s starting to label false information about voting — which it last deployed ahead of the 2020 election — and Google updated its algorithms to prioritize search results from authoritative sources.
Meta and other social media companies have remained under intense scrutiny for their role in the spread of mis- and disinformation leading up to the 2016 presidential election, when Russian-linked accounts bought $150,000 worth of ads on Facebook alone to influence election results. As a result, Facebook, Twitter and Google’s YouTube have deployed new election-related disinformation policies — revised in the 2018 and 2020 election cycles — for fact-checking and labeling mistruths about voting and election results.
Those new policies were put to the test during the attack on the U.S. Capitol on Jan. 6, 2021. Following multiple incendiary posts that day, all three platforms blocked Trump for violating their policies against inciting violence. Twitter permanently banned Trump and YouTube said it would indefinitely block his account.
Without access to his typical megaphones, Trump launched his own social media network, Truth Social, though he’s failed to amass the following he previously had on Facebook.
Facebook’s approach to the midterms will be familiar to anyone who was using the site in 2020. As it did that year, the company will block new political, electoral and issue-based ads during the final week of the midterm campaign. But unlike in 2020, Clegg said, the company won’t allow any edits to existing ads, or to how they are targeted, during that final week.
The company plans to lift the restriction a day after the election. This differs from the 2020 election, when Facebook didn’t accept new political, electoral or issue ads until March 4, 2021 (except for those in a Georgia Senate runoff) to prevent confusion and abuse following the presidential election and Jan. 6 insurrection. Clegg said Facebook isn’t planning to extend the ad ban this time, but “if circumstances change then we need to change our posture as well, and we obviously reserve the ability and the right to do that.”
Also, as in 2020, the company will remove misinformation related to voting — including posts about incorrect dates, times and locations for voting, as well as mistruths about who can vote and calls for violence related to voting, registration or an election outcome. It is working with 10 fact-checking partners in the U.S. to address viral misinformation, including five covering Spanish-language content. This marks an increase from just three Spanish-language groups in 2020 and appears to be an acknowledgment that false content also spreads in non-English languages on the platform.
“I think there was quite rightly a lot of scrutiny about how we tackle viral information in Spanish, as well as English,” Clegg said.
Clegg, a former deputy prime minister of the United Kingdom, cast Facebook’s preparedness for the midterm elections as a world apart from 2016, when Facebook and other social media companies were pilloried for allowing Kremlin-linked trolls to abuse their platforms. He said the company’s “state of vigilance” is “far, far in excess of what we deployed the last time there were midterms, in 2018. But I think it’s appropriate given the circumstances as they’ve changed since then.”
“Is it perfect? Is it foolproof?” Clegg asked. “Politics mutates all the time, the way people campaign mutates all the time. … My crystal ball is no clearer than yours about exactly how things are going to unfold. But in terms of policies, commitment, resources, headcounts, ingenuity, I think we are just … I would go as far to say we’re basically a different company to what we were back in 2016.”
Trump has maintained his lie that the 2020 election was stolen, which has been an animating factor in Republican primaries across the country.
Asked if he considers the former president more or less of a risk to public safety than the company did at the time the ban was enacted, Clegg said, “Look, I work for an engineering company. We’re an engineering company. We’re not going to start providing a running commentary on the politics of the United States.”
Of Trump’s ban, he said the company “will look at the situation as best as we can understand it” but that “getting Silicon Valley companies to provide a running commentary on political developments in the meantime is not really going to … help illuminate that decision when we need to make it.”