Biden’s campaign set to counterpunch on misinformation

With social-media platforms pulling back from policing false political claims, and Trump gearing up for a fight, Biden’s 2024 campaign is rebooting its online defenses.

Joe Biden’s presidential campaign is overhauling its strategy to fight misinformation on social media in the 2024 race: recruiting hundreds of staffers and volunteers to monitor platforms, buying advertising to fight bogus claims, and pushing its own countermessages out through grassroots allies — with a bulldog aide helping lead the effort.

The change is driven by concern that social media companies are less willing to police political misinformation, and also by the risks of mistruths and attacks from Republican rival Donald Trump and other GOP candidates, according to interviews with five Biden campaign officials over the past several weeks.

One of the leaders of the fight against what it expects to be a flood of misinformation will be a controversial figure: Rob Flaherty, the former White House director of digital strategy, whose combative emails to social media firms have become part of a Republican-led federal court case and a congressional investigation. He’ll work with the campaign’s legal, communications and digital teams to fight false narratives during the race.

The new strategy offers a window into how campaigns will handle the fast-shifting online landscape of the 2024 election — and the increasingly precarious politics of pressuring social-media platforms to police misinformation.

Biden’s reelection campaign is expecting to combat a barrage of false claims by Trump and other GOP candidates about Biden’s past record; the White House’s Covid-19 vaccine push; and alleged efforts to suppress voter turnout.

Under pressure from conservatives to allow more open speech, platforms like X (formerly known as Twitter) have reinstated numerous far-right conservatives who had previously been kicked off for spreading false or harmful information, including Trump himself. Facebook and YouTube have followed suit. This past summer, YouTube announced it would stop removing content falsely claiming the 2020 election was stolen.

In that environment, Biden’s campaign advisers say they plan to rely less on companies’ willingness to police misinformation, and more on their own resources to counter it.

“The campaign is going to have to be more aggressive pushing back on misinformation from a communications perspective and filling some of the gaps these companies are leaving behind,” Flaherty, a deputy campaign manager for Biden’s 2024 reelection bid, told POLITICO.

The Biden administration is currently under scrutiny when it comes to social-media company outreach: White House employees, including Flaherty, are the subject of both a court case and a House committee investigation over the administration’s aggressive urging of platforms to take down posts on Covid-19 and the 2020 election.

Katie Harbath, previously Facebook’s public policy director and a Republican National Committee staffer, cautioned that the campaign needs to be careful in how hard it comes after social platforms — in part because GOP investigations in Congress are asking “very legitimate” questions about the White House’s past pressure on platforms to remove content.

“It doesn't feel great to have anybody trying to threaten their control or force platforms to be making moves,” she said.

A lightning rod gets a new role

Flaherty, who was promoted to the campaign leadership role in August, stands at the center of the political controversy around the fight against online misinformation. According to emails disclosed in a case filed in Louisiana by GOP state attorneys general, Flaherty harangued employees at Facebook and YouTube when he was at the White House, insisting the companies do more to combat rhetoric against the Covid-19 vaccine.

His aggressively worded messages have made him the target of conservative allegations that the White House and other Biden officials wrongly pressured private companies to take down internet speech.

The case itself is still a live issue: Several weeks ago, a federal court agreed that the Biden administration likely violated the First Amendment by coercing platforms to remove Covid-19 and election content, and issued an order limiting government officials from contacting the companies. Justice Samuel Alito temporarily paused that ruling last Thursday as the administration works on a formal request for the Supreme Court to block the order.

Republican House Judiciary Chair Jim Jordan is also demanding Flaherty sit for a deposition as part of a separate probe by his subcommittee on the same issue.

The Biden campaign said it couldn’t comment on the ongoing federal court case or the Jordan investigation.

Despite the controversy, Biden has continued to back Flaherty, who has served as his social media attack dog since joining his first campaign as digital director in 2019. Flaherty has found new ways to make his boss more engaging to young voters, recording TikTok and Instagram videos with celebrities like the Jonas Brothers and Olivia Rodrigo at the White House. Biden has continued to elevate Flaherty, making him the first digital strategy director to be named an assistant to the president. He praised Flaherty when he left in June, saying he “operated with unparalleled creativity, innovative spirit and a bias toward action.”

Public shaming and countermessaging

As the campaign gets rolling, the Biden campaign’s legal team still plans to use the traditional playbook of reaching out to social media companies and flagging content that violates the platforms’ policies.

But campaign staff said the better strategy has been to publicly shame companies for not enforcing their own misinformation policies. They plan to use Facebook to run paid ads countering false messages — something they began doing in the 2020 campaign after the platform stopped fact-checking politicians’ lies in 2019, and an effort they intend to significantly amplify in 2024.

According to campaign officials, the overhaul includes bulking up their ranks to hundreds of people by next spring — expanding their own communications, legal, digital and rapid response teams, and working with staff from the Democratic National Committee, state Democratic parties and on-the-ground volunteers — to monitor and quickly correct misinformation. They also plan to create and push out paid ads targeted at susceptible voters to counter any disinformation against Biden as a candidate. In addition, campaign staff said, they’ll rely on grassroots organizing and volunteers to refute false claims from opponents intended to suppress voter turnout, and work with media outlets to fact-check untruths.

The campaign is particularly focused on combating misinformation from leading Republican candidates, including Trump, whose campaign, supporters and personal accounts have consistently pumped out lies about the 2020 election results, as well as more personal attacks on Hunter Biden, who is currently under indictment on gun charges. The campaign is also homing in on Florida Gov. Ron DeSantis’ anti-vaccine rhetoric on Covid, including his latest push against the CDC’s recommendation that everyone under 65 get the updated Covid vaccine.

Tech industry observers aren’t surprised by the Biden camp’s switch in strategy. “The content moderation winds have shifted since 2020,” said Nu Wexler, who previously worked in policy communications for Google, Facebook, Twitter and for Democratic lawmakers.

Given the shift in the industry — and the political risks of heavy-handed pushback — both Wexler and Harbath suggested that counter-messaging is a better use of the campaign’s time and resources than directly pressuring platforms to enforce their policies.

The campaigns of Trump and DeSantis did not respond to a request for comment on their own mis- and disinformation policies headed into 2024.

Another misinformation threat that has evolved since the 2020 campaign is the sudden explosion of generative artificial intelligence tools like ChatGPT. Generative AI can be deployed by campaigns to produce deepfakes — manipulated images or videos intended to deceive a viewer — which could be used to sway voters in 2024. And the threat isn’t even hypothetical: a pro-DeSantis super PAC already used generative AI in an ad this summer.

So the campaign is preparing for what it sees as a likely deluge of misinformation from candidates and AI tools alike. Said Maury Riggan, the campaign’s general counsel: “As bad as the issue was in 2020, it’s only gotten more complex.”