Why the fight to counter false election claims may be harder in 2024

By Shannon Bond (NPR)
Nov. 10, 2023 7:48 a.m.

A voter fills out a ballot in Jackson, Miss., on Tuesday. Federal and local officials have worked closely with researchers to track rumors and conspiracy theories in recent elections, but that cooperation is fading under pressure from conservatives.

Brandon Bell / Getty Images

In Marion County, Fla., elections supervisor Wesley Wilcox has stopped using the word “misinformation.”


Not because lies or misleading rumors about elections are any less prevalent in his county than in the rest of the country. Wilcox says he regularly interacts with groups that aim to find what they see as rampant fraud in elections.

But Wilcox, who is a Republican, has had to shift his vocabulary to talk about those falsehoods because others in his party see the term as code for censorship of conservatives.

"In Republican circles, 'misinformation' is a dog whistle," Wilcox said. "All of a sudden, man, you got skewered if you even mention the word."

An election partnership that Wilcox helps lead has even stopped advertising a service that allows local officials to report false voting information online, for fear of conservative backlash — a sign of broader concerns of those who work to safeguard elections as America nears another round of presidential voting.

Experts say a campaign of legal and political pressure from the right has cast efforts to combat rumors and conspiracy theories as censorship. And as a result, they say, the tools and partnerships that tried to flag and tamp down on falsehoods in recent election cycles have been scaled back or dismantled. That's even as threats loom from foreign governments and artificial intelligence, and as former President Donald Trump, who still falsely claims to have won the 2020 contest, is likely to use the same tactics again as he pursues the White House in 2024.

Added Wilcox: "Everybody is gun-shy."

“Open season”

As Nina Jankowicz sees it, the opening salvo came in the spring of 2022, when a right-wing campaign quickly snuffed out a Department of Homeland Security initiative called the Disinformation Governance Board.

Jankowicz, who has written books about Russian information operations and online harassment, was tapped to lead the board. The federal government described it as a working group, without "any operational authority or capability," tasked with coordinating efforts to identify false and misleading claims and share facts about security concerns, from elections to natural disasters.

But the combination of the board's ominous-sounding name and DHS's poor efforts to communicate its purpose made it catnip for right-wing influencers, who quickly seized on the board and Jankowicz herself as avatars of a nefarious plot to censor Americans.

"The Biden Administration's decision to stand up a 'Ministry of Truth,' is dystopian in design," said now-House Speaker Mike Johnson, R-La., shortly after the board was announced. "The government has no role whatsoever in determining what constitutes truth or acceptable speech."

Supporters of former President Donald Trump look at merchandise ahead of his rally in Hialeah, Fla., this week. Trump continues to falsely claim he won the 2020 election, fueling conspiracy theories around the voting process.

Ricardo Arduengo / AFP via Getty Images

After a barrage of death threats and abuse, Jankowicz resigned, and DHS scrapped the board altogether. Jankowicz told NPR that the timid effort by the federal government to defend her or push back against the allegations sent a clear message.

"That showed ... that it was open season on researchers, on civil servants, on anyone who was working in this space," Jankowicz said.

Amid that furor, in May 2022, the Republican attorneys general of Missouri and Louisiana filed a lawsuit accusing the Biden administration of colluding with social media companies to censor conservative speech, by pressing platforms to take action on misleading posts about COVID-19 and elections.

This July, a Trump-appointed federal judge ruled the government had likely violated the First Amendment and issued a sweeping injunction blocking agencies' communications with platforms about most content. The injunction was narrowed by an appeals court, before being put on hold last month by the Supreme Court, which is slated to hear the case this term.

Pressure is coming from Congress as well, where the House Judiciary Committee's Select Subcommittee on the Weaponization of the Federal Government, led by GOP Rep. Jim Jordan of Ohio, a Trump ally, is conducting its own probe into alleged collusion between the Biden administration and tech companies to unconstitutionally shut down political speech.

To be sure, there is an open debate about what role the government should take in countering rumors or lies about high-risk subjects like elections and public health, and widespread skepticism about the power social media companies wield over public discourse.

But the government, platforms and researchers say the lawsuit and investigation unfairly mischaracterize their communications. They say that while officials and outside groups flag content they believe may break the social networks' rules, it's ultimately up to the tech companies to decide what action to take.

Rep. Jim Jordan, R-Ohio, center, is leading a congressional investigation into what he and other conservatives describe as a joint effort by the federal government, researchers and tech companies to censor conservative points of view. He's seen here after a March hearing with Rep. Matt Gaetz, R-Fla., left, and now-Speaker Mike Johnson, R-La.

Manuel Balce Ceneta / AP

Jordan is subpoenaing researchers and social media companies, demanding years of email correspondence and conducting hours-long interviews, which his staff has used to make explosive accusations against federal agencies, nonprofit organizations and academic institutions.

"What the federal government could not do directly, it effectively outsourced to the newly emerging censorship-industrial complex," committee staff wrote in a report published this week.

"The Committee's ongoing investigation focuses on the federal government's involvement in speech censorship, and the investigation's purpose is to inform legislative solutions for how to protect free speech," said Nadgey Louis-Charles, a spokesperson for the House Judiciary Committee.

The rise of content moderation, and the backlash

Election misinformation isn't a new phenomenon; there have been false and misleading claims as long as there have been elections.

But efforts by Russia in 2016 to use social media to meddle in the U.S. presidential election were a wake-up call for tech companies and the government about the power of online platforms to amplify rumors, conspiracy theories and lies.

Under public pressure, Facebook, Twitter and other social media sites created policies prohibiting false election claims and set up teams devoted to monitoring abuse, from cracking down on posts telling people the wrong day to vote to disrupting coordinated influence operations backed by foreign governments.

At roughly the same time, in January 2017, DHS designated the country's voting systems as critical infrastructure, meaning that keeping local and state systems safe from hostile actors was officially under the federal government's purview.

But it was becoming increasingly clear that foreign adversaries were also targeting the minds of American voters, so the federal government started wading into the thorny world of online information. Ahead of the 2018 midterms, DHS began coordinating with social media companies around election misinformation.

The tech platforms got more aggressive about policing content, with moderation reaching what some experts argue was a peak in 2020 and 2021, amid the COVID-19 pandemic and the 2020 presidential election. They began labeling false and misleading claims, including, controversially, posts by then-President Trump. After the Jan. 6 attack on the U.S. Capitol, the big platforms all suspended Trump.

The rise of those policies, along with high-profile disputes over how they have been applied, sparked a backlash from conservatives and some disaffected liberals who say platforms went too far in limiting speech.

Efforts moved to the back burner, and a rebranded ‘Rumor Control’

Wilcox, the election administrator in Florida, sits on the executive committee of a nationwide partnership for election officials called the Elections Infrastructure Information Sharing and Analysis Center (EI-ISAC) that was created after the 2016 election.

But he says he's had to downplay that work when talking to voters because the EI-ISAC is funded by DHS, which has become the target of conspiracies. One conspiracy theory info packet viewed by NPR, acquired in a records request in Texas by the transparency group American Oversight, detailed how the EI-ISAC was "likely manipulating our County vote tallies."


"I have to be very careful," Wilcox said. "I can't even go out locally and tell anybody that I sit on this board because DHS is the bad guy."

Voters cast ballots in Louisville, Ky., this week. In recent years, local election officials have come to rely on a variety of mechanisms to share information about rumors and conspiracy theories about elections.

Michael Swensen / Getty Images

One of the functions the EI-ISAC took on around 2020 was to essentially give thousands of election officials across the U.S. somewhere central to report false and misleading information. The group then disseminated that information to researchers at a few universities for narrative tracking and to the tech companies for review to see if it violated their policies, through an effort called the Election Integrity Partnership.

Those efforts, which took place under the Trump administration, have become the subject of claims by Jordan's weaponization subcommittee that they're part of a coordinated push by the government to "conduct censorship by proxy."

Wilcox says that today, the EI-ISAC is no longer advertising that service. The group is back to focusing strictly on cybersecurity.

"It's still out there, but it's really run back to a back burner," Wilcox said. "[DHS] has kind of shied away from even those types of things, just because of the exposure."

The agency's cybersecurity and infrastructure security arm, CISA, confirmed to NPR that it had no contact on Tuesday's Election Day with any social media companies.

Another feature of DHS's election security work that seems to be fading into the background is a website called "Rumor Control." As votes were being counted in 2020, the site served almost as a live-debunking tool, responding to election misinformation in real time.

Kathy Boockvar oversaw elections in 2020 in Pennsylvania as its secretary of the commonwealth, and she remembers leaning on Rumor Control when she began getting questions from Republican state legislators in her state about a conspiracy theory in Arizona involving Sharpies.

"I remember specifically that CISA had a section on the Rumor Control website to explain that there was no -gate in Sharpiegate," Boockvar said, referring to a rumor that spread on election night 2020 in Arizona that using permanent markers on ballots would bleed through and invalidate votes. "I was actually able to share something that had the credibility of the federal government security agency — and frankly, it was the federal government under President Trump."

Boockvar, who has since started a consulting firm focused on improving elections, noticed during last year's midterm elections that Rumor Control was not being updated as frequently and seemed to be more limited in its scope. The site has since been rebranded: rather than focusing on "jurisdiction-specific claims," it now explains election processes more broadly.

CNN also reported last year that CISA turned down a multimillion-dollar proposal to protect election officials from harassment ahead of the midterms.

During a background briefing call with reporters on Tuesday, a senior CISA official downplayed concerns from local election officials that CISA was being more hands-off when it comes to misinformation.

"When it comes to foreign influence operations and disinformation, this is an area where we remain committed to ensuring that state and local election officials are provided with the techniques and tactics and procedures that we know foreign adversaries are using so that they have awareness of the threats," the official said. "We continue to amplify election officials as trusted sources of information."

But experts worry that local election officials often don't have the resources or time to combat false information online, especially in a rapidly changing technological environment, in addition to their other responsibilities.

"There is only so much you can ask an election official to do and to do well. And we are nearing capacity on people," Wilcox said. "You want me to be a cybersecurity expert? You want me to be a database expert? Now I've got to be an AI expert. I'm like, I'm sorry. At some point, I got to tap out and say, 'I just don't have it.'"

“Weaponized criticism”

The pressure campaign and the uncertainty caused by the paused legal injunction have also chilled communications between people who track and study the spread of rumors and false narratives and the platforms themselves.

Academics and researchers who participated in the Election Integrity Partnership in 2020 have been targeted by Jordan's probe, as well as a private lawsuit brought by America First Legal, an organization run by former Trump adviser Stephen Miller. Those involved say it's unlikely the partnership will have the same level of information sharing next year.

"The weaponized criticism of research on misinformation is having a negative impact on our ability to understand and address what many of us feel to be a pretty large societal problem," said Kate Starbird, a co-founder of the University of Washington's Center for an Informed Public, one of the members of the EIP. She's been the target of harassment and threats over that work.

"They are limiting our ability just to do the research, because we're spending too much time with our lawyers, but also limiting our ability to have conversations with the people who need our work the most, whether that's election officials or whether that's social media platforms," she said.

Starbird says despite the pressure, her UW team is continuing its work and plans to do rapid response to viral rumors once again in 2024. That means they will be tracking, analyzing and communicating — even if it looks different than in previous cycles.

"How do we make sure people can use our outputs, even people that can't maybe talk to us directly because they're afraid about being given a lawsuit or being tied into the next conspiracy theory?" she said.

Dialogue between the people running elections, researchers and the tech companies is critical, said Eddie Perez, who oversaw Twitter's election integrity work before leaving in November 2022 after Elon Musk purchased the company. He's now a board member at the OSET Institute, a nonprofit focused on election infrastructure.

"There are moments where [the platforms] need to rely on substantive information from subject matter experts," Perez said. He cites the Sharpiegate rumors in 2020 that Boockvar also mentioned and which Rumor Control knocked down.

"It's rare for a social media company to have on board somebody that is that familiar with voting technology, for example, to be able to say, 'Look, here's the real story,'" Perez said. "They have an interest in trying to get the answers really quickly, because by definition, when something is viral, the damage is done very quickly."

Social media platforms step back from the spotlight

The headquarters of X, formerly Twitter, in San Francisco. The company has largely stopped trying to police viral lies on its site and has dismantled teams that deal with election integrity.

Justin Sullivan / Getty Images

Worsening matters is the pervasive sense that social media platforms are backing off efforts to more aggressively police and counter false and misleading election claims and conspiracy theories. Layoffs in the tech industry have hit teams working on trust and safety, leaving many working on election integrity uncertain about who to talk to.

"[The platforms] would reach out to us all the time. And that's just not happening. It's gone silent," Starbird said.

Meta, the owner of Facebook and Instagram, still has channels for election officials to reach out and teams responsible for different states, according to a person familiar with the company's work.

YouTube said it continues to invest in making the video platform a reliable source for election news and information and prohibits content that misleads people on how to vote or encourages interference in the democratic process.

Both companies declined to comment on their interactions with government officials and researchers.

The most drastic pullback has come at X, formerly Twitter, where Musk has eliminated the election integrity team entirely, unwound many of the platform's policies, and made changes that critics say amplify the spread of harmful content. Musk has also made it difficult for researchers to study the platform and even sued a nonprofit research group whose work he objected to.

A press email address for X responded to a request for comment with an auto-reply saying "Busy now, please check back later."

Boockvar said even in 2020, it was difficult enough getting the platforms to respond quickly to bad information as it spread.

"If they're taking a step back, or 10 steps back. ... We're basically empowering the disinformation to spread," she said.

Copyright 2023 NPR. To see more, visit https://www.npr.org.