A once-robust alliance of federal agencies, tech companies, election officials and researchers that worked together to thwart foreign propaganda and disinformation has fragmented after years of sustained Republican attacks.
The GOP offensive started during the 2020 election as public critiques and has since escalated into lawsuits, governmental inquiries and public relations campaigns that have succeeded in stopping almost all coordination between the government and social media platforms.
The most recent setback came when the FBI put an indefinite hold on most briefings to social media companies about Russian, Iranian and Chinese influence campaigns. Employees at two U.S. tech companies who used to receive regular briefings from the FBI’s Foreign Influence Task Force told NBC News that it has been months since the bureau reached out.
In testimony last week before the Senate Homeland Security Committee, FBI Director Christopher Wray signaled a significant pullback in communications with tech companies and tied the move to rulings by a conservative federal judge and an appeals court that said some government agencies and officials should be restricted from communicating and meeting with social media companies to moderate content. The case is now on hold pending Supreme Court review.
“We’re having some interaction with social media companies,” Wray said. “But all of those interactions have changed fundamentally in the wake of the court rulings.”
Wray didn’t elaborate, but sources familiar with the matter told NBC News that all the FBI’s interactions with tech platforms now have to be pre-approved and supervised by Justice Department lawyers.
The FBI told the House Judiciary Committee that, since the court rulings, the bureau had discovered foreign influence campaigns on social media platforms but in some cases did not inform the companies about them because they were hamstrung by the new legal oversight, according to a congressional official.
“This is the worst possible outcome in terms of the injunction,” said one U.S. official familiar with the matter. “The symbiotic relationship between the government and the social media companies has definitely been fractured.”
The FBI declined to comment.
More than a dozen current and former government and tech employees who have been involved in fighting online manipulation campaigns and election falsehoods since 2020 echoed those concerns. Most agreed to speak only on the condition that they not be named, citing the current climate of harassment against people who work in election and information integrity.
A common theme among those interviewed: the chilling effect that Republican attacks have had on the sharing of information about possible interference, a chill that could make it easier for foreign adversaries to manipulate U.S. public opinion and harder for 2024 voters to sort out what’s real from what’s fake.
Beyond the FBI briefings, other coordination efforts have folded after facing pressure from conservatives. The Cybersecurity and Infrastructure Security Agency (CISA), which oversees federal election cybersecurity and has become a favorite target of Republicans, has halted its outreach to Silicon Valley, and the Department of Homeland Security has shuttered a board designed to coordinate its anti-disinformation programs.
“Some of these efforts really are designed to isolate people and make them feel like they can’t communicate with CISA, like they can’t communicate with their peers in other states,” a person who works in state election administration said.
“People feel that things are really, really fraught, and common sense does not rule today,” the person added.
Some politicians are sounding the alarm. Sen. Mitt Romney, R-Utah, said efforts to stop foreign manipulation of U.S. politics are well within the government’s remit.
“I understand we don’t want to interdict constitutionally protected speech, but what is constitutionally protected speech?” he said. “Certainly foreign agents don’t have constitutionally protected speech because they’re not subject to our Constitution. I presume bots don’t have constitutionally protected speech. American citizens do.”
Microsoft recently said it expects Russia, Iran and China to engage in sophisticated influence operations ahead of the 2024 election.
Sen. Mark Warner, D-Va., the chairman of the Senate Intelligence Committee, who has vocally pushed for election security coordination since 2016, told NBC News he had “grave concerns” about setbacks to the system that defends against social media and election manipulation.
“We are seeing a potential scenario where all the major improvements in identifying, threat-sharing, and public exposure of foreign malign influence activity targeting U.S. elections have been systematically undermined,” Warner said.
Before 2016, there was little political will in the U.S. for the government or for tech companies to share intelligence with each other or protect voters from foreign influence campaigns. That year, Russia launched a multifaceted interference campaign that included the Kremlin-tied Internet Research Agency reaching tens of millions of Facebook and Twitter users. Hackers working for Russian intelligence stole and leaked emails from Hillary Clinton’s presidential campaign, probed an election machine company and stole voter information from the state of Illinois.
In the aftermath, President Barack Obama’s outgoing secretary of homeland security declared elections to be critical infrastructure, a move that drew immediate criticism from conservative election officials. Congress voted for the Department of Homeland Security to spin out its cyber and infrastructure protection efforts into CISA.
Meanwhile, the FBI created the Foreign Influence Task Force, meant to act as an intermediary that ferried information between the U.S. intelligence community and tech companies. The National Security Agency declined to comment for this story, but its director said in 2022 that the agency had fed intelligence about foreign propagandists to the task force to share with tech platforms.
CISA started holding its own meetings with tech companies, briefing them on election administration nuances and helping set up a “switchboard” system to flag election falsehoods online. The new system allowed a local election official to, for example, communicate to Facebook that a local group was directing people to the wrong polling site, in violation of the company’s policies.
These partnerships between government, corporations and legal and academic researchers were praised after 2020 as a crucial part of ensuring a secure election.
After the election, which Joe Biden and the Democrats won, President Donald Trump and many other conservatives refused to accept the loss and lashed out at political enemies. They targeted a number of election integrity operations, including the channels that shared information on disinformation, often accusing them of censoring conservative voices.
Many of them focused on Twitter’s and Facebook’s decision to temporarily limit the reach of a New York Post story about Biden’s son, Hunter. The story, published a few weeks before the election, struck the tech platforms as echoing Russia’s 2016 leak of Hillary Clinton’s emails. Facebook CEO Mark Zuckerberg said FBI statements about certain threats fit the pattern of the Hunter Biden story, though both companies later said the agency didn’t specifically say the Biden emails were a foreign intelligence campaign. Digital forensics experts have verified that at least some of those emails were authentic, but much remains unknown about the origins of the files.
Since then, Republicans have sent many election integrity efforts into retreat.
Last year, the attorneys general offices of Missouri and Louisiana filed a joint lawsuit against the Biden administration, alleging that federal government outreach to tech companies about content on their platforms — including law enforcement tips about election integrity and Covid-19 — constituted intimidation and a violation of First Amendment protections of free speech.
The case is still winding its way through the courts. Last week, the Supreme Court blocked a lower court’s ruling in favor of the conservative states’ case while it hears an appeal.
Other efforts have been stopped before they could get started. In March 2022, the Department of Homeland Security created a board to help coordinate its own response to viral falsehoods, prompting outcry from conservatives who claimed the government was policing speech. Nina Jankowicz, a researcher who studies disinformation and technology, was brought in to run the board and quickly became the target of a debilitating harassment campaign. Homeland Security shut down the board five months later.
Jankowicz said that the decision likely sent a message to federal workers that they might face retaliation for speaking out in a way that upset vocal critics.
“If you’re the one who’s raising the alarm about foreign interference or about something that is disenfranchising people and letting the platforms know, but it might cost you your job or your safety and security, you think twice about doing that,” Jankowicz said.
Biden’s head of CISA, Jen Easterly, a decorated intelligence official who had no prior experience in a public government role, started her tenure with optimism that her agency would play a major role in confronting disinformation.
“One could argue we’re in the business of critical infrastructure, and the most critical infrastructure is our cognitive infrastructure, so building that resilience to misinformation and disinformation, I think, is incredibly important,” she said at a talk hosted by Wired magazine in her first year on the job.
But Easterly, who frequently characterizes herself as nonpartisan, soon withdrew the agency from roles that most actively fought disinformation. Aside from maintaining a webpage that corrects common misconceptions about how elections work, CISA now focuses more on goals like protecting poll workers’ physical safety, connecting election officials with cybersecurity resources, and pushing software companies to do a better job building secure programs.
CISA stopped briefing platforms about how U.S. elections are administered after the 2022 midterms, a current CISA employee said, though they did not attribute the move to Republican pressure. Two people familiar with the agency said Easterly had pulled back from outreach to social media companies after being surprised by the severity of conservatives’ attacks.
Republican demonization of the agency hasn’t abated. Since the GOP took the House of Representatives in 2022, the House Judiciary Committee, led by Rep. Jim Jordan, R-Ohio, has spent much of this term focused on grievances from the 2020 election. It subpoenaed Easterly earlier this year, then issued a report that claimed “CISA metastasized into the nerve center of the federal government’s domestic surveillance and censorship operations on social media.”
In a podcast interview on “On with Kara Swisher” in June, Easterly explained that CISA will also no longer help flag state and local election officials’ concerns to social media companies.
“I need to ensure we are able to do our core mission, to reduce risk to critical infrastructure. And at this point in time, I do not think the risk of us dealing with social media platforms is worth any benefit, quite frankly,” Easterly said, though she did not specify the source of the risk.
“I made a decision not to do that. So we are not doing that. Local election officials can give that to the platforms themselves, and I think that’s the right place for us to be,” she said.
Through an agency spokesperson, Easterly declined to be interviewed. In an emailed statement, she said: “Election security is one of CISA’s top priorities, and along with our interagency partners, we’re fully focused on supporting state and local election officials as they prepare for the 2024 election cycle. As we have since 2017, CISA will continue to lean forward and do our part to ensure the American people can have confidence in the security and resilience of our most sacred democratic process.”
Meanwhile, some platforms have cut back on trust and safety teams. Tech companies are still sharing their findings with each other, a Meta spokesperson told NBC News. The exception is X, whose owner, Elon Musk, released a giant cache of emails and company documents related to its previous trust and safety efforts and made deep cuts to its trust and safety and election integrity teams. During the Israel-Hamas war, X has let terror videos from Hamas go viral and linger on the site for days.
One current X employee, who wasn’t authorized to speak for the company, said they had no remaining faith that the company could handle propaganda campaigns.
“The company no longer has the team, the tools, or the capabilities to identify and mitigate these attacks,” they said.