WASHINGTON (AP) — Facebook owner Meta is quietly cutting some safeguards designed to stop voting misinformation or foreign interference in U.S. elections as the November midterms loom.
It’s a sharp departure from the social media giant’s multibillion-dollar effort to improve the accuracy of posts about U.S. elections, and it comes amid growing concern among lawmakers and the public that the company exploited people’s data and allowed disinformation to spread during the 2016 political campaign.
The shift has raised concerns about Meta’s priorities and about how the world’s most popular social media platform is being used to spread misleading claims, operate fake accounts and stoke partisan anger.
“They’re not talking about it,” said Katie Harbath, a former Facebook policy chief who is now chief executive of technology and policy firm Anchor Change. “Best case: They’re still doing a lot behind the scenes. Worst case: They retreat, and we don’t know how that’s going to play out in the mid-term on the platform.”
Since last year, Meta has shut down an examination of how falsehoods are amplified in Facebook’s political ads by indefinitely banning the researchers behind it from the site.
CrowdTangle, an online tool the company provides to hundreds of newsrooms and researchers so they can identify popular posts and misinformation on Facebook or Instagram, is now broken on some days.
Public communication about the company’s response to election misinformation has also gone quiet. Between 2018 and 2020, the company issued more than 30 statements detailing how it would stifle U.S. election misinformation, prevent foreign adversaries from running ads or posts around voting, and curb divisive hate speech.
Top executives held question-and-answer sessions with reporters about the new policies. In Facebook posts, CEO Mark Zuckerberg pledged to remove false voting information, and he wrote an opinion piece calling for more regulation to address foreign interference in U.S. elections through social media.
But this year Meta has released only a one-page document outlining its plans for the fall election, even though the potential threats to voting remain clear. Several Republican candidates are promoting false claims about U.S. elections on social media. In addition, Russia and China have continued aggressive social media campaigns aimed at deepening political divisions among American audiences.
Meta said elections remain a priority, and policies developed in recent years around election misinformation or foreign interference are now closely tied to company operations.
“In each election, we incorporate what we’ve learned into new processes and establish channels to share information with the government and our industry partners,” said Meta spokesman Tom Reynolds.
He declined to say how many employees will be involved in the program this year to protect the U.S. election.
During the 2018 election cycle, the company provided tours and photos, and produced headcounts for its election response war room. But Meta disputed a report by The New York Times that the number of Meta employees involved in this year’s election had been reduced from 300 to 60.
Reynolds said Meta will pull in hundreds of employees from 40 of the company’s other teams to oversee the upcoming vote alongside the election team, whose size it has not disclosed.
The company is continuing many of its initiatives to limit election misinformation, such as a fact-checking program started in 2016 that, with the help of news outlets, investigates the veracity of popular falsehoods spread on Facebook or Instagram. The Associated Press is part of Meta’s fact-checking program.
This month, Meta also launched a new feature for political ads, allowing the public to search for details on how advertisers are targeting them based on their interests on Facebook and Instagram.
However, Meta stifled other efforts to identify election misinformation on its website.
It has stopped making improvements to CrowdTangle, the tool it provides to newsrooms around the world to surface insights about trending social media posts. Journalists, fact-checkers and researchers use it to analyze Facebook content, including tracking popular misinformation and who is responsible for it.
Brandon Silverman, the former CrowdTangle CEO who left Meta last year, told the Senate Judiciary Committee this spring that the tool is now “dead.”
Silverman told The Associated Press that CrowdTangle had been working on an upgrade to make it easier to search the text of internet memes, which are often used to spread half-truths and evade fact-checkers.
“There is no shortage of ways to organize this data so that it is useful to the fact-checking community, newsrooms, and many different parts of civil society more broadly,” Silverman said.
Not everyone at Meta agrees with this transparent approach, Silverman said. The company hasn’t rolled out any new updates or features for CrowdTangle in more than a year, and it has experienced hours-long outages in recent months.
Meta also stopped efforts to investigate how misinformation spreads through political advertising.
The company indefinitely revoked the Facebook access of two NYU researchers, saying they had collected unauthorized data from the platform. The move came after NYU professor Laura Edelson said she had shared with the company her plans to study the spread of disinformation on the platform around the Jan. 6, 2021, attack on the U.S. Capitol, which is now the subject of a House investigation.
“When we looked closely, we found that their system could be dangerous for many of their users,” Edelson said.
Privately, former and current Meta employees have said that exposing such dangers around U.S. elections has created a public relations and political backlash for the company.
Republicans have often accused Facebook of unfairly censoring conservatives, some of whom have been barred from the platform for violating the company’s rules. At the same time, Democrats have often complained that the tech company is not doing enough to curb disinformation.
“It’s so politically fraught, they’d rather avoid it than dive right in,” said Harbath, the former Facebook policy chief. “They just see it as a whole bunch of old headaches.”
Meanwhile, the prospect of U.S. regulation no longer hangs over the company, and lawmakers have failed to reach any consensus on what kind of oversight the multibillion-dollar company should be subject to.
Shaking off that threat, Meta’s leaders have devoted the company’s time, money and resources to a new project in recent months.
Zuckerberg plunged Facebook into a massive rebranding and restructuring last October, when he renamed the company Meta Platforms Inc. He plans to spend years and billions of dollars evolving his social media platforms into a nascent virtual reality construct called the metaverse — sort of like the internet brought to life, rendered in 3D.
His public Facebook posts are now focused on product announcements, touting artificial intelligence and photos of him enjoying life. News about election preparations was relegated to a company blog post not written by him.
In an October post, Zuckerberg defended the company after a former Facebook employee leaked internal documents showing how the platform amplifies hate and misinformation. He also reminded his followers that he had pushed Congress to modernize election regulations for the digital age.
“I know it’s frustrating to see the great work we do being mischaracterized, especially for those who make important contributions to safety, integrity, research and products,” he wrote on Oct. 5. “But I believe it will be better for our community and our business over the long term if we continue to work hard to do the right things and deliver experiences that improve people’s lives.”
It was the last time he discussed the Menlo Park, California-based company’s election efforts in a public Facebook post.
Associated Press technology writer Barbara Ortutay contributed to this report.