WASHINGTON (AP) — Facebook owner Meta is quietly cutting some safeguards designed to stop voting misinformation or foreign interference in U.S. elections as the November midterms loom.
It’s a stark departure from the social media giant’s multibillion-dollar effort to improve the accuracy of posts about U.S. elections and to regain the trust of lawmakers and the public, who were outraged to learn that the company had exploited people’s data and allowed falsehoods to flood its site during the 2016 campaign.
The shift has raised concerns about Meta’s priorities and about how some might use the world’s most popular social media platform to spread misleading claims, open fake accounts and rile up partisan extremists.
“They’re not talking about it,” said former Facebook policy chief Katie Harbath, now CEO of the technology and policy firm Anchor Change. “Best case: They’re still doing a lot of stuff behind the scenes. Worst case: They pull back, and we don’t know how that’s going to play out on the platform in the midterms.”
Since last year, Meta has shut down an examination into how misinformation is amplified in political ads on Facebook by indefinitely banishing the researchers behind the work from the site.
CrowdTangle, the online tool the company offered to hundreds of newsrooms and researchers so they could identify trending posts and misinformation on Facebook or Instagram, is now inoperable on some days.
Public communication about the company’s response to election misinformation has gone decidedly quiet. Between 2018 and 2020, the company issued more than 30 statements detailing how it would stifle misinformation about U.S. elections, prevent foreign adversaries from running ads or posts around voting, and stamp out divisive hate speech.
Top executives held question-and-answer sessions with reporters about new policies. CEO Mark Zuckerberg wrote Facebook posts promising to take down false voting information and authored opinion pieces calling for more regulation to address foreign interference in U.S. elections through social media.
But this year Meta has released only a one-page document outlining its plans for the fall election, even as potential threats to the vote remain clear. Several Republican candidates are promoting false claims about U.S. elections on social media, and Russia and China have continued aggressive social media campaigns aimed at further deepening political divisions among American audiences.
Meta said elections remain a priority, and policies developed in recent years around election misinformation or foreign interference are now closely tied to company operations.
“In each election, we incorporate what we’ve learned into new processes and establish channels to share information with the government and our industry partners,” said Meta spokesman Tom Reynolds.
He declined to say how many employees will be involved in the program this year to protect the U.S. election.
During the 2018 election cycle, the company offered tours and photos and produced head counts for its election response war room. But Meta disputed a New York Times report that the number of Meta employees working on this year’s election had been cut from 300 to 60.
Reynolds said hundreds of employees from more than 40 of the company’s teams will join the election team to oversee the upcoming vote, though he would not give a total number.
The company is continuing many of its initiatives to limit election misinformation, such as a fact-checking program launched in 2016 that, with the help of news outlets, investigates the veracity of popular falsehoods spread on Facebook or Instagram. The Associated Press is part of Meta’s fact-checking program.
This month, Meta also launched a new feature for political ads, allowing the public to search for details on how advertisers are targeting them based on their interests on Facebook and Instagram.
However, Meta stifled other efforts to identify election misinformation on its website.
It has stopped making improvements to CrowdTangle, the tool it provides to newsrooms around the world for insights into trending social media posts. Journalists, fact-checkers and researchers use the tool to analyze Facebook content, including tracking popular misinformation and who is responsible for it.
Brandon Silverman, the former CrowdTangle CEO who left Meta last year, told the Senate Judiciary Committee this spring that the tool is now “dead.”
Silverman told The Associated Press that CrowdTangle had been working on an upgrade to make it easier to search the text of internet memes, which can often be used to spread half-truths and evade fact-checkers.
“There is no shortage of ways to organize this data so that it is useful to the fact-checking community, newsrooms, and many different parts of civil society more broadly,” Silverman said.
Not everyone at Meta agreed with that transparent approach, Silverman said. The company hasn’t rolled out any new updates or features for CrowdTangle in more than a year, and the tool has experienced hours-long outages in recent months.
Meta also stopped efforts to investigate how misinformation spreads through political advertising.
The company indefinitely revoked Facebook access for two NYU researchers, saying they had collected unauthorized data from the platform. The move came hours after NYU professor Laura Edelson said she told the company she was investigating the spread of disinformation on the platform around the Jan. 6, 2021, attack on the U.S. Capitol, now the subject of a House investigation.
“When we looked closely, we found that their system could be dangerous for many of their users,” Edelson said.
Privately, former and current Meta employees say that the exposure of such dangers during U.S. elections has created a public and political backlash for the company.
Republicans routinely accuse Facebook of unfairly censoring conservatives, some of whom have been kicked off the platform for breaking company rules. Meanwhile, Democrats regularly complain that the tech company hasn’t done enough to curb disinformation.
“It’s so politically fraught that they’d rather avoid it than dive in headfirst,” said Harbath, the former Facebook policy director. “They just see it as a whole bunch of headaches.”
Meanwhile, the prospect of U.S. regulation no longer hangs over the company, as lawmakers have failed to reach any consensus on what kind of oversight the multibillion-dollar company should face.
Freed from that threat, Meta’s leaders have devoted the company’s time, money and resources to a new project in recent months.
Zuckerberg dove into a massive rebranding and restructuring of Facebook last October, when he changed the company’s name to Meta Platforms Inc. He plans to spend years and billions of dollars transforming his social media platforms into an emerging virtual reality construct called the “metaverse” — sort of like the internet brought to life, rendered in 3D.
In one of Zuckerberg’s October posts, which came after a former Facebook employee leaked internal documents showing how the platform amplifies hate and misinformation, he defended the company. He also reminded his followers that he had pushed Congress to modernize regulations for elections in the digital age.
“I know it’s frustrating to see the good work we do get mischaracterized, especially for those of you who are making important contributions across safety, integrity, research and product,” he wrote on Oct. 5. “But I believe that over the long term if we keep trying to do what’s right and delivering experiences that improve people’s lives, it will be better for our community and our business.”
It was the last time he discussed the Menlo Park, California-based company’s election work in a public Facebook post.
Associated Press technology writer Barbara Ortutay contributed to this report.