SAN FRANCISCO >> Meta said on Tuesday that it was ending its long-standing fact-checking program, a policy instituted to curtail the spread of misinformation across its social media apps, in a stark sign of how the company was repositioning itself for the Trump presidency and throwing its weight behind unfettered speech online.

Meta, which owns Facebook, Instagram and WhatsApp, said it would now allow more speech, rely on its users to correct inaccurate and false posts, and take a more personalized approach to political content. It described the changes with the language of regret, saying it had strayed too far from its values over the previous decade.

“It’s time to get back to our roots around free expression,” Mark Zuckerberg, Meta’s CEO, said in a video announcing the changes. The company’s fact-checking system, he added, had “reached a point where it’s just too many mistakes and too much censorship.”

Zuckerberg conceded there would be more “bad stuff” on the platforms as a result of the decision. “The reality is that this is a trade-off,” he said. “It means that we’re going to catch less bad stuff, but we’ll also reduce the number of innocent people’s posts and accounts that we accidentally take down.”

Moving toward Trump

Since Donald Trump’s victory in November, few big companies have worked as overtly as Meta to curry favor with the president-elect, who, during his first administration, accused social media platforms of censoring conservative voices. In a series of announcements during this presidential transition period, Meta has sharply shifted its strategy in response to what Zuckerberg called a “cultural tipping point” marked by the election.

Zuckerberg dined with Trump at Mar-a-Lago in November, and Meta later donated $1 million to support Trump’s inauguration. Last week, Zuckerberg elevated Joel Kaplan, the Meta executive closest to the Republican Party, to the company’s most senior policy role. And on Monday, Zuckerberg said Dana White, the head of the UFC and an ally of Trump’s, would join Meta’s board.

Musk’s influence

Meta executives recently gave a heads-up to Trump officials about the change in policy, said a person with knowledge of the conversations who spoke on condition of anonymity. The fact-checking announcement coincided with an appearance by Kaplan on “Fox & Friends,” a favorite show of Trump’s, where Kaplan said there was “too much political bias” in Meta’s fact-checking program.

The influence of Elon Musk, the world’s richest man, who leads the social platform X, SpaceX and Tesla, also loomed large over Meta’s shift. Since buying the platform in 2022, Musk has thrown out its restrictions on online speech and has turned to a program called Community Notes, which depends on X’s users to police false and misleading content. Musk, who has become a key adviser to Trump, also moved X to Texas from California, where it had been based, and has criticized California’s policies.

On Tuesday, Meta said it would also turn to a Community Notes program after seeing “this approach work on X.” In addition, Zuckerberg said his company would run its U.S. trust and safety and content moderation operations from Texas instead of California “to do this work in places where there’s less concern about the bias of our teams.”

Turnabout for Meta

Misinformation researchers said Meta’s decision to end fact-checking was deeply concerning. Nicole Gill, a founder and the executive director of the digital watchdog organization Accountable Tech, said Zuckerberg was “reopening the floodgates to the exact same surge of hate, disinformation and conspiracy theories that caused Jan. 6 — and that continue to spur real-world violence.”

In 2021, after the Jan. 6 riot at the Capitol, Facebook shut down Trump’s account for inciting violence, before later reinstating him. Multiple studies have since shown that interventions like Facebook’s fact-checks were effective at reducing belief in falsehoods and reducing how often such content was shared.

But Meta’s move elated conservative allies of Trump, many of whom have disliked Meta’s practice of adding disclaimers or warnings to questionable or false posts. Sen. Rand Paul, R-Ky., said in a post on X that Meta “finally admits to censoring speech” and called the change “a huge win for free speech.”

Other Republicans were skeptical. Sen. Marsha Blackburn, R-Tenn., said in a post on X that Meta’s change was “a ploy to avoid being regulated.”

Company reaction

Inside Meta, Zuckerberg’s announcements were met with praise and horror. For some employees, Zuckerberg was finally being his “authentic self,” uninhibited by “woke” critics, three current and former employees said.

Others said Zuckerberg was throwing current and former employees under the bus despite their efforts on content moderation. As upset employees posted about the changes on internal message boards, human resources workers quickly removed the posts, the people said, on the grounds that they violated a company policy on community engagement. Meta put the policy in place in 2022 to keep contentious social issues out of the workplace.

Meta’s decision to move moderation teams from California to Texas to “eliminate bias” attracted particular internal attention, the people said.