Social media company Meta Platforms has scrapped its US fact-checking program and reduced curbs on discussions around contentious topics such as immigration and gender identity.
The tech giant is bowing to criticism from conservatives as President-elect Donald Trump prepares to take office for a second time.
The move is Meta’s biggest overhaul of its approach to managing political content on its services in recent memory and comes as CEO Mark Zuckerberg has been signalling a desire to mend fences with the incoming administration.
The changes will affect Facebook, Instagram and Threads, three of the world’s biggest social media platforms with more than three billion users globally.
Last week, Meta promoted Republican policy executive Joel Kaplan to global affairs head and on Monday announced it had elected Dana White, CEO of Ultimate Fighting Championship and a close friend of Trump, to its board.
“We’ve reached a point where it’s just too many mistakes and too much censorship. It’s time to get back to our roots around free expression,” Zuckerberg said in a video.
He acknowledged the role of the recent US elections in his thinking, saying they “feel like a cultural tipping point, towards once again prioritising speech.”
When asked about the changes at a press conference, Trump welcomed them. “They have come a long way – Meta. The man (Zuckerberg) was very impressive,” he said.
Asked if he thought Zuckerberg was responding to his threats, which have included a pledge to imprison the CEO, Trump said “probably”.
In place of a formal fact-checking program to address dubious claims posted on Meta’s platforms, Zuckerberg instead plans to implement a system of “community notes” similar to that used on Elon Musk-owned social media platform X.
Meta would also stop proactively scanning for hate speech and other types of rule-breaking, reviewing such posts only in response to user reports, Zuckerberg said.
It will focus its automated systems on removing “high-severity violations” like terrorism, child exploitation, scams and drugs.
The demise of the fact-checking program, started in 2016, caught partner organisations by surprise.
“We’ve learned the news as everyone has today. It’s a hard hit for the fact-checking community and journalism. We’re assessing the situation,” AFP said in a statement provided to Reuters.
The head of the International Fact-Checking Network, Angie Drobnic Holan, challenged Zuckerberg’s characterisation of its members as biased or censorious.
“Fact-checking journalism has never censored or removed posts; it’s added information and context to controversial claims, and it’s debunked hoax content and conspiracies,” she said in a statement.
Kristin Roberts, Gannett Media’s chief content officer, said “truth and facts serve everyone – not the right or the left – and that’s what we will continue to deliver.”
Meta’s independent Oversight Board welcomed the move.
Zuckerberg in recent months has expressed regret over certain content moderation actions on topics including COVID-19. Meta also donated $US1 million to Trump’s inaugural fund, in a departure from its past practice.
“This is a major step back for content moderation at a time when disinformation and harmful content are evolving faster than ever,” said Ross Burley, co-founder of the nonprofit Centre for Information Resilience.
“This move seems more about political appeasement than smart policy.”
For now, Meta is planning the changes only for the US market, with no immediate plans to end its fact-checking program in places such as the European Union, which takes a more active approach to regulating tech companies, a spokesperson said.
Musk’s X is already under European Commission investigation over issues including the “Community Notes” system.
The commission began its probe in December 2023, several months after X launched the feature.
Meta said it would start phasing in Community Notes in the US over the next couple of months and improve the model over the year.
“It’s going to mean a free-for-all on misinformation, disinformation, abuse and trolling,” Greens Senator Sarah Hanson-Young told ABC radio on Wednesday.
“This is a very, very dangerous move at a time when members of the community, parents, young people – women in particular – are increasingly concerned (about) the unsafe environment on these big platforms.”
A recent federal inquiry, which Senator Hanson-Young was part of, highlighted that more Australians were concerned about misinformation and disinformation than the global average.
An estimated 78 per cent of Australians are on at least one social media platform, with even higher proportions in younger populations.
Nearly half of all young Australian adults also turn to social media as their main source of news, according to a 2024 report by the federal media authority.
“Those spaces need to be safe and platforms should have a responsibility to do that,” Senator Hanson-Young said.
The fact-checking program typically involves journalists at internationally accredited agencies investigating and reviewing claims on social media through rigorous questioning, consideration of evidence and verification using multiple sources.
Posts deemed to be “False” or “Altered” have a fact-check article appended to them and may receive reduced distribution across Meta’s platforms Facebook, Instagram and Threads.
Australian fact-checking agency AAP FactCheck played a critical role in responding to disinformation with factual, objective journalism and through media literacy education, Australian Associated Press chief executive Lisa Davies said.
“Independent fact-checkers are a vital safeguard against the spread of harmful misinformation and disinformation that threatens to undermine free democratic debate in Australia and aims to manipulate public opinion,” she said.
“AAP FactCheck’s contract with Meta in Australia, New Zealand and the Pacific is not impacted by its US decision and our fact-checking work continues in 2025.”