Meta CEO Mark Zuckerberg claims he will “restore free speech” to the platform

Meta, the tech giant behind Facebook and Instagram, has announced significant changes to its content moderation approach, aiming to “restore free speech.”

In a January 7 statement, the company revealed plans to end its third-party fact-checking program in favor of a Community Notes system, while criticizing the European Union for its restrictive policies.

Mark Zuckerberg, Meta’s CEO, praised the U.S. for having “the strongest constitutional protections for free expression in the world” and criticized the EU for what he called “institutionalized censorship.” He contrasted this with authoritarian regimes, citing Latin America’s secret court orders and China’s outright censorship of Meta’s apps.

Meta admitted its content moderation systems had become overly complex, partly due to political and societal pressure. This complexity led to frequent mistakes, frustrating users and stifling the free expression the company claims to champion. “Too much harmless content gets censored, too many people find themselves wrongly locked up in ‘Facebook jail,’ and we are often too slow to respond when they do,” Meta’s statement noted.

Zuckerberg expressed regret over Meta’s compliance with what he described as pressure from President Joe Biden’s administration to suppress certain content. Reflecting on the 2020 U.S. election, he admitted his company demoted reports about Hunter Biden at the FBI’s urging, labeling it “wrong” in hindsight.

With the incoming Donald Trump administration, Zuckerberg sees an opportunity to “restore free expression.” Meta plans to lift restrictions on mainstream discourse while concentrating enforcement on serious violations like terrorism, child exploitation, and fraud.

Joel Kaplan, Meta’s chief global affairs officer, acknowledged political bias in third-party fact-checking. “It has become clear there is too much political bias in what they choose to fact-check,” Kaplan told Fox News.

Meta’s updated policies will include:

  • Ending automatic content demotions.
  • Requiring user reports for potential infractions.
  • Establishing a higher threshold for content removal.
  • Introducing a better appeals process.

The company will also expand its transparency reporting to regularly disclose data on moderation errors.

The EU has long backed stricter oversight through its content moderation rules and funding for fact-checking initiatives. Meta’s shift follows criticism of rival platform X (formerly Twitter) for abandoning similar policies. Brussels has threatened to fine X under its Digital Services Act (DSA), which mandates compliance with the EU’s strengthened Code of Practice on Disinformation.

Meta, however, argues that the EU’s rules stifle innovation and are too restrictive. “It’s not right that things can be said on TV or the floor of Congress but not on our platforms,” the company stated.

Meta plans to roll out these changes in the coming weeks, signaling a pivot toward less restrictive policies and greater user control. As debates over content moderation and free speech continue, Meta’s bold shift may reshape the landscape for social media platforms worldwide.
