Meta has announced the end of its third-party fact-checking program in favor of a community-based fact-checking system. According to CEO Mark Zuckerberg, the move aims to reduce moderation mistakes and promote free expression.
A New Approach to Content Moderation
Under the previous system, Meta identified posts that potentially contained misinformation, based in part on user reports. Independent fact-checkers could then label these posts, which were demoted in feeds until reviewers verified the content's accuracy. The program applied first to Facebook and was expanded to Instagram in 2019 and Threads in 2024.
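The demote-until-verified flow described above can be pictured as a simple feed-scoring rule. The sketch below is a hypothetical illustration only, not Meta's actual ranking code; the data class, the scoring function, and the demotion factor are all invented for clarity.

```python
from dataclasses import dataclass

# Hypothetical illustration of the flag-and-demote flow described above.
# Names and scoring are invented for clarity, not Meta's implementation.

@dataclass
class Post:
    post_id: str
    base_rank_score: float
    flagged: bool = False   # marked by an independent fact-checker
    verified: bool = False  # reviewed and cleared

DEMOTION_FACTOR = 0.1  # assumed penalty applied while a flag is pending

def feed_score(post: Post) -> float:
    """Demote flagged posts until an independent review clears them."""
    if post.flagged and not post.verified:
        return post.base_rank_score * DEMOTION_FACTOR
    return post.base_rank_score

# Example: a flagged post falls below an unflagged one with a lower base score.
posts = [
    Post("a", base_rank_score=0.9, flagged=True),
    Post("b", base_rank_score=0.5),
]
ranked = sorted(posts, key=feed_score, reverse=True)
print([p.post_id for p in ranked])  # ['b', 'a']
```

In this toy model the flag acts as a multiplicative penalty, so a flagged post sinks in the feed even when its underlying engagement score is higher, and recovers its original rank once verified.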
Impact on Social Media Platforms
Meta's community-based fact-checking system, inspired by Community Notes on X, will be rolled out across Facebook, Instagram, and Threads. According to Reuters, the change affects some of the largest social networks in the world, with more than 3 billion users combined.
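For context on the model Meta is adopting: X's Community Notes publishes a note only when contributors who normally disagree with one another both rate it helpful, an approach often described as bridging-based ranking. The snippet below is a deliberately simplified toy of that idea; the viewpoint clusters, threshold, and function name are assumptions for illustration, and X's production system actually uses a matrix-factorization model over rater behavior rather than explicit clusters.

```python
# Toy sketch of the "bridging" idea behind X-style community notes:
# a note is shown only if raters from at least two different viewpoint
# clusters both find it helpful. Clusters and thresholds are invented here.

from collections import defaultdict

def note_is_shown(ratings, min_helpful_share=0.7):
    """ratings: list of (viewpoint_cluster, is_helpful) tuples."""
    by_cluster = defaultdict(list)
    for cluster, helpful in ratings:
        by_cluster[cluster].append(helpful)
    if len(by_cluster) < 2:
        return False  # require agreement across at least two clusters
    return all(
        sum(votes) / len(votes) >= min_helpful_share
        for votes in by_cluster.values()
    )

# Cross-cluster agreement publishes the note; one-sided support does not.
print(note_is_shown([("left", True), ("left", True), ("right", True)]))  # True
print(note_is_shown([("left", True), ("left", True), ("left", True)]))   # False
```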
Political Implications
Meta's shift in content moderation policy coincides with Donald Trump's election to a second term as US President. Despite previous tensions with Trump, the company has taken steps to improve relations, including appointing Republicans to key positions and donating to his inaugural fund. The policy change may reflect a broader strategic repositioning within the company in light of these political developments.
Meta's new community-driven fact-checking system marks a significant shift in company policy. With this approach, Meta says it will reduce errors, simplify its moderation rules, and restore free expression across its platforms.