Chris Cox, Chief Product Officer | Meta

Meta revamps content policies; ends fact-checking program in US


Meta is making significant changes to its content management systems, aiming to reduce errors and enhance free expression on its platforms. The company acknowledges that the complexity of its current systems has led to excessive moderation, often hindering legitimate speech.

In a 2019 speech at Georgetown University, Mark Zuckerberg highlighted the importance of free expression as a catalyst for societal progress. He stated: “Some people believe giving more people a voice is driving division rather than bringing us together. More people across the spectrum believe that achieving the political outcomes they think matter is more important than every person having a voice. I think that’s dangerous.”

Meta plans to end its third-party fact-checking program in the United States and transition to a Community Notes system. This new approach draws inspiration from X's model, where community members collaboratively decide when posts require additional context. Meta aims for this system to provide unbiased information about online content.

Community Notes will be written and rated by users with diverse perspectives, helping ensure balanced evaluations. Meta intends to phase in the program in the U.S., replacing intrusive fact-checking labels with less obtrusive notifications indicating that additional information is available.

The company also plans to loosen restrictions on topics such as immigration and gender identity, which are frequently debated politically. Automated systems will continue focusing on severe violations like terrorism and fraud, while less severe issues will rely on user reports before action is taken.

Meta recognizes past mistakes in enforcement actions and aims for greater transparency by regularly reporting these errors. Changes include relocating trust and safety teams from California to Texas and other U.S. locations.

Efforts are also underway to streamline the appeals process for enforcement decisions, including testing facial recognition technology and using large language models (LLMs) to provide a second opinion before action is taken.

Additionally, Meta will personalize civic content visibility based on user preferences rather than broadly reducing it across platforms like Facebook, Instagram, and Threads. Users can expect recommendations based on explicit signals such as likes or implicit signals like post views.

These adjustments reflect Meta's commitment to free expression while addressing policy impacts on user engagement as outlined by Zuckerberg during his Georgetown address.
