Meta's preparations for the 2024 UK general election

Mark Zuckerberg, Chairman and CEO of Meta Platforms (formerly Facebook, Inc.)


In 2024, countries and regions around the world are heading to the polls to elect their leaders, including the UK, which holds its general election on 4 July. Meta has been preparing for this: last year, it activated a dedicated team to develop a tailored approach to help preserve the integrity of these elections on its platforms.

Drawing on lessons from over 200 elections since 2016 and adhering to the UK's Online Safety Act, Meta aims to focus its teams, technologies, and investments for maximum impact. Since 2016, Meta has invested more than $20 billion into safety and security and quadrupled its global team working in this area to around 40,000 people, including 15,000 content reviewers across Facebook, Instagram, and Threads.

Meta has introduced transparency tools for ads about social issues, elections, or politics; developed policies to prevent election interference and voter fraud; and built the largest third-party fact-checking program of any social media platform. Recently, it committed to responsibly handling new technologies like generative AI. A new ad campaign on Facebook and Instagram will raise awareness of these tools.

Meta will also activate a UK-specific Elections Operations Centre with experts from various company departments to identify potential threats in real time.

Key areas of focus include:

**Combating misinformation:** Meta removes serious misinformation that could lead to violence or interfere with voting. Independent fact-checking organizations such as Full Fact, Reuters, Logically Facts, and FactCheckNI review content. Debunked content is labeled with warnings and reduced in distribution. Ads containing debunked content or discouraging voting are not allowed.

**Tackling influence operations:** These efforts range from covert campaigns using fake identities (coordinated inauthentic behavior) to overt efforts by state-controlled media entities. Meta labels state-controlled media on its platforms so users know when content is government-influenced. Stronger enforcement measures have been applied against Russian state-controlled media.

**Countering risks related to GenAI technologies:** Content generated by AI is subject to Community Standards and Ad Standards. Fact-checkers can rate AI-generated content as "Altered," leading to labeling and down-ranking in feeds. Advertisers must disclose if they use digitally created or altered images or audio in ads related to social issues or politics.

**Candidate safety:** Measures are in place to protect Members of Parliament (MPs) and candidates from malicious posts and behavior, while preserving room for open discussion of public figures that would not be acceptable for private individuals. Policies cover removing hate speech targeting protected characteristics and language that incites violence, as well as protecting MPs from bullying, harassment, repeated unwanted contact, and sexual harassment, including abusive posts in which public figures are tagged or which they report directly.

Meta works closely with political parties, encouraging candidates to report concerns about the security of their social media accounts, and monitors candidates' pages for abuse and threats. Training sessions on safety measures are being held for candidates ahead of the election period.

For further information about how Meta approaches elections, visit its Preparing for Elections page.
