Today, we want to share our work to protect the integrity of presidential elections taking place in Brazil in October 2022. In recent years, we’ve increased our efforts to combat misinformation by investing in teams, technology and partnerships to ensure the safety of people using Meta’s platforms.
Since 2016, we’ve quadrupled our security and integrity workforce to more than 40,000 people globally. Last year alone, we invested nearly $5 billion in both areas.
We know that local knowledge is essential for this work to be effective, so we also have a large team of specialists based in Brazil who have a deep understanding of the situation. These efforts are intensified as the election approaches, and our work to protect the integrity of our platforms will continue after the vote.
Preventing and Stopping Election Interference
Removing content that violates our policies on voter suppression, such as posts that discourage people from voting, is among our many responses to potential interference in the electoral process. We also take action to prevent hate speech and incitement of violence on our platforms.
Currently, 99.7% of the fake accounts we remove from Facebook are detected and deleted by artificial intelligence before users report them. We also investigate and disrupt networks that use fake accounts in a coordinated way to influence public debate.
Closer to October, we will activate an Elections Operations Center focused on Brazil, an initiative we’ve implemented since 2018, to bring together experts from across the company – including intelligence, data science, engineering, research, operations, public policy and legal teams. They work together to identify potential threats on our platforms in real time, accelerating our response time.
Collaborating With Authorities
In partnership with Brazil’s Superior Electoral Court (TSE), in December 2021 we started adding a label to posts about political elections on Facebook and Instagram, directing people to reliable information on the Electoral Justice website. In the first two months after its launch, the label led to a 10-fold increase in visits to the Electoral Justice portal.
Between the end of April and the beginning of May, we posted reminders on Facebook for users to request or update their voter cards. The content was seen by the majority of adults using Facebook in Brazil and more than three million people clicked to see more information. Closer to the upcoming election, we will again display reminders on Facebook and Instagram about voting day to raise awareness among voters and reduce abstention rates.
For the first time, the TSE will be able to directly report content on Facebook and Instagram that may violate our policies. We will review these reports as soon as they are received.
During the 2020 municipal elections, WhatsApp launched a dedicated channel, outside the judicial process, to receive complaints from the TSE. The focus is on responding quickly to potential cases of bulk messaging, which is forbidden both by local electoral law and by the app’s terms of service.
We also developed a virtual assistant on WhatsApp with the TSE, as we did during Brazil’s 2020 municipal election. The chatbot is accessible through the number +55 61 9637-1078. It allows voters to interact directly with the electoral authority and receive relevant information about the vote.
Meta has hosted training sessions for electoral officials all over Brazil to explain our actions to curb misinformation, share details on how Facebook and Instagram work, and detail our content rules, which we call our Community Standards and Community Guidelines. We also offer workshops to candidates and their campaign teams.
The partnership with the TSE also includes booklets with information for the electoral community and a guide to combating online violence against women in politics, also supported by the Women’s Democracy Network (WDN) – Brazil Chapter.
We remove content on Facebook and Instagram that discourages voting or interferes with voting, such as incorrect information about the election date or candidates’ numbers.
We also work with independent fact-checking organizations to verify the veracity of reported posts that don’t violate our Community Standards. When fact-checkers mark a post as false, we reduce its reach on Facebook and Instagram.
People who still see this content in their feeds will see it covered with a warning label and a link directing them to more information from the fact-checker. In July, we increased the number of partners in our fact-checking initiative in Brazil from four to six: Agência Lupa, AFP, Aos Fatos, Estadão Verifica, Reuters Fact Check and UOL Confere.
Since messages on WhatsApp are end-to-end encrypted, we fight misinformation on WhatsApp through measures to reduce message virality.
Messages forwarded on WhatsApp are identified with a tag. Since 2020, messages with five or more forwards can be resent to just one conversation, which has led to a 70% global reduction in the number of frequently forwarded messages. This year, we implemented a new forwarding limit on WhatsApp: now, any forwarded message can only be forwarded again to one WhatsApp group at a time.
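The forwarding rules described above can be summarized as a simple decision: how many chats a message may be sent to at once, given how often it has already been forwarded and whether the destination is a group. The sketch below is purely illustrative of that logic, not WhatsApp’s actual implementation; the function name, the default five-chat limit and the threshold constant are assumptions for the example.

```python
# Illustrative model of the forwarding limits described above.
# Not WhatsApp's real code; names and the default limit are assumptions.

FREQUENTLY_FORWARDED_THRESHOLD = 5  # forwards before a message counts as "frequently forwarded"
DEFAULT_FORWARD_LIMIT = 5           # assumed standard limit for a fresh message

def max_forward_targets(forward_count: int, forwarding_to_group: bool) -> int:
    """How many chats a message may be forwarded to in one action."""
    if forward_count >= FREQUENTLY_FORWARDED_THRESHOLD:
        # Frequently forwarded messages: one conversation at a time.
        return 1
    if forwarding_to_group:
        # Any forwarded message: one group at a time.
        return 1
    return DEFAULT_FORWARD_LIMIT
```

For example, a message already forwarded five or more times can only be resent to a single conversation, while a lightly shared message keeps the standard limit.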
In 2018, we launched our transparency tools for ads about politics and elections on Facebook and Instagram in Brazil. In 2020, we began requiring advertisers who wish to run ads about elections or politics to complete an authorization process and include “Paid for by” disclaimers on these ads. This year, we’ve expanded that requirement to ads about social issues such as economics, security and education.
All posts with the “Paid for by” disclaimer go to the Ad Library, where they are stored for seven years. The tool is open to anyone and provides detailed information about political ads, including the account that ran the ad, audience demographics and estimated spending range, among other data.
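Beyond the web interface, the Ad Library can also be queried programmatically. The sketch below shows how a query URL for Brazilian political ads might be assembled; the endpoint path, API version and parameter names are assumptions based on Meta’s publicly documented Graph API conventions, and a real access token would be required to actually fetch results.

```python
# Hypothetical sketch of building an Ad Library API query URL.
# Endpoint and parameter names are assumptions, not guaranteed current.
from urllib.parse import urlencode

def build_ad_library_query(search_terms: str,
                           country: str = "BR",
                           token: str = "YOUR_ACCESS_TOKEN") -> str:
    base = "https://graph.facebook.com/v15.0/ads_archive"
    params = {
        "search_terms": search_terms,
        "ad_reached_countries": country,
        "ad_type": "POLITICAL_AND_ISSUE_ADS",
        "fields": "page_name,spend,demographic_distribution",
        "access_token": token,
    }
    return base + "?" + urlencode(params)
```

Calling `build_ad_library_query("eleições")` would produce a URL targeting political and issue ads that reached people in Brazil.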
Protecting the integrity of the Brazilian election in 2022 on our apps is a priority for Meta. We will continue to share updates on how we move forward with this work.