Facebook announces measures to protect the U.S. elections

Facebook has set out measures it will take to ensure the U.S. elections are as fair as possible. The company was criticised for its handling of content on its platform during the last presidential election, and it doesn't want a repeat of that experience.

During the upcoming elections, Facebook will take the following steps to ensure people don’t get tricked by misinformation:

  • No new political ads will be allowed to run in the week before the election.
  • Posts that claim people will be infected with COVID-19 if they go out to vote will be removed. Additionally, posts that discourage voting over COVID-19 will be tagged with a link to an authoritative source about the disease.
  • An information label will be attached to content that seeks to delegitimise the outcome of the election or question the legitimacy of voting methods.
  • Finally, if a candidate or campaign tries to claim victory before the final votes are counted, a label will be added to the post directing people to the official election results from Reuters and the National Election Pool.

Commenting on the measures, Facebook CEO Mark Zuckerberg said:

“Today, we’re announcing additional steps we’re taking at Facebook to encourage voting, connect people with authoritative information, and fight misinformation. These changes reflect what we’ve learned from our elections work over the past four years and the conversations we’ve had with voting rights experts and our civil rights auditors.”

The Facebook CEO also revealed that, in the last three days, Facebook has recorded 24 million clicks through to the voter registration websites it showed to users. He and his wife Priscilla Chan have also personally donated $300 million to non-partisan organisations supporting states and local counties in strengthening voting infrastructure.

In addition to the measures listed above, Facebook announced today that it will limit the spread of misinformation on Messenger by restricting the number of people or groups that users can forward a message to. Under the new limit, a message can be forwarded to only five people or groups at a time.

The cap mirrors the forwarding limit Facebook introduced on WhatsApp early last year to curb the spread of false news. Jay Sullivan, Director of Product Management for Messenger Privacy and Safety, wrote in a blog post: “Limiting forwarding is an effective way to slow the spread of viral misinformation and harmful content that has the potential to cause real world harm.

“We believe controlling the spread of misinformation is critical as the global COVID-19 pandemic continues and we head toward major elections in the US, New Zealand and other countries.”
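The mechanics of such a cap are simple to reason about. The sketch below shows one way a per-forward recipient limit like Messenger's could be enforced; it is a hypothetical illustration under assumed names (`forward_message`, `MAX_FORWARD_TARGETS`), not Facebook's actual implementation.

```python
# Hypothetical sketch of a per-forward recipient cap, modelled on the
# limit described above. Not Messenger's actual code.

MAX_FORWARD_TARGETS = 5  # at most five people or groups per forward


def forward_message(message_id: str, targets: list[str]) -> list[str]:
    """Forward a message to the selected people or groups.

    Rejects the whole action if more than MAX_FORWARD_TARGETS
    recipients are selected at once.
    """
    if len(targets) > MAX_FORWARD_TARGETS:
        raise ValueError(
            f"Messages can be forwarded to at most {MAX_FORWARD_TARGETS} "
            "people or groups at a time."
        )
    # Actual delivery is out of scope for this sketch; return the
    # recipients the forward is allowed to reach.
    return targets


# Forwarding to five chats is allowed; a sixth recipient is rejected.
print(forward_message("msg-1", ["alice", "bob", "carol", "dan", "eve"]))
try:
    forward_message("msg-1", ["a", "b", "c", "d", "e", "f"])
except ValueError as err:
    print(err)
```

Note that the check applies per forwarding action rather than per message lifetime, which matches the behaviour described in the announcement: a message can still travel widely, just more slowly.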

The restriction is the latest step taken by the social networking giant as it combats misinformation on the platform, especially amid the COVID-19 pandemic and as elections worldwide inch closer. In June, it started warning users if they shared articles that were more than three months old. Before that, Facebook began showing messages in the News Feed to people who had previously interacted with false news related to COVID-19. It remains to be seen how much the forwarding limit can help fight the spread of this type of content.