Recently, in an effort to counter false articles that could manipulate the electoral process, Facebook introduced a new tool that warns people when they come across content deemed “illegal, false, or partly false” by its independent fact-checking partners.
According to Rebecca Stimson, Facebook’s Head of UK Public Policy: “Helping protect elections is one of our top priorities, and over the last two years we’ve made some significant changes – these broadly fall into three camps:
- We’ve introduced greater transparency so that people know what they are seeing online and can scrutinize it more effectively;
- We have built stronger defenses to prevent things like foreign interference;
- And we have invested in both people and technology to ensure these new policies are effective.”
Users will now see a grey screen overlaying such a post, which reads ‘false information’ and displays the fact-checkers’ articles debunking the claims.
According to Marketingweek.com, Facebook has more than 35,000 people working on safety and security across 40 teams, 500 of whom are focused on elections.
Regarding political ads, Facebook says it “[doesn’t] believe a private company (…) should censor politicians. This is why we don’t send content or ads from politicians and political parties to our third-party fact-checking partners.
This doesn’t mean that politicians can say whatever they want on Facebook. They can’t spread misinformation about where, when or how to vote. They can’t incite violence. We won’t allow them to share content that has previously been debunked as part of our third-party fact-checking program. And we of course take down content that violates local laws.
But in general we believe political speech should be heard and we don’t feel it is right for private companies like us to fact-check or judge the veracity of what politicians and political parties say.”