Facebook Fights ‘Misinformation’ through Third Party Fact Checking

  • The company defined ‘misinformation’ as false information that is often shared unintentionally.
06 Feb, 2021

Amid growing concerns about misinformation spreading online, Facebook, the world's largest social media platform, highlighted the aggressive steps it has taken to combat the problem.

These steps range from creating a global network of over 80 fact-checking partners and promoting accurate information to removing content that breaks Facebook's rules, the platform said in a recently held virtual session.

The company defined ‘misinformation’ as false information that is often shared unintentionally. Such content is shared on an individual basis and is not part of any coordinated attempt to mislead or deceive people. ‘Disinformation’, by contrast, refers to content shared with the deliberate intent to mislead as part of a manipulation campaign or information operation. This activity is coordinated and can involve the use of fake accounts.

Facebook said its approach to countering misinformation rests on a three-part strategy - Remove, Reduce and Inform. Its third-party fact-checking program is part of this strategy.

“We do not believe any single entity - either a private company or government - should have the power to decide what is true and what is false. When one single actor is the arbiter of truth, there is a power imbalance and potential for overreach. With this in mind, we rely on independent fact-checkers to identify and review potential misinformation, which enables us to take action,” Facebook said while explaining its third-party fact-checking program.

Facebook partners with over 80 independent third-party fact-checkers globally, working in over 60 languages. In the past year, Facebook extended support to fact-checking communities, including $2 million in grants from Facebook and WhatsApp to third-party fact-checkers in highly affected regions, to help them increase capacity as they do this essential work.

The speakers also shed some light on the measures Facebook has taken due to COVID-19. “We remove COVID-19 misinformation that could contribute to imminent physical harm including false claims about cures, treatments, the availability of essential services, or the location and severity of the outbreak. We also remove false claims in relation to the COVID-19 vaccine that have been debunked or are unsupported by evidence such as false claims about the safety, efficacy, ingredients or side effects of COVID-19 vaccines. Between March and October 2020, we removed 12 million pieces of COVID-19 misinformation content.”

Facebook said it is also removing a number of ad targeting options, such as "vaccine controversies," that might have been used to help spread this sort of misinformation.
