Amid growing concerns about misinformation spreading online, Facebook recently hosted a virtual briefing session to highlight the aggressive steps the platform has taken to combat it. These include building a global network of over 80 fact-checking partners, promoting accurate information, and removing content that breaks Facebook’s rules.
The session opened by defining ‘misinformation’ as false information that is often shared unintentionally. Such content is shared on an individual basis and is not part of any coordinated attempt to mislead or deceive people. ‘Disinformation’, by contrast, refers to content shared with the deliberate intent to mislead as part of a manipulation campaign or information operation. This activity is coordinated and can involve the use of fake accounts.
Facebook counters misinformation through a three-part strategy – Remove, Reduce and Inform. Part of this strategy is its third-party fact-checking program.
Explaining the platform’s third-party fact-checking program further, Facebook told session participants, “We do not believe any single entity – either a private company or government – should have the power to decide what is true and what is false. When one single actor is the arbiter of truth, there is a power imbalance and potential for overreach. With this in mind, we rely on independent fact-checkers to identify and review potential misinformation, which enables us to take action.”
Facebook partners with over 80 independent third-party fact-checkers globally, working in over 60 languages. In the past year, Facebook and WhatsApp extended support to the fact-checking community, including $2 million in grants for third-party fact-checkers in highly affected regions, to help them increase capacity as they do this essential work.
Facebook also launched a year-long fellowship with 10 fact-checking organizations, including several from this region, to bring on new team members and build capacity within the region. All of Facebook’s fact-checking partners are certified through the independent, non-partisan International Fact-Checking Network.
The speakers also shed some light on the measures Facebook has taken due to COVID-19. “We remove COVID-19 misinformation that could contribute to imminent physical harm including false claims about cures, treatments, the availability of essential services, or the location and severity of the outbreak.
We also remove false claims in relation to the COVID-19 vaccine that have been debunked or are unsupported by evidence such as false claims about the safety, efficacy, ingredients or side effects of COVID-19 vaccines. Between March and October 2020, we removed 12 million pieces of COVID-19 misinformation content.”
Facebook’s advertising policies have long prohibited misleading claims, but the platform has also implemented new advertising policies specific to COVID-19. Facebook is removing a number of ad-targeting options, such as “vaccine controversies,” that might otherwise have been used to spread this sort of misinformation.