Facebook adds Healthy Indian Project as fact-checking partner in India
Social media giant Facebook on Wednesday said it is expanding its third-party fact-checking programme in the country to include its first health-specialist partner, The Healthy Indian Project (THIP).
The onboarding of THIP as a fact-checking partner is part of Facebook's effort to combat COVID-19 and other health-related misinformation on the platform, a statement said.
THIP Media works with verified medical professionals to fact check misleading news and claims about health, medicines, diet, and treatment in English, Hindi, Bengali, Punjabi, and Gujarati, it added.
During the pandemic, Facebook has removed more than 18 million pieces of harmful misinformation across Facebook and Instagram and, with the help of third-party fact-checkers, labelled over 167 million COVID-19 posts as false.
The partnership with THIP will enhance its capabilities to understand and curb health-related misinformation on the platform, the statement said.
Globally, Facebook works with 80 fact-checking partners that monitor content in more than 60 languages. All of Facebook's fact-checking partners are certified by the independent, non-partisan International Fact-Checking Network.
With THIP, Facebook now has 10 fact-checking partners in India, making it one of the company's largest fact-checking networks after the US.
The other partners are India Today Group, Vishvas News (Dainik Jagran), Factly, Newsmobile, Fact Crescendo, BOOM Live, AFP, NewsChecker, and Quint. Together, these partners fact-check content in English and 11 Indian languages: Hindi, Bengali, Telugu, Malayalam, Tamil, Marathi, Punjabi, Urdu, Gujarati, Assamese, and Kannada.
Facebook said third-party fact-checkers review stories, verify their claims, and rate their accuracy. When a fact-checker rates a story as false, Facebook ranks it lower in the News Feed, significantly reducing its distribution and the number of people who see it.
Pages and domains that repeatedly share false news see their distribution reduced and their ability to monetise and advertise temporarily removed.
People who try to share a fact-checked post are shown a pop-up notice, so they can decide for themselves what to read, trust, and share, Facebook said.
People who shared a story that is later debunked are notified, so they know there is additional reporting on that piece of content, it added.