
Meta platforms in India were found exposed to human rights risks due to third-party action

Saturday, July 16, 2022, 2 min read

Meta platforms, which include Facebook and WhatsApp, were found exposed to human rights risks, including "restrictions of freedom of expression and information" and "hatred that incites hostility", due to the actions of third parties, the social media giant's first human rights report has said.

The report is based on an independent human rights impact assessment (HRIA) that Meta commissioned in 2019 to examine potential human rights risks related to its platforms in India and other countries. The assessment was undertaken by Foley Hoag LLP.

"The HRIA noted the potential for Meta's platforms to be connected to salient human rights risks caused by third parties, including restrictions of freedom of expression and information, third party advocacy of hatred that incites hostility, discrimination, or violence, rights to non-discrimination, and violations of rights to privacy and security of person," the report said.

The HRIA involved interviews with 40 civil society stakeholders, academics, and journalists.

The report found that Meta faced criticism and potential reputational risk over hateful or discriminatory speech by end users on its platforms.

The assessment also noted a gap between the company's and external stakeholders' understandings of its content policies.

"It noted persistent challenges relating to user education, difficulties of reporting and reviewing content, and challenges of enforcing content policies across different languages. In addition, the assessors noted that civil society stakeholders raised several allegations of bias in content moderation. The assessors did not assess or reach conclusions about whether such bias existed," the report said.

According to the report, the project was launched in March 2020, and it experienced limitations caused by COVID-19, with a research and content end date of June 30, 2021.

The assessment was conducted independently of Meta, the report said.

The HRIA developed recommendations for Meta around implementation and oversight, content moderation, and product interventions, among other areas, which Meta is studying and will consider as a baseline to identify and guide related actions, the report said.


Edited by Suman Singh