
Facebook 'actioned' about 31.7M content pieces in August, shows compliance report

Facebook's photo sharing platform, Instagram, took action against about 2.2 million pieces across nine categories during the same period proactively.


Saturday October 02, 2021, 4 min read

Facebook "actioned" about 31.7 million content pieces across 10 violation categories proactively during August in the country, the social media giant said in its compliance report on Friday.


Instagram, Facebook's photo-sharing platform, proactively took action against about 2.2 million pieces across nine categories during the same period.


Facebook had "actioned" over 33.3 million content pieces across 10 violation categories proactively during June 16-July 31 in the country. Instagram took action against about 2.8 million pieces across nine categories during the same period proactively.


On Friday, the company said Facebook had received 904 user reports through its Indian grievance mechanism between August 1 and 31.

"Of these incoming reports, Facebook provided tools for users to resolve their issues in 754 cases. These include pre-established channels to report content for specific violations, self-remediation flows where they can download their data, avenues to address account hacked issues etc," it added.

Between August 1 and 31, Instagram received 106 reports through the Indian grievance mechanism.


"Over the years, we have consistently invested in technology, people and processes to further our agenda of keeping our users safe and secure online and enable them to express themselves freely on our platform," Facebook said.


"We use a combination of Artificial Intelligence, reports from our community and review by our teams to identify and review content against our policies," a Facebook spokesperson said.

In accordance with the IT Rules, the company has published its third monthly compliance report, covering the 31-day period from August 1 to August 31, the spokesperson added.

"This report will contain details of the content that we have removed proactively using our automated tools and details of user complaints received and action taken, the spokesperson said.


In its report, Facebook said it had actioned about 31.7 million pieces of content across 10 categories during August 2021.



This includes content related to spam (25.9 million), violent and graphic content (2.6 million), adult nudity and sexual activity (2 million), and hate speech (242,000).


Other categories under which content was actioned include bullying and harassment (90,400), suicide and self-injury (677,300), dangerous organisations and individuals: terrorist propaganda (274,200), and dangerous organisations and individuals: organised hate (31,600).

Proactive actioning

"Actioned" content refers to the number of pieces of content (such as posts, photos, videos or comments) where action has been taken for violation of standards. Taking action could include removing a piece of content from Facebook or Instagram or covering photos or videos that may be disturbing to some audiences with a warning.

The proactive rate, which indicates the percentage of all actioned content or accounts that Facebook found and flagged using technology before users reported them, ranged between 80.6 per cent and 100 per cent in most of these cases.

The proactive rate for removal of content related to bullying and harassment was 50.9 per cent as this content is contextual and highly personal by nature. In many instances, people need to report this behaviour to Facebook before it can identify or remove such content.
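To illustrate how such a metric works, the short sketch below computes a proactive rate from example counts; the numbers in it are hypothetical placeholders for illustration only and are not figures from Facebook's report.

def proactive_rate(proactively_flagged, total_actioned):
    # Percentage of actioned content that automated systems flagged
    # before any user reported it; 0 if nothing was actioned.
    if total_actioned == 0:
        return 0.0
    return 100.0 * proactively_flagged / total_actioned

# Hypothetical example: 905 of 1,000 actioned posts in a category were
# flagged by technology first, giving a proactive rate of 90.5 per cent.
print(round(proactive_rate(905, 1000), 1))  # 90.5

A category such as bullying and harassment, where most items are surfaced by user reports rather than automated flagging, would therefore show a much lower proactive rate, as noted above.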


Under the new IT rules, large digital platforms (with over 5 million users) will have to publish periodic compliance reports every month, mentioning the details of complaints received and action taken thereon. The report is to also include the number of specific communication links or parts of information that the intermediary has removed or disabled access to in pursuance of any proactive monitoring conducted by using automated tools.


For Instagram, about 2.2 million pieces of content were actioned across nine categories during August 2021. This includes content related to suicide and self-injury (577,000), violent and graphic content (885,700), adult nudity and sexual activity (462,400), and bullying and harassment (270,300).


Other categories under which content was actioned include hate speech (37,200), dangerous organisations and individuals: terrorist propaganda (6,300), and dangerous organisations and individuals: organised hate (2,300).


Edited by Teja Lele