The latest edition of Facebook’s transparency report was released on Thursday.
NEW DELHI: Facebook restricted access to 878 pieces of content in India on the directions of the Union Ministry of Electronics and Information Technology between July and December 2020, the social media giant revealed in the latest edition of its transparency report. This included “content against security of the state and public order”, the report said.
“Of these, 10 were restricted temporarily. We also restricted access to 54 items in compliance with court orders,” Facebook further said in the report released on Thursday.
Once a piece of content is restricted, users in that country can no longer see it.
The restrictions were enforced under Section 69A of the Information Technology (IT) Act, which allows the state to take down content without providing any reason. Requests for restrictions rose 28.9% in the second half of 2020 compared with the first half, when the government made 681 such requests.
Globally, government requests for user data increased by 10% from 173,592 to 191,013 in the second half of 2020. India made 40,300 requests for user data between July and December, second only to the United States, which made 61,262 such requests. In the first half of 2020, India had made 35,560 requests. The social media company complied with 52% of the requests from India; in the case of the US, it complied with 89%.
Facebook also revealed the reasons behind its Oversight Board reinstating a piece of content after it was “erroneously” removed on the first review. The content in question is a 17-minute video posted by a channel called ‘Global Punjab TV’.
“On March 2, 2021, the Oversight Board selected a case appealed by someone on Facebook regarding a post with a video from Global Punjab TV and accompanying text, claiming the Rashtriya Swayamsevak Sangh (RSS) and members of the Indian government are threatening Sikhs with violence,” the statement read. It added that Facebook had initially taken down the content for violating its policy on dangerous individuals and organisations. “However, upon further review, we determined we removed this content in error and reinstated it,” the company said.
The Oversight Board, intended to be an independent body of experts, has the final word on content moderation at Facebook. Its rulings cannot be overruled even by CEO Mark Zuckerberg.