Comment to Meta’s Oversight Board in the Case of ‘India Sexual Harassment Video (2022-012-IG-MR)’

The Oversight Board is an external, independent body created by Meta to which people can appeal if they disagree with Meta’s content enforcement decisions on Facebook or Instagram. From the appeals it receives, the Board’s Case Selection Committee selects the cases the Board will hear, based on criteria it sets (e.g., importance and precedential impact) that may change over time. The Board’s content decisions are binding, and Meta will restore or remove the content accordingly. Committed to bringing diverse perspectives into the case review process, the Board invites public comments on the cases it takes up for hearing.

On 15 September 2022, the Oversight Board announced a case from India for consideration and invited public comments on it. The case concerned a video depicting a woman being sexually assaulted by a group of men, which went viral. The video was posted by an Instagram account describing itself as a platform for Dalit perspectives, and the text accompanying it stated that a "tribal woman" had been sexually assaulted. Meta’s human reviewers initially removed the video for violating its Adult Sexual Exploitation policy. However, a specialized team at Meta later restored it under the “newsworthiness allowance”. The details of the case can be found here.

In our comments to the Board, we recommend that, in assessing whether content qualifies as newsworthy in the Indian context, due consideration be given to the long-standing injustice faced by Dalits and the tribal community, and to the power imbalance between dominant social groups and the marginalized majority. But this consideration must be balanced against the rights of women who face double disadvantage and discrimination on account of both their gender and their caste/tribal identity. We argue against granting the newsworthiness allowance to the content in question, as it instrumentalizes the traumatic experience of the woman (especially if the video was shared non-consensually), thereby denying her the right to dignity and privacy. We also recommend that the Board inquire into the basis on which Meta’s team of human reviewers judged the content in question to be newsworthy, and into their knowledge of the deeply discriminatory structures of caste and patriarchy in India.

Read our complete comment here.

Oversight Board Decision

On 15 December 2022, the Oversight Board announced its decision in the case, upholding Meta’s decision to restore the post. The majority of the Board found that restoring the content to the platform, behind a warning screen, is consistent with Meta’s values and human rights responsibilities. While it agreed that the content violates Meta’s Adult Sexual Exploitation policy, it found the content suitable for the newsworthiness allowance. The Board recognized that content depicting non-consensual sexual touching can lead to a significant risk of harm, both to individual victims and more widely, for example by emboldening perpetrators and increasing acceptance of violence. However, since the video does not include explicit content or nudity, and the majority of the Board found that the victim is not identifiable, it concluded that the benefits of allowing the video to remain on the platform, behind a warning screen, outweigh the risk of harm. The majority opined that where a victim is not identifiable, their risk of harm is significantly reduced.

A minority of the Board believed that the content should be removed from the platform on the basis that there is some possibility the victim could be identified: viewers of the video with local knowledge of the area or the incident might be able to identify the victim even though their face is not visible. The likelihood of this is especially high because the incident was widely reported by local news outlets. The majority acknowledged the minority’s concerns but did not believe that local awareness of an incident should, by itself, render a victim "identifiable".

The Board also quoted from IT for Change’s comment in its decision:

"The Board also considered broader risks. Social media in India has been criticized for spreading caste-based hate speech (see the report on Caste hate speech by the International Dalit Solidarity Network, March 2021). Online content can reflect and strengthen existing power structures, embolden perpetrators and motivate violence against vulnerable populations. Depictions of violence against women can lead to attitudes that are more accepting of such violence. Public comments highlighted that sexual harassment is an especially cruel form of harassment (see, for example, PC-10808 (SAFEnet), PC-10806 (IT for Change), PC-10802 (Digital Rights Foundation), PC-10805 (Media Matters for Democracy)).

The majority balanced those risks with the fact that it is important for news organizations and activists to be able to rely on social media to raise awareness of violence against marginalized communities (see the public comments PC-10806 (IT for change), PC-10802 (Digital Rights Foundation), PC-10808 (SAFEnet)), especially in a context where media freedom is under threat (see the report by Human Rights Watch). In India, Dalit and Adivasi people, especially women who fall at the intersection of caste and gender (see PC-10806, (IT for Change)), suffer severe discrimination, and crime against them has been on the rise. Civil society organizations report rising levels of ethnic-religious discrimination against non-Hindu and caste minorities which undermines equal protection of the law (see, Human Rights Watch report)."

Read the Board’s full case decision here.
