Submission on the Draft Amendment to Intermediary Guidelines Rules 2018

The following is our submission to the Ministry of Electronics and Information Technology on the Draft Amendment to the Information Technology (Intermediary Guidelines) Rules, 2018, with special reference to gender-based cyberviolence against women.

Even as the digital sphere has emerged as a site for new forms of community building, online sexism, misogyny and gender-based cyberviolence are incontrovertibly acknowledged to diminish the ability of women users and gender minorities to have meaningful and equal access to digital spaces. Research suggests that one in ten women has already experienced some form of cyberviolence since the age of fifteen. The threat of violence contributes to the perpetuation of the gender gap in connectivity. In South Asia, for instance, women are de facto minorities in the digital sphere, and are 58% less likely than men to use the mobile internet, due to safety and security concerns.

With harms ranging from cyberstalking and the non-consensual sharing of intimate images (NCII) to gendertrolling and doxxing, the internet has proven to be as treacherous as offline spaces for women and gender minorities to navigate in their quest for fundamental freedoms. The latest NCRB figures on cybercrimes (2017) show that 245 cases were registered for violation of (bodily) privacy (Section 66E of the IT Act), and 542 cases of cyberstalking of women and children (Section 354D of the IPC). Undoubtedly, this is but the tip of the iceberg.

Parallel to the notification of the draft Intermediary Guidelines, two cases have seen developments with respect to intermediary responsibility for content governance:

  • In the suo motu petition In Re: Prajwala (SMW (Crl) No. 3/2015), which was concerned with the proliferation of rape videos online, the Supreme Court directed the state to expeditiously frame guidelines for eliminating “child pornography, rape and gang rape imageries, videos and sites in content hosting platforms and other applications.”
  • The Supreme Court recently transferred to itself a host of petitions relating to accountability and traceability on social media messaging platforms, for a hearing in January 2020. The issue at stake is whether Facebook can be compelled to identify the originator of a message, for the purpose of aiding investigation by law enforcement agencies. This gains importance not just in cases relating to terrorism and national security, but also in cases involving the circulation of morphed images, NCII, doxxing, etc., where identifying the perpetrator of such violations of women’s privacy and dignity is essential.

IT for Change’s empirical research with 881 young women aged 19-23 years across Kerala, Karnataka and Tamil Nadu in 2018-19 revealed that incidents of identity-based violence and sexual harassment in online spaces have become naturalized and normalized. Researchers and practitioners working with women victims/survivors observe that gender-based violence in the digitally mediated context often lacks corresponding provisions in the law. This is because existing taxonomies in the law often fail to capture emerging, and hitherto unseen, forms of violations. For example, the traditional categories of ‘heckling’ or ‘street sexual harassment’ cannot adequately capture ‘gendertrolling’. In addition to legal reform to account for new forms of cyberviolence, we also require regulation that addresses the responsibility of internet intermediaries in combating digitally mediated violence and harms. The proposed Intermediary Guidelines Amendment Rules provide an opportunity to create redressal mechanisms to combat not only misinformation and economic offences, but also gender-based cyberviolence and sexist hate speech.

The Intermediary Guidelines Rules, 2011 were conceived as a fault-based liability mechanism operating within the safe harbour exemption from liability envisioned by Section 79 of the Information Technology Act, 2000. Safe harbour provisions around the world have taken inspiration from Section 230 of the US Communications Decency Act. With mass access to the internet and increasing seamlessness between online and offline experiences, the exact role of the intermediary in curating and moderating content is increasingly in the spotlight. The intention behind offering “safe harbour” to intermediaries, it must be noted, is not to provide blanket immunity for the content they carry, but to encourage them to take proactive steps towards building standards for content governance.

In India, the decade since the Information Technology (Amendment) Act, 2008 has demonstrated that as the internet is increasingly socialized, the role of platforms becomes ever more embedded in everyday life, and more complex. This calls for a changed scope and parameters for self-regulation. What we are seeing instead is an erosion of the democratic fibre of the public sphere, notably in the politicization and exclusionary nature of intermediary platforms, reflecting a crisis of self-regulation. That the law must step up to offer a) standards for intermediary self-regulation, and b) avenues for redress, is beyond doubt.

Key Recommendations:
1. The Intermediary Guidelines Rules, 2018 must respect the fundamental rights guaranteed by the Indian Constitution. Part III of the Constitution defines social and political liberties in the form of fundamental rights, calling upon the state in particular to establish equality before the law and non-discrimination on the grounds of protected identities, including gender, as well as declaring the freedoms guaranteed to citizens and the reasonable restrictions that may be imposed on them. However, the internet is proving to be treacherous territory for women: a pathway to their self-actualization, yet paved with the risk of violence and abuse. The inextricable intertwining of the internet with social life, and its disruption of social norms and behaviour, call for a re-articulation of freedoms and their limits, particularly the balance between freedom of expression and freedom from violence online. In this context, a new civil rights framework for the internet, akin to Brazil’s Marco Civil da Internet, may be an urgent necessity.

2. The overarching notice-and-takedown regime recommended by the Intermediary Guidelines Rules, 2018 lacks a nuanced view of how liability differs according to the type of content at issue (vertical differentiation) and the kind of function performed by the intermediary (horizontal differentiation). As the Supreme Court of Argentina observed in 2014, it is important to distinguish between merely infringing content and manifestly unlawful content when determining intermediary liability; for the latter, greater liability must be placed on internet intermediaries. For manifestly unlawful content such as rape videos and child pornography, intermediaries must be bound by a notice-and-takedown regime under which they have to take down the offending content on being notified by any user, even when that user is not an affected party. For all other forms of infringing content (reported cases of non-consensual circulation of intimate images, copyright violation, defamation, etc.), a notice-and-notice regime should be adopted to guard against over-censorship. The Brazilian experience with the Marco Civil da Internet, and Canadian copyright law, show that compared to a notice-and-takedown regime, a notice-and-notice regime has the following advantages: 1) it provides the intermediary with more information to add context to a request for removal of content; 2) it offers the author the opportunity to remove the content themselves; and 3) it triggers fault-based liability on the intermediary only when a court order for restricting access has been issued.

3. The state must define public standards for algorithmic content management to be followed by intermediaries. Though algorithms could be used to flag potentially violating content, the final decision on content takedown should be human-supervised, as the illustrative sketch below suggests.
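To make recommendation 3 concrete, the following is a minimal, hypothetical sketch (in Python) of the human-in-the-loop principle: an automated classifier may flag potentially violating content, but only a named human reviewer can authorize a takedown, and every decision is logged for audit against public standards. All names and the threshold value are illustrative assumptions, not any platform’s actual system.

```python
# Hypothetical sketch only: names, threshold, and structure are illustrative
# assumptions, not any platform's actual moderation API.
from dataclasses import dataclass, field
from typing import List

FLAG_THRESHOLD = 0.8  # assumed tolerance; a public standard would specify this


@dataclass
class ContentItem:
    content_id: str
    text: str
    flag_score: float = 0.0  # score produced by an automated classifier
    taken_down: bool = False


@dataclass
class ReviewQueue:
    pending: List[ContentItem] = field(default_factory=list)

    def auto_flag(self, item: ContentItem, score: float) -> None:
        """Algorithmic step: flag potentially violating content for review.
        This step can only queue the item; it cannot remove it."""
        item.flag_score = score
        if score >= FLAG_THRESHOLD:
            self.pending.append(item)

    def human_decision(self, content_id: str, reviewer: str, remove: bool) -> None:
        """Human-supervised step: the only path through which a takedown
        can be effected, logged for auditability."""
        for item in self.pending:
            if item.content_id == content_id:
                item.taken_down = remove
                print(f"[audit] reviewer={reviewer} remove={remove} "
                      f"id={content_id} score={item.flag_score:.2f}")
                self.pending.remove(item)
                return


# Usage: the classifier flags an item; a human reviewer makes the final call.
queue = ReviewQueue()
post = ContentItem("post-42", "example text")
queue.auto_flag(post, score=0.91)  # crosses the threshold, enters review queue
queue.human_decision("post-42", reviewer="moderator-1", remove=True)
```

The design point of the sketch is that the algorithmic step can only queue content for review; removal runs exclusively through the human decision, leaving an auditable record that public standards could mandate.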
