Unpacking the Supreme Court’s Emerging Stance on Online Censorship

This position paper looks at the issue of intermediary filtering in the context of two Supreme Court cases. The first is a petition filed by the activist Sabu Mathew George in 2008, asking for a ban on search engine advertisements related to pre-natal sex determination. The second is a suo-moto PIL taken up by the Court in 2015, in response to women's rights activist Sunitha Krishnan's letter on the circulation of rape videos on social media sites. Our paper analyses these cases, alongside the arguments related to freedom of speech and the act of intermediary filtering of content, and argues for a more nuanced intermediary liability framework which acknowledges that not all content can be treated equally. Read the full paper below:

Writing Wrongs or Righting Violations? – Unpacking the Supreme Court’s Emerging Stance on Online Censorship

By: Anita Gurumurthy, Nandini Chami and Amrita Vasudevan, IT for Change

March 12, 2017

As the second anniversary of the landmark judgment in Shreya Singhal vs Union of India approaches, we find ourselves in the midst of another contentious moment on content regulation. The rapidity with which the digital landscape is shifting seems to demand new frames for revisiting what was once deemed settled. The present controversy pertains to the stance assumed by the Supreme Court of India in two public interest litigations (PILs) that it is currently hearing. One is a petition filed by the activist Sabu Mathew George in 2008, which calls for a ban on advertisements related to prenatal sex determination on search engines. The other is a suo-moto PIL taken up by the Court in 2015, in response to a letter from the women's rights activist Sunitha Krishnan on the rampant circulation of rape videos on social networks and social media platforms.

In both instances, the Court has adopted the view that there must be proactive filtering and preemptive blocking of the content in question; content that may be seen as playing a constitutive role in acts of sex-selective abortion, or rape and aggravated sexual assault. In the words of the judicial bench hearing the suo-moto PIL on rape videos, “we want prevention, not cure” (preemptive blocking rather than post-facto take downs of content). With respect to responsibility for such filtering and blocking, the Court leans towards the position that it must be distributed between government agencies and Internet intermediaries.

Free speech activists have expressed their anxiety that in its zeal to effectively curb access to unlawful online content, the Court may be going down the path of "progressively increasing censorship", which will end up rendering "entire swathes of the Internet off-limits for everyone". The unstated concern here is that the Court may be reversing the gains of the Shreya Singhal judgment that outlawed 'excessive' and 'unreasonable' curbs on citizens' right to free speech, through two key actions: a) striking down Section 66A of the Information Technology Act 2000, which penalised speech of a 'grossly offensive' or 'menacing' character, for its arbitrary and vague wording that facilitated misuse by state agencies; and b) reading down the intermediary liability guidelines (2011) by completely taking away the discretionary powers of Internet platforms to implement suo-moto take downs of illegal content. Instead, the Court explicitly held that content take downs were permissible only on the basis of specific judicial or executive orders.

At first glance, it appears that in these two cases, by pushing for a greater role for intermediaries in content regulation, the Court is rolling back this liberal, expansive 'safe harbour' regime and reintroducing draconian censorship. But is that really the case? A closer reading reveals a different picture.

According to the Census of India 2011, the child sex ratio in the country was 918 girls per 1000 boys, the lowest level recorded since independence. In some districts of the country, the sex ratio is as low as 774. The Pre-conception and Pre-natal Diagnostic Techniques (Prohibition of Sex Selection) Act, 1994 (PC-PNDT Act) seeks to address this deep patriarchal malaise by regulating sex-selective abortion. In the Supreme Court case concerning content that violates Section 22 of the PC-PNDT Act, which prohibits advertisements relating to pre-natal determination of sex, the Court ordered the government to constitute a single-point nodal agency to receive complaints regarding "anything that has the nature of an advertisement or any impact in identifying a boy or a girl in any method, manner or mode by any search engine" and to act on such complaints by requiring the concerned search engine to take down the content within 36 hours of notification. At the same time, the Court also insisted that the respondent Internet intermediaries (the search engines Yahoo, Microsoft Bing and Google) constitute an in-house mechanism to proactively filter and block any content that violates the letter and spirit of Section 22 of the PC-PNDT Act.

In issuing these interim orders, the Court seems to have implicitly adopted a broad interpretation of the term 'advertisement'.[1] This is also revealed by its directions to the respondent intermediaries, where the Court recommends the blocking of specific websites advertising such services, as well as keyword filtering. Where intermediaries are uncertain about the 'legality' of a particular piece of content, they are to approach the single-point nodal agency for clarification. In this case, the Solicitor General of India reiterated the need for auto-blocking. The advocate for Google India Pvt Limited submitted that the respondent could comply with the directions on banning sponsored adverts and on blocking content notified as illegal by a government agency from appearing in its search results. Auto-blocking, however, was not possible, as Google was not in a position to develop in-house mechanisms to prohibit such content. In his own words, "You cannot have a preventive blockage. You can have curative blockage." In addition, the counsels for all respondent intermediaries argued that auto-blocking could weed out perfectly legitimate content, resulting in over-censorship.
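The over-censorship worry is easy to see concretely. The sketch below is a purely illustrative Python example of naive keyword filtering; the keyword list and matching logic are our own assumptions, not any respondent's actual mechanism. It flags lawful discussion of the PC-PNDT Act just as readily as an advertisement for an illegal service:

```python
# Illustrative sketch of naive keyword filtering. The keyword list and
# matching logic are hypothetical, not any search engine's actual system.

BLOCKED_KEYWORDS = ["sex determination", "gender selection", "prenatal sex test"]

def is_blocked(text: str) -> bool:
    """Flag the text if any blocked keyword appears anywhere in it."""
    lowered = text.lower()
    return any(keyword in lowered for keyword in BLOCKED_KEYWORDS)

# An advertisement for an illegal service is caught...
print(is_blocked("Fast, discreet prenatal sex test kits"))                  # True

# ...but so is lawful speech about the law itself: precisely the
# over-censorship the respondent intermediaries warned the Court about.
print(is_blocked("The PC-PNDT Act bans advertising of sex determination"))  # True
```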

In the suo-moto PIL on the circulation of gang rape videos, the Court has adopted a similar stance on the need for preemptive blocking, asking the government and Internet intermediaries to evolve a mechanism that can prevent the very upload of videos of sexual offences.[2] In their responses to the Court, the government as well as the respondent intermediaries cited technical impediments to the creation of any such system, considering the sheer volume of uploads that takes place on the Internet every day. Instead, they emphasised the need to opt for speedy content take down mechanisms. The government has offered to set up a nodal agency that would focus on blocking such videos uploaded on social network and social media platforms. Google, as a respondent intermediary in this case, has implied that a notice and take down system may be all that is possible, arguing that it is not technically feasible to develop a system that crawls through billions of web uploads every day in order to "nip such content in the bud".

Understandably, in both cases, Internet intermediaries have resisted the idea of creating an in-house mechanism for content filtering and blocking. They have sought the preservation of the existing legal framework that limits their liability to removing those specific pieces of content with respect to which take down orders have been issued by the executive or judiciary. Of course, the political economy imperatives that make Internet intermediaries push back against any move to increase their responsibilities in the area of online content censorship are easily understood: negative publicity for decisions gone wrong that result in over-censorship, and the enormous costs of setting up an in-house content filtering mechanism.

What is curious in these cases, however, is the stance of the Supreme Court. Is the Court, in its attempt to curtail sexism and violence against women in online spaces, trampling the right to free speech?

The Supreme Court's stance on online censorship: right direction, faulty steering

The main objection to the Supreme Court's stance on online censorship in these two cases is that it sets us off on a slippery slope of unaccountable privatised censorship, in which the power to discern what content is legitimate and what is unlawful is passed on wholesale to in-house committees of Internet intermediaries or officers of the executive, without any accompanying checks and balances. The fear is that the course the Supreme Court is embarking on, of recommending proactive filtering and preventive blocking by intermediaries, will recast the censorship debate from a 'political' debate into a 'technical' one. The casualty, from a free speech point of view, is the space for citizen engagement, given the thin line dividing 'political dissent' from 'illegal action'. Gautam Bhatia's observation echoes this apprehension: "Today, the Court wants Google to block access to search results involving the word 'gender selection'. What will it be tomorrow? 'Secession'? 'Terrorism'?"

The Supreme Court's failure to evaluate the risks of over-censorship before recommending the setting up of new institutional mechanisms for content filtering and blocking is a major omission. However, the counter-view that there must never be any preventive content take-down or blocking measure is equally problematic, for it fails to distinguish content censorship as a measure against criminal violations of women's human rights in online spaces from content censorship as a measure against civil wrongs, such as copyright infringement. Once we acknowledge this difference, we begin to see that there can be no universal response to the question of preventing the circulation, distribution and sharing of unlawful content.

The 'notice and take down' regime that we currently use to deal with copyright infringement cannot, therefore, be an adequate response to the circulation of rape videos or the advertising of sex-determination tests. The Supreme Court of Argentina observed in 2014 that it is important to distinguish between infringing content and manifestly unlawful content when determining intermediary liability, and that for the latter, greater liability must be placed on Internet intermediaries. Accordingly, that Court took the stance that for infringing content, intermediaries need to take down content only upon receiving judicial orders, whereas for manifestly unlawful content, they must do so upon being notified by any user, even one who is not an affected party.

The Supreme Court of India seems to be adopting and extending a similar line of reasoning when it insists upon proactive action by Internet intermediaries to curb the circulation of content that is patently criminal. In the suo-moto PIL on the circulation of gang rape videos, the Bench asked the counsel for the respondent intermediaries: "Take for instance, nobody has reported (about any such material), do you act on your own to decipher it?" Surely, unlike in the case of copyright infringement, waiting for a court or executive order before blocking content such as videos of rape, child pornography and advertisements for sex-selective abortion is unacceptable. These are grievous crimes that demand a stronger response than a standard 'notice and take down' approach.

We therefore think that preemptive filtering of manifestly unlawful content is a useful direction for the Supreme Court of India to be pointing to, though keyword filtering may not be the best modality for it. It is also untenable for Internet intermediaries to hold that there are insurmountable technical limits to developing a preemptive filtering system, as this claim is belied by their own actions in tackling copyright infringement and child pornography. ISPs already do some amount of filtering when it comes to child pornography, either voluntarily or, in certain jurisdictions, because of legal obligations. Similarly, when it comes to copyrighted material, platform intermediaries are willing to cooperate with big media houses to auto-block infringing content.
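The existing practice invoked here, Content ID-style matching for copyright and PhotoDNA-style matching for child abuse imagery, rests on fingerprinting uploads against a database of known unlawful files. The sketch below is a simplified, assumed illustration of that idea, using an exact cryptographic hash for brevity; production systems rely on perceptual fingerprints that survive re-encoding and cropping:

```python
import hashlib

# Hypothetical database of fingerprints of content already adjudged
# unlawful, e.g. a rape video notified to the platform by a nodal agency.
# (This sample value is simply the SHA-256 of the bytes b"test".)
KNOWN_UNLAWFUL_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def fingerprint(data: bytes) -> str:
    """Exact SHA-256 fingerprint. Real systems use robust perceptual
    hashes so that re-encoded or cropped copies still match."""
    return hashlib.sha256(data).hexdigest()

def check_upload(data: bytes) -> str:
    """Refuse re-uploads of known unlawful content before publication."""
    if fingerprint(data) in KNOWN_UNLAWFUL_HASHES:
        return "blocked: matches known unlawful content"
    return "published"

print(check_upload(b"test"))         # blocked: matches known unlawful content
print(check_upload(b"new content"))  # published
```

Note the limit of this design: it only stops the re-circulation of content already identified as unlawful; it cannot, by itself, detect a novel video.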

So, if the same is not being done to block the upload of rape videos, it is more a question of willingness than of technical ability. Child pornography cannot be any more intolerable than videos of the rape of adult women. When developments in artificial intelligence permit us to deploy a 'Project Cease' to warn uploaders of child pornography, why can't we have a similar solution for video uploads of rape?
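In principle, a 'warn the uploader' flow of the kind gestured at here could be an escalation policy layered over fingerprint matching: block known material outright, and route novel material that a model scores as likely unlawful to human review with a warning, rather than auto-blocking it. The sketch below is entirely hypothetical; classify_video stands in for a trained model that we do not claim exists in this form, and the 0.9 threshold is invented:

```python
import hashlib

# Hypothetical fingerprints of content already adjudged unlawful
# (the sample value is the SHA-256 of the bytes b"test").
KNOWN_UNLAWFUL_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def classify_video(data: bytes) -> float:
    """Stand-in for a trained model scoring the likelihood (0.0-1.0)
    that a video depicts a sexual offence. Placeholder so the sketch runs."""
    return 0.0

def handle_upload(data: bytes) -> str:
    digest = hashlib.sha256(data).hexdigest()
    if digest in KNOWN_UNLAWFUL_HASHES:
        # Known material: block outright and report.
        return "blocked and reported"
    if classify_video(data) > 0.9:  # invented threshold
        # Novel but suspect material: warn the uploader and hold for
        # human review, rather than auto-blocking, to limit over-censorship.
        return "uploader warned; held for human review"
    return "published"

print(handle_upload(b"test"))       # blocked and reported
print(handle_upload(b"new video"))  # published
```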

And finally, the technology for filtering and blocking in these cases must be recognised for what it is: an instrument that aids the enforcement of the law, not a replacement for it. The code at the heart of such a filtering and blocking mechanism will reflect the sophistication, or lack thereof, of our understanding of the problem. So, instead of resisting efforts to find a technical route to curbing manifestly unlawful acts, a more productive approach may be to work to ensure that such tools are founded upon a gender-just, rather than patriarchal, code, and are subject to social scrutiny and debate.


Notes


[1] Since this article was written, the Supreme Court on 13 April 2017 delivered another judgment in the case, in which it distinguished advertising content from organic search results and limited intermediary liability to the former. Keyword filtering, the Court reasoned, could lead to the curtailment of the right to knowledge and of the freedom of expression, and it hence strictly interpreted the spirit of Section 22 of the PC-PNDT Act, though without actually defining what constitutes an 'advertisement'. http://supremecourtofindia.nic.in/FileServer/2017-04-13_1492086489.pdf


[2] Since this article was written, the Supreme Court has constituted an expert panel consisting of government officials and representatives of Google, Yahoo, Microsoft and Facebook to find technological solutions to prevent videos of rape and child pornography from being uploaded online. http://www.thenewsminute.com/article/shame-rapist-campaign-sc-forms-panel-block-rape-videos-online-59071
