Submission to Special Rapporteur on the Rights to Freedom of Peaceful Assembly and of Association

Response to the Call for Inputs by the Special Rapporteur on the Rights to Freedom of Peaceful Assembly and of Association for his report on Women and Girls and FoAA, to be presented at the 75th session of the General Assembly.

The inter-related political rights of expression, assembly, association, petition and protest are the cornerstones of a democratic society. The vibrancy of a democracy is predicated on the universal enjoyment of these rights. In the digital paradigm, as Maina Kiai, the UN Special Rapporteur on the Rights to Freedom of Peaceful Assembly and of Association (FoAA), noted in his thematic report to the United Nations Human Rights Council (UN HRC) in 2012, the Internet, particularly social media, has emerged as a basic tool for the exercise of these freedoms. The digital rights community has consistently articulated that in the Internet era, the freedom of assembly and association is nearly impossible to achieve without “Internet freedom” -- the right to an Internet that is free of censorship, universally accessible, and respectful of privacy. Freedom of information and expression online is also essential for the right to full participation in public life, as the former UN Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression, Frank La Rue, has observed.

For women, girls and individuals from non-normative gender locations who have been historically marginalized and excluded from the public sphere, the Internet has not only been a critical site of self-building and self-exploration to claim the ‘personal as the political’, but equally, a site for forging a prefigurative politics of feminism.

Our contribution to the Special Rapporteur’s call for inputs reflects on critical issues for the realization of FoAA by women and girls in the digitally-mediated public sphere, from the standpoint of the indivisibility of human rights online and offline, drawing upon our research and action partnerships.

Issue 1. Lack of access to the Internet and household surveillance cultures impede FoAA of women and girls in digitally-mediated public spheres.

As of 2020, a significant gender divide still characterizes mobile phone and internet use in most developing countries in the world, with the notable exception of Brazil. India, similar to its South Asian neighbors Bangladesh and Pakistan, has some of the highest gender gaps in mobile internet use. The digital gender gap is 51% in South Asia where, for instance, only 14% of Indian women have smartphones (compared to 37% of men). As of 2020, only 21% of women in India were found to be using mobile and internet services. This is significant, as statistical analysis in earlier studies has revealed a strong correlation between access to the Internet and public-political participation. Women who are active in civic and public life offline are more likely to use the internet, and to use it more frequently, than others, suggesting a positive correlation between feminist agency and access to information and new imaginative horizons. Gendered cultural norms are a major barrier to access. Caste and clan leaders have often issued diktats prohibiting women of their communities from accessing the Internet. Family members tightly control women and girls’ use of the Internet and permit access only through shared devices. In IT for Change’s recent research examining the Internet use patterns of 800 college students across three states in India, we found numerous instances of male anxiety about female agency (especially sexual agency) leading to continuous social surveillance and household monitoring of women by male family members. We heard numerous accounts of young women forced to share their social media accounts and passwords with male relatives, thus having to give up their right to free association.

In addition to restrictions at home, young women also find themselves navigating gendered restrictions on smartphone use in educational institutions. In 2019, an academic institution in the southern Indian state of Kerala introduced a rule prohibiting the use of mobile phones in the women’s hostel between 10 pm and 6 am. The unreasonable restriction was struck down by a High Court order upholding women’s right to Internet access as a fundamental freedom. Individuals from non-normative Sexual Orientation and Gender Identity (SOGI) locations face invasive and dehumanizing surveillance from family members. It is in spite of such pervasive household and social surveillance cultures that women, girls, and gender minority individuals continue to stake their claim to the public spaces of the Internet – a moving testimony to their tenacity and fortitude.


  • The Special Rapporteur (FoAA) should call upon states to recognize that Internet access is an enabler of women’s political freedoms. Women, girls and gender minorities can realize their freedom of assembly and association in the digital conjuncture only when they enjoy “universal, acceptable, affordable, unconditional, open, meaningful and equal access” to the Internet.
  • The Special Rapporteur (FoAA) should urge states to adopt policy measures such as public broadband initiatives, mobile data allowances and digital literacy programs to promote women’s active citizenship. States must also institute appropriate legal frameworks to safeguard the rights to encryption and online anonymity, and ensure freedom from online misogyny.

Issue 2. Draconian surveillance measures pose a huge risk to the FoAA of women and girls, especially in the state of exception naturalized by the COVID crisis.

Surveillance technology has always produced a chilling effect on freedom of speech and FoAA, particularly for women, girls and LGBTQI individuals who engage in political and sexual expression that defies official state ideology, destabilizing the gender social contract.

Women leading political action are increasingly facing pushback the world over. This is especially true for religious minorities, students, SOGI minorities, and journalists recording daily life during the coronavirus pandemic. The indiscriminate use of drones and facial recognition technologies, as well as mass surveillance measures without due legal backing, to single out protesters and bring false charges against them gravely impedes women’s right to democratic and peaceful protest.

The state of exception produced by the COVID-19 crisis has enabled governments around the world to normalize invasive mass surveillance practices in the name of disease surveillance. For instance, there are currently over 19 contact tracing apps deployed by various state departments and agencies in India, all functioning in the absence of “necessity and proportionality” safeguards or a personal data protection law. While Brazil and Thailand are due to have data protection laws come into force in 2020, South Korea has three data protection bills awaiting approval and India’s controversial personal data protection bill, proposed in 2018, is still pending in Parliament.

In a datafied governance context, the ability of the state to combine and recombine information about citizens through this array of surveillance technologies is terrifying for citizens at the margins. For genderqueer individuals, women and girls from minority communities, and women human rights defenders, the normalization of digital surveillance technologies in times of COVID-19 will exacerbate the state’s penalization of their acts of dissent.

  • The Special Rapporteur (FoAA) should adopt a feminist approach to tackling the problem of surveillance, by recognizing that personal data protection frameworks are not sufficient to address the growing dangers of mass surveillance for individuals in marginalized locations. The need of the hour is for states to reform their legal frameworks on surveillance to ensure compliance with the foundational principles of legitimacy (a clear legal basis), necessity and proportionality, with adequate safeguards against abuse of powers. Advancements in digital surveillance technology should be deliberately put on hold where there are no clear safeguards in sight to prevent state overreach. A noteworthy instance is facial recognition technology, where the outright bans on deployment adopted by cities in the US, and the moratoria debated in the EU, may be the way to go.
  • In light of the coronavirus pandemic, the Special Rapporteur (FoAA) must call on all governments to ensure that the grounds of legality, necessity and proportionality are met and justified by a legitimate public health objective, and that the pandemic does not become an excuse for indiscriminate mass surveillance.

Issue 3. Protectionist approaches to online content governance and intermediary irresponsibility push women back from public spaces on the Internet.

Online sexism, misogyny and gender-based cyberviolence have assumed ominous proportions across the world, with not even women Cabinet Ministers, public figures, or first responders in a global pandemic being spared.

At the height of the COVID pandemic, in May 2020, the exposé of the Bois Locker Room, an Instagram group run by teenage boys, made headlines in India, crucially demonstrating how inscrutable male homosocial spaces on the Internet perpetuate the objectification of young girls and women, reifying toxic masculinities. Where women’s first-order claim to being human is not recognized, how can any meaningful exercise of FoAA remain?

Dehumanizing gender-based hate speech thrives in the public spaces of the Internet due to archaic laws that reinforce legal protectionism and the expedient stances of social media companies. Legal provisions and notice and takedown regimes that give social media companies immunity (safe harbour) from primary liability for objectionable content have created a wild west. Without any responsibility to create effective mechanisms to take down harmful content, social media platforms often take no action on user complaints of hate speech and abuse. An Equality Labs study found that Facebook failed to remove 93% of reported hate speech posts targeting Indian caste, religious, gender and queer minorities.

As observed by IT for Change, “the siloing and parsing of narratives in a digitally mediated public sphere into fleeting, monetizable data bytes has not augured well for women’s claims.” Entrenched misogyny is virally amplified in digital spaces, while feminist digital counterpublics get lost in eddies.

  • The Special Rapporteur (FoAA) must condemn the extension and amplification of misogyny, discrimination and violence online, and coordinated campaigns (sometimes state-supported) which aim to silence, discredit and intimidate women and shrink civic space online.
  • The Special Rapporteur (FoAA) should recommend that states enact legal frameworks against gender-based cyberviolence that are reflective of women’s agency, consent and choice, grounded in the cornerstones of equality, dignity and privacy.
  • The Special Rapporteur (FoAA) should recommend that states enact intermediary liability legislation that effectively holds platforms accountable for content governance without privatizing censorship. A notice and notice approach is most appropriate in this regard. As the Brazilian experience with the Marco Civil da Internet and Canadian copyright law demonstrates, a notice and notice regime has the advantages of: 1) providing the intermediary with more information to add context to a request for removal of content, 2) offering the author the opportunity to remove the content themselves, and 3) triggering fault-based liability on the intermediary only when there is a court order for restricting access. In this context, the EU Commission’s Code of Conduct for platform companies on countering illegal hate speech online is a necessary step in the right direction for addressing the information asymmetry around hate speech reporting and actioning.
  • The Special Rapporteur (FoAA) should recommend that platform companies be cognizant of and uphold their corporate responsibility for human rights. Not only should they ensure their content moderation norms respond in culturally sensitive and contextual ways, but they must also engage in human rights audits to assess the human rights impacts of their presence in any market. Platforms must provide transparency reports for takedown requests.
  • The Special Rapporteur (FoAA) should recommend that platform companies undertake techno-design measures to address sexism at scale, such as developing hate speech signifiers in non-English languages, tackling sexist biases in AI training datasets, and deploying algorithms that can check the virality of sexist content.

Issue 4. The effects of Internet shutdowns are pernicious for women from locations that are already marginalized.

Across the world, cutting off access to the internet is becoming a distressingly common response to civil unrest. India has one of the highest rates of internet shutdowns in the world. Areas under Internet shutdown are doubly affected in the coronavirus crisis, which has made access to political and economic freedoms almost entirely contingent on access to the internet. The international NGO Human Rights Watch has observed that Internet shutdowns can have a greater impact on women; lesbian, gay, bisexual, and transgender individuals; people with disabilities; and older people who may rely on the internet for online support services, which are especially vital during the pandemic.

  • The Special Rapporteur (FoAA) should call on all governments to immediately stop all internet shutdowns. The UN Human Rights Committee, in General Comment 34, emphasizes that restrictions on speech online must be strictly necessary and proportionate to achieve a legitimate purpose. Internet shutdowns have a sweeping impact on all users, restricting access to information and emergency communications for all people during crucial moments. The UN Human Rights Council, in its landmark resolution A/HRC/32/L.20, has condemned unequivocally “measures to intentionally prevent or disrupt access to or dissemination of information online in violation of international human rights law” and has called on all States to refrain from and cease such measures.

Recommendations flowing from the above observations:

In light of our observations on safeguarding and expanding women’s right to exercise their freedom of association and assembly in the Internet era, we recommend that the Special Rapporteur (FoAA) make the following recommendations to state and business actors.

States

  • States should recognize Internet access as an enabler of the human rights of women and girls, including their right to free expression, assembly and association. In addition to launching universal access programs and investing in women’s digital literacy, states must institute appropriate legal safeguards to create accessible and safe digital public spaces free of sexist and misogynistic hate, intimidation or violence. The right to encryption, online anonymity and personal data protection must be seen as integral to the right to access.
  • States must desist from draconian surveillance measures. Any digital surveillance technology introduced, including for public health tracking, must be bounded by the principles of legitimacy, necessity and proportionality. States need to adopt a rights-based Internet policy and regulatory framework that addresses emerging developments in digital surveillance technology, while specifying clear no-go areas such as the use of facial recognition technologies.
  • States should ensure that pre-digital laws addressing violence against women are updated to reflect digital realities. Such legislation should be grounded in privacy, equality and dignity principles, stepping away from a “patriarchal protectionism” framework.
  • States should update their intermediary liability frameworks so that online platform companies are obligated to uphold their “duty of care” to create safe digital spaces free of sexism and gender-based hate.
  • States should completely desist from Internet shutdowns.

Platform companies

  • Platform intermediaries must be cognizant of and uphold their corporate responsibility for women’s rights as human rights, in accordance with the UN Guiding Principles on Business and Human Rights.
  • Platforms need to ensure their content moderation norms respond in culturally sensitive and contextual ways without undermining the dignity and agency of women and gender minorities. Content governance mechanisms of platforms must be subject to annual human rights audits in the markets in which they operate. Platforms must provide transparency reports for takedown requests.
  • Social media platforms must put in place techno-design measures that effectively respond to misogyny at scale.