Archismita Choudhury, Breakthrough India, New Delhi
The online has helped grassroots movements collaborate with one another, empowering feminists from the global South to speak out, and speak to each other. But there are challenges, especially in the form of gender-based cyber violence. Social media banks on virality, and although it can have positive effects, it can also have severe negative effects. Breakthrough India remixed a popular Tamil/Hindi song with a feminist lens which went viral last year. Although many enjoyed the video, the women who starred in the video were at the receiving end of massive amounts of trolling that tended towards cyberbullying. Gender trolling is particularly vitriolic and can involve multiple coordinated attacks. It usually spans a long period, effectively preventing women from participating in public life. Communities of support can be very helpful in coping with trolling.
Breakthrough India has worked on digital codes of conduct in its campaigns with Facebook and Twitter.
Ahmar Afaq, Symbiosis Law School – Hyderabad and Mohd.Imran, Aligarh Muslim University, Murshidabad Centre
In video games, there is a strong sense of identification between the player and the character. Video games have become popular social artifacts with the potential to shape behavior, attitudes and identities of players. Just like pornography, video games can become extremely addictive because they promise instant rewards. Further, the increasing realism employed in these games necessitates a discussion on the violence against women that is often portrayed through them.
Limiting access to such violent video games is fraught with numerous legal challenges. In the US, for example, a California state legislation banning the sale of violent video games to minors without parental consent has been struck down by the Supreme Court for its unconstitutionality. The Supreme Court ruled that just like other media, video games are covered by the First Amendment protection of free speech. Self-regulation through encouraging voluntary rating of games by the video game industry is now being explored.
Radhika Radhakrishnan, TISS, Mumbai
Radhika Radhakrishnan presented her primary research on the effectiveness of smart device-based virtual assistants in responding to user queries for gender-based violence crisis support, in the Indian context. The responses of five virtual assistants – Siri, Google Now, Bixby, Cortana, and Alexa – to a set of standardized help requests for sexual and cyber violence were evaluated against three criteria:
1) recognition of the concern raised by the user
2) use of empathetic language in the response, and
3) the ability to provide directions to a helpline number
The results were mixed. For sexual violence, only Siri and Alexa were able to provide personalized responses. The other three failed to identify the crisis or their response was incoherent. For cyber violence, none of the virtual assistants even recognized the concerns raised. Gender sensitivity in design is a critical imperative and the priorities of the designers and engineers, and the choices of those who develop these technologies must be scrutinized.
Crisis support mechanisms in India are weak. There is low awareness and victims do not approach support services because of stigma, loss of confidentiality and fear of retaliation. Given the increased penetration of the smart phone, virtual assistants have the potential to aid crisis support. Artificial Intelligence offers techniques that allow virtual assistants to converse with humans, answer questions, make searches etc. Unfortunately, in India, virtual assistants are not able to coherently recognize gender-based violence nor point to resources that can help during crises.
Ingrid Brudvig, Web Foundation, South Africa
Technological architectures designed by digital platforms are shaping social structures by encoding social hierarchies through data-based profiling. There is no accountability or transparency in the data they mine from individuals. The artifacts built from that data become the means of manipulating an individual, by nudging her on how to think and act. There are many ethical feminist concerns around such data practices, such as the absence of meaningful consent, the opacity of terms of service, the reproduction of social hierarchies, including that of gender through the use of Artificial Intelligence solutions, the lack of gender by design, and a post-facto ‘add-gender-and-stir’ approach.
In the case of online violence against women, we need to think of the accountability mechanisms that are needed to ensure that platforms comply with international human rights standards. It is important that platform intermediaries do not end up resorting to broad or excessive measures that infringe upon freedom of expression or privilege a particular ideological standpoint that reinforces Western bias. Some recommendations for better governance by platforms would include following due process like timely and effective response, putting out transparency reports in cases of gender-base violence they have dealt with and of the action that was taken.
Kalpana Sharma, who chaired the session, raised some key concerns, building upon the presentations, before opening up the floor to the audience.
– We must interrogate why the discourse in online spaces is the way it is, rather than limiting ourselves to saying that new technologies have triggered a violent reaction. In a capitalist economy, what yields profit stays, and if violence against women yields such profit, it will continue to be sold and consumed.
– Pursuing a technological fix to online gender-based violence maybe a dead end.
Comments from the floor
1. In early 2017, the Ministry of Information Technology had mandated that all mobile phones should have a built-in panic button connected to GPS. Phone companies, however, contended that they would not be able to recall the mobile phones that were already dispatched for sale. It is important to dialogue with smart phone manufacturers on how they can improve their responses to gender-based violence, and if possible, the research on virtual assistants should be taken to smart phone manufacturers.
2. The educational sector has a role to play in developing awareness around content – such as the viral video games that reach minors. The sector must also input into regulation developed around these video games
3. Safety apps invariably adopt a protectionist, rather than a rights-based approach. With most of these apps, a large amount of personal data – location data, photographs, contacts etc – are being handed to the company, making them honey-pots for hacking.