Anita Gurumurthy participated in a session on Technology and Surveillance for the Janta Parliament, a virtual people's parliament, on Tuesday, August 18, 2020. Some of the participating organisations and individuals presented resolutions before the parliament that were voted on at the end of the session. IT for Change presented a resolution on a framework law for digital rights.
Anita's statement at the session, summarising IT for Change's resolution, has been quoted below. Watch the complete video of her statement here.
Part 1
The crisis of democracy we are going through is not an accident – it is part of what Latin American scholars have referred to as a process of de-democratisation in neoliberal capitalism – a systematic way by which institutions are undermined for greater centralisation of power by the state.
In our context, we see the many facets of the surveillance state:
1. Broadly, we have a Personal Data Protection (PDP) Bill that is fraught because of how it gives sweeping powers to the state. It also falls well short in its vision of institutional independence for the Data Protection Authority (DPA).
Section 35 of the Bill confers on the Central Government the power to exempt any agency of the Government from the application of any or certain provisions of the Bill. This gives the executive broad leeway to circumvent the application of the PDP Bill.
Section 86 grants the Central Government the power to issue to the DPA such directions as it may think necessary in the interest of the sovereignty and integrity of India and the security of the state. The Central Government thus has the power to override the discretion and expertise of the Authority.
Section 91 specifies that “The Central Government may, in consultation with the Authority, direct any data fiduciary or data processor to provide any personal data / NPD [non-personal data] to enable better targeting of delivery of services or formulation of evidence-based policies by the Central Government.” De-anonymisation of data is not penalised under the Bill; in fact, the Bill does not even acknowledge the possibility of de-anonymisation.
Section 38 permits the Authority to exempt personal data processing for research, archiving or statistical purposes from “any application of any of the provisions of this Act”, subject to regulations. The rights of data principals and the obligations of data fiduciaries are not spelt out, and it is likely that data sets from India can easily be used for ostensible research – for instance, to train American AI.
2. We also have a situation where the draft Intermediary Liability (IL) guidelines framework is preoccupied with terrorism, requiring the “tracing out of originator of information”. There is a genuine need to involve intermediaries in investigating crimes – especially the hate industry – but this calls for other kinds of regulatory approaches; it requires us to think about the public sphere and how we can curtail the power of the corporate-controlled public agora. With regard to misinformation, however, under the current proposals we may not really be able to trace back much beyond troll armies and chat bots, while seriously eroding privacy.
Social media governance needs a dedicated framework – content governance cannot be piecemeal.
3. The interest expressed by the National Crime Records Bureau (NCRB) in Automated Facial Recognition System (AFRS) calls for great vigilance.
Notably, the European Parliament has stated that it will not use facial recognition technology for purposes such as detaining suspects.
Democratic lawmakers in the US have also put forth a Bill to ban the use of facial recognition by law enforcement.
Additionally, Microsoft, IBM and Amazon have taken stances against the use of facial recognition software by the police. Microsoft has noted that facial recognition software can be put in place only once there is a specific law in place providing for its review.
4. What the Covid moment tells us is that our inhuman tech solutionism has not only negated the lived experience of the poor, but has also actively inflicted misery on people and disenfranchised them.
Contact tracing apps were made mandatory, but this requirement was later rescinded due to public outrage.
As SFLC pointed out, Aarogya Setu’s privacy policy enables the government to share personal information uploaded to the cloud with “such other necessary and relevant persons” to “carry out necessary medical and administrative interventions”.
Part 2
Surveillance and tech also need to be understood in relation to another kind of social crisis – the new way in which economic production is organised around Big Data and AI, enabled through what may be called the cornering of the “intelligence premium”.
This poses a serious threat to people’s sovereignty and their ability to reap, in ways they determine, the benefits of the wealth that is data. It also appropriates labour power – not only surveilling workers but also expropriating their data for private profit. None of the draft labour codes have any provisions regarding the personal data of workers.
Therefore, the only legal provision governing workers’ data rights is Section 13 of the PDP Bill, which itself rests on an incorrect and dangerous assumption that employers can serve as fiduciaries of the personal data of their employees.
Additionally, the Bill provides no safeguards against the imbalance of power between employers and employees.
All workers face this problem, but platform workers are particularly affected by it.
The State is attempting to promote digital public goods, but we know that the right to access remains a long way off, and meaningful access has been an unrealised project for 25 years.
The consultation paper on National Open Digital Ecosystems (NODEs) – data infrastructures for the digital common good – put forth by the Ministry of Electronics and Information Technology (MeitY) requires a clearly articulated governance framework that explicitly adheres to constitutional rights and values.
The NODE paper fails to clarify whether a NODE can be entirely privately run. There is also a lack of clarity regarding the terms of operation between the public and private sectors.
This could put public data architectures in private hands, furthering intelligence capital. And so, we should ensure that openness does not lead to a tragedy that undermines publicness.
At the same time, public standards are vital, because a corporation like Facebook can afford to build user trust through its own currency, but small businesses cannot. They need a UPI. Economic democracy is as important as political democracy.
The critical missing piece in the discussion is the huge governance deficit with respect to data as a resource. Big Tech has stolen the aggregate data of people, which needs to be reclaimed.
The idea of data as a resource needs to be engaged with – its value, the interests around it, and its rapacious commodification. As a feminist, I think it absolutely necessary to grapple with the question of data’s multivalence. Otherwise, we will be like ostriches, burying our heads in the sand.
One thing I want to point out, as I conclude, is that in our fight against neoliberal capital, and against jingoism and state impunity, we need to be sure not to deny the idea that value inheres in data – this value can be private, public or social – and this is where people come in. How do we create the boundaries to nurture the social, preserve the public and limit the private?