This is one of the narratives of communities impacted by data injustice, published in the research report on data justice by the Digital Empowerment Foundation.
The injustices caused by the algorithms of platform and gig-economy apps have been documented previously. In India, workers in the gig economy are counted as “clients,” depriving them of many of the protections that labour laws provide. In this unorganised sector, Shaik Salauddin of the Indian Federation Of App Based Transport Workers (IFAT) is one of the leaders organising and unionising people working on ride-hailing and delivery apps. We spoke to him in detail about the algorithms that cause these injustices.
In December 2019, the Indian Parliament passed the controversial Citizenship Amendment Bill, along with the government’s commitment to enforce a National Register of Citizenship. As Booker Prize winning author and activist Arundhati Roy put it, “Coupled with the Citizenship Amendment Bill, the National Register of Citizenship is India’s version of Germany’s 1935 Nuremberg Laws, by which German citizenship was restricted to only those who had been granted citizenship papers—legacy papers—by the government of the Third Reich. The amendment against Muslims is the first such amendment.” Noting the use of an automated tool to decide the lineage of people in Assam, we spoke to Abdul Kalam Azad, a researcher from Assam now at Vrije Universiteit Amsterdam, who had looked in detail at the issues and exclusions created by the NRC in Assam. Learning of the exclusion of trans people from the same list (a community already facing an undemocratic law in the Trans Act), we spoke to two activists from the trans community: Sai Bourothu, who had worked with the Queer Incarceration Project and the Automated Decision Research team of The Campaign to Stop Killer Robots, and Karthik Bittu, a professor of Neuroscience at Ashoka University, Delhi and an activist who had worked with the Telangana Hijra, Intersex and Transgender Samiti.
Another exclusion we noted in our primary research was that of homeless people from data enumerations. We spoke to Jatin Sharma and Gufran, who are part of the Homeless Shelter at Yamuna Ghat, about these exclusions and how they lead to homeless people being denied basic healthcare and life-saving TB treatment.
Four researchers, activists and civil society leaders who had done considerable work on data-related exclusions, surveillance, and identification systems such as Aadhaar offered their perspectives on the debates, conversations and potential reimaginings of data injustices: Srinivas Kodali, independent activist and researcher; Nikhil Dey, of the Mazdoor Kisan Shakti Sangathan; Apar Gupta, lawyer and director of the Internet Freedom Foundation; and Rakshita Swamy, an NLU professor who also heads the Social Accountability Forum for Action and Research.
Data Protection, Surveillance and Privacy: Legal Perspectives and Community Resistances
In a larger group conversation, we brought in experts from civil society, law and movements, who flagged several issues regarding the state of data in India. Apar, lawyer and director of the Internet Freedom Foundation, pointed to the lack of cohesiveness in the policy and strategy documents developed by various government departments and states. There are no social or independent audits that demonstrate the effectiveness of outcomes in AI-based applications and deployments, as in the example of the Aarogya Setu app (India’s only app to deal with Covid-19 at a pan-India level). Most of these technologies become vaporware: something announced promisingly but never really delivered. Utility audits are therefore necessary for AI-based systems, as with any technology that requires significant public expenditure and collects a lot of data on the public. There is also the larger issue of the lack of a strong data protection law, which leaves open the possible deployment of AI-based systems that use personal data for targeting. Without legal limits defined, there is no way to enforce data-related concerns, and such enforcement needs to be done via an independent body. As Rakshita pointed out, there should also be a legal audit that goes beyond a broad ethical checklist. As Nikhil of MKSS explained, when large data systems are being built with public money, what will ensure they are used to proactively provide entitlements to people instead of delaying them and leading to exclusion? Another question Rakshita raised is how to make the conversation around data justice, and all the issues regarding its impacts on development and democracy, go beyond domain experts, policymakers and lawyers, and bring grassroots civil society groups, grassroots movements, trade unions and people themselves into the conversation. The imagination required there will be different, but crucial.
There must be a fine balance between where data can be provided to supplement decision making and where data actually makes the decision itself. There have to be clear safeguards and clear processes marking where data only facilitates a decision and where it takes on the decision entirely. It is very hard to separate data from the right to information; data is a subset of that right, and one that is growing ever larger. There is a need to look at how decision making vis-à-vis data takes place: at the gathering stage, at the aggregation and amalgamation stage, and at the stage of use.
The primary points that these law and policy experts raise concern the pillars of participation and power. Without a conversation that involves people at the grassroots, participation is violated: data work needs to be democratised to ensure the inclusion of all voices. An understanding of the pillar of power, in turn, demands a deeper critique of the embedded power structures of the state.