This is one of the narratives of communities impacted by data injustice, published in the research report on data justice by the Digital Empowerment Foundation.
The injustices caused by the algorithms of platform and gig-economy apps have been documented before. In India, workers in the gig economy are counted as “clients,” depriving them of many of the protections that labour laws provide. In this unorganised sector, Shaik Salauddin of the Indian Federation of App-based Transport Workers (IFAT) is one of the leaders organising and unionising people working in ride-hailing and delivery apps. We spoke to him in detail about the algorithms that cause these injustices.
In December 2019, the Indian Parliament passed the controversial Citizenship Amendment Bill, along with the government’s commitment to enforce a National Register of Citizenship. As Booker Prize-winning author and activist Arundhati Roy put it, “Coupled with the Citizenship Amendment Bill, the National Register of Citizenship is India’s version of Germany’s 1935 Nuremberg Laws, by which German citizenship was restricted to only those who had been granted citizenship papers—legacy papers—by the government of the Third Reich. The amendment against Muslims is the first such amendment.” Noting the use of an automated tool to decide the lineage of people in Assam, we spoke to Abdul Kalam Azad, a researcher from Assam now at Vrije Universiteit Amsterdam, who has looked in detail at the issues and exclusions created by the NRC in Assam. Learning of the exclusion of trans people (already facing an undemocratic law like the Trans Act) from the same list, we spoke to two activists from the trans community: Sai Bourothu, who has worked with the Queer Incarceration Project and the Automated Decision Research team of the Campaign to Stop Killer Robots, and Karthik Bittu, a professor of Neuroscience at Ashoka University, Delhi, and an activist who has worked with the Telangana Hijra, Intersex and Transgender Samiti.
Another exclusion we noted in our primary research was that of homeless people from data enumerations. We spoke to Jatin Sharma and Gufran of the homeless shelter at Yamuna Ghat about these exclusions and how they lead to homeless people being denied basic healthcare and life-saving TB treatment.
Four researchers, activists and civil society leaders who have done considerable work on data-related exclusions, surveillance, and identification systems such as Aadhaar offered their perspectives on the debates, conversations and potential reimaginings of data injustice: Srinivas Kodali, independent activist and researcher; Nikhil Dey of the Mazdoor Kisan Shakti Sangathan; Apar Gupta, lawyer and director of the Internet Freedom Foundation; and Rakshita Swamy, an NLU professor who also heads the Social Accountability Forum for Action and Research.
‘AI for Social Good’: Experiences from Rural India
Several AI-powered projects, run in collaboration with various government departments, are currently in development or deployment. One such project was an app designed for ASHA workers that helped them record accurate, timely, geo-tagged and tamper-proof weight estimates of newborns under a month of age.
The World Health Organization classifies newborn infants weighing less than 2,500 g as Low Birth Weight (LBW) infants.21 It is also estimated that a quarter of Indian newborns are LBW,22 and this is directly linked to a baby’s survival chances. The WHO’s goals also include cutting the incidence of LBW. Accurately identifying low birth weight babies is therefore an important first step towards providing them with further healthcare and decreasing child mortality rates. This proved to be a challenge in practice: when records were examined, almost all babies were recorded as weighing exactly 2.5 kg, the minimum weight that does not fall under the LBW criterion. It was clear that numbers were being fudged at some level of the data collection and recording process.
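The pattern that gave the fudging away is easy to surface. As a minimal sketch, using hypothetical numbers rather than data from this study, a few lines of Python show how a spike of records at exactly 2,500 g stands out against genuinely measured weights:

```python
from collections import Counter

# Hypothetical register entries in grams; genuinely measured weights
# spread continuously, while fudged ones cluster at the cutoff.
recorded_weights = [2500, 2500, 3100, 2500, 2500, 2900, 2500, 2750, 2500, 2500]

counts = Counter(recorded_weights)
share_at_cutoff = counts[2500] / len(recorded_weights)

# A large spike exactly at the 2,500 g LBW threshold suggests values
# were written down to clear the cutoff rather than measured.
print(f"Share recorded at exactly 2500 g: {share_at_cutoff:.0%}")
```

In honestly measured data, birth weights spread across a continuous range; a register where 70 per cent of entries sit on the exact threshold is a statistical fingerprint of manipulation.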
One planned solution consisted of software that converts a video, taken with the smartphones the ASHA23 workers are provided with, into a 3D mesh of the baby, which the software can then use to accurately estimate the baby’s weight.24 Here, the AI solution is one part of a technology stack that has to fit into the workflows of everyone involved.
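While the mesh reconstruction itself involves sophisticated computer vision, the final estimation step can be pictured simply: recover the body’s volume from the mesh and convert it to weight using an assumed tissue density. The sketch below is purely illustrative, standing in for the reconstructed mesh with an ellipsoid and using hypothetical measurements; the report does not describe the app’s actual models or parameters.

```python
import math

def ellipsoid_volume_cm3(a_cm: float, b_cm: float, c_cm: float) -> float:
    """Volume of an ellipsoid with semi-axes a, b, c (in cm),
    used here as a crude stand-in for a reconstructed 3D mesh."""
    return 4.0 / 3.0 * math.pi * a_cm * b_cm * c_cm

# Assumption: average tissue density close to that of water.
TISSUE_DENSITY_G_PER_CM3 = 1.0

# Hypothetical semi-axes approximating a newborn's body.
volume_cm3 = ellipsoid_volume_cm3(24.0, 6.0, 5.5)
weight_g = volume_cm3 * TISSUE_DENSITY_G_PER_CM3

# Anything below 2,500 g would be flagged as Low Birth Weight.
print(f"Estimated weight: {weight_g:.0f} g (LBW: {weight_g < 2500})")
```

Because the estimate is computed from a geo-tagged video rather than typed in by hand, it is far harder to nudge to exactly 2.5 kg, which is what makes the approach tamper-proof.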
Designing solutions requires a different approach to avoid exclusions. ‘Product innovation is about working with the users and identifying the market gaps,’ the developer mentioned. Designing an AI solution demands looking into other parts of the societal chain: the anthropometry solution, for example, ran into a very different set of errors that had nothing to do with data gaps or biases. For one, when the health system tried out the solution, it could not account for the lighting conditions of rural Indian homes, which are not ideal for mobile cameras measuring such fine detail. Data collected under ideal conditions to build the software, say from hospitals, would have had much better lighting. Caste adds a further layer of exclusion: certain parts of a village are caste ghettos, and if the health worker does not visit them, none of their data enters the otherwise ideal AI datasets at all.
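One way to picture the lighting problem is as a mismatch between the data the model was built on and the data it meets in the field. A minimal sketch with hypothetical pixel statistics, not measurements from the project:

```python
import statistics

# Hypothetical mean image brightness (0-255 scale) for training images
# captured under hospital lighting vs. videos shot in rural homes.
train_brightness = [182, 175, 190, 168, 185]
field_brightness = [62, 55, 71, 48, 66]

print("train mean:", statistics.mean(train_brightness))
print("field mean:", statistics.mean(field_brightness))

# A gap this large signals covariate shift: the model is shown inputs
# unlike anything it was built on, so its accuracy quietly degrades.
```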
This was also one of the issues with software developed to improve TB detection. While the solution detected TB in samples with high precision, the problem with India’s TB infrastructure does not actually lie in detection. Connecting this to the narratives from representatives of the Hausla homeless shelter on India’s TB crisis, one can see that India’s problem with TB is more social: what needs attention is the lack of policies and welfare benefits that directly help patients continue their course of treatment and access nutritious food, far more than the stage of testing and detecting the disease.
As these examples show, human workflows have to be appropriately modified so that the AI system forms part of a larger solution, and this requires a multidisciplinary approach. The institute Rahul worked for had to work with “agricultural experts, with people who have social sector background in deploying programmes, doctors, product designers, engineers” to bridge these workflows. AI can reflect its creators’ intent, as “technology is fundamentally an amplifier of human intent.” This points to the problem of weak institutions: there cannot just be ‘unintentional bad’ but also ‘intentional bad,’ and developers have to be conscious of this. “There are significant power disparities and these power disparities also apply across communities, across religion, across castes, across social-economic strata, gender, age, education levels,” and the solutions cannot be only for the literate or digitally literate. While designing any AI-based solution, the developer community is, and should be, engaged in this understanding. There cannot be business-to-consumer approaches without human intermediaries, which in these cases are the agricultural extension workers or the ASHA workers.
For developers, there is a need to be sensitive ‘about preventing unintentional bad,’ and a need to involve the community in AI-based data processes. India faces an acute shortage of doctors and agricultural scientists relative to its population. In addition to minimising damage from AI-based data distortions and ‘blind automation,’ there is a real need to understand how AI can be used for good. By combating statistical injustices, challenging presumed structures of authority over knowledge, focusing on transformative potential, and looking at the subjects of AI tools in a relational sense, the pillars of equity, participation and knowledge can best be connected to the conversations we had.