The text below illuminates several issues of interest, including algorithmic bias, positionality, exclusion, and resistance to algorithmic interventions. It comes from p. 62 of Amrute, S. (2019). Of Techno-Ethics and Techno-Affects. Feminist Review, 123(1), 56–73. https://doi.org/10.1177/0141778919879744

In 2015, anthropologist Kathryn Zyskowski (2018) shadowed working-class Hyderabadi women from Muslim backgrounds as they sat through computer-training programmes to advance their careers…Many women regarded becoming computer literate as an aspiration towards entering an Indian middle class, and therefore pursued not only career skills but also the technological and social trappings of a middle-class lifestyle. However, the very systems to which they aspired often applied sociotechnical filters to keep them out.

In a particularly telling example from Zyskowski’s (ibid.) research, a young woman named Munawar who enlisted the researcher’s help to set up a Gmail account was rebuffed at several points. First, Munawar’s chosen email address, which contained the auspicious number 786 (standing for Bismillah-hir-Rahman-nir-Raheem, in the name of God the most gracious the most merciful), was rejected because too many existing addresses already used that number. Then, over the course of sending several test emails to Zyskowski, the email address was deactivated: Google’s spam filters deemed the address a likely fake and automatically disabled it. Finally, after Zyskowski sent several emails to the account, taking care to write several lines and to use recognisable American English-language spacing, punctuation, forms of address and grammar, the address was reinstated. Zyskowski (ibid.) hypothesises that her interlocutor’s imperfect English grammar, location, name and lack of capitalisation caused the spam filter to block the account. The spam filter, as a kind of ‘sieve’, separated out ‘desired from undesired materials’ (Kockelman, 2013, p. 24). It did this work recursively, making correlations between a set of traits and fraudulent behaviour. As it did, the filter developed a ‘profile’ of a fraudulent account that also marked a population. For Munawar, the population was hers—Muslim, Indian, non-native English speaker.
Once identified, the Google algorithm automatically suspended her account. Zyskowski—with her proper grammar, her United States location and her Westernised email address—was able to retrain the algorithm to recognise the new address as legitimate. This example shows one of the fundamental forms of corporeal attunement, namely the way bodies are trained to fit the profile of successful digital subjects. Those bodies that cannot form themselves correctly may not even know they have been excluded from these forms and react with perplexity to such exclusions (Ramamurthy, 2003). Notably, Munawar’s other bodily comportment towards an everyday spirituality, as embodied in the number 786, had to be erased in order for her to be recognised as a member of a technological contemporary. Those without the correct comportment, which Munawar would not have achieved without the intervention of the US-trained anthropologist, become risky subjects to be surveilled at the peripheries of sociotechnical systems.
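Neither Amrute nor Zyskowski describes Google's actual classifier, but the "sieve" logic they invoke, correlating surface traits with past spam, and being retrainable by legitimate traffic, can be made concrete. The following minimal sketch uses a toy naive Bayes filter; every feature name and data point is invented for illustration and stands in for whatever signals a real filter might use:

```python
from collections import Counter

# Hypothetical surface traits a spam sieve might correlate with fraud.
# These are illustrative assumptions, not Google's actual features.
FEATURES = ["nonstandard_grammar", "no_capitalisation", "numeric_handle"]


class NaiveBayesSieve:
    """A toy naive Bayes filter: it builds a 'profile' by correlating
    surface traits with the spam/ham labels it has seen so far."""

    def __init__(self):
        self.label_counts = Counter()
        self.feature_counts = {"spam": Counter(), "ham": Counter()}

    def train(self, examples):
        # examples: list of (set_of_traits, label) pairs
        for traits, label in examples:
            self.label_counts[label] += 1
            for t in traits:
                self.feature_counts[label][t] += 1

    def spam_probability(self, traits):
        # P(label) * product of P(trait | label), Laplace-smoothed,
        # treating each trait as a binary feature.
        total = sum(self.label_counts.values())
        scores = {}
        for label in ("spam", "ham"):
            n = self.label_counts[label]
            p = (n + 1) / (total + 2)
            for t in FEATURES:
                p_t = (self.feature_counts[label][t] + 1) / (n + 2)
                p *= p_t if t in traits else (1 - p_t)
            scores[label] = p
        return scores["spam"] / (scores["spam"] + scores["ham"])


# Past traffic the filter was trained on: in this invented history,
# fraud happened to share all three traits, legitimate mail lacked them.
spam_seen = [({"nonstandard_grammar", "no_capitalisation",
               "numeric_handle"}, "spam")] * 3
ham_seen = [(set(), "ham")] * 3

sieve = NaiveBayesSieve()
sieve.train(spam_seen + ham_seen)

# An account like Munawar's carries all three traits.
munawar = {"nonstandard_grammar", "no_capitalisation", "numeric_handle"}
before = sieve.spam_probability(munawar)  # well above 0.5: flagged

# 'Retraining': legitimate correspondence from accounts bearing the
# same traits teaches the sieve that the profile is not disqualifying.
sieve.train([(munawar, "ham")] * 8)
after = sieve.spam_probability(munawar)  # drops below 0.5: reinstated
```

The point of the sketch is that the classifier never sees religion or nationality directly; it marks a population through proxies, and only counter-examples supplied from inside the system (here, the retraining step, analogous to Zyskowski's standard-English emails) shift the profile.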