Context Erasure of AI
Context has many definitions; one of them is the parts of a discourse that surround a word or passage and can throw light on its meaning. Throwing light on the meaning of AI is not a straightforward or easy activity, even when it comes to the direct or immediate effects of a given implementation. And if we consider everything beyond the immediate deployment, the backend, the non-places, the immaterial, things start to get muddled.
One definition of erasure in the Cambridge English Dictionary is: “the act of removing or destroying something, especially something that shows that a person or thing ever existed or happened.” In many cases those who produce goods or services are not seen, not heard, not recognised. This is not wholly unique to the field or industry of artificial intelligence, yet it remains an important point.
One research institute with a focus on this aspect is the AI Now Institute.
“The AI Now Institute at New York University is an interdisciplinary research institute dedicated to understanding the social implications of AI technologies. It is the first university research center focused specifically on AI’s social significance. Founded by Kate Crawford and Meredith Whittaker in 2017, AI Now is one of the few women-led AI institutes in the world.”
In December 2019 the AI Now Institute published a report covering a wide range of topics; they sum up their focus in the introduction:
“The Institute’s current research agenda focuses on four core areas: bias and inclusion, rights and liberties, labor and automation, and safety and critical infrastructure. […] AI Now’s 2019 report spotlights these growing movements, examining the coalitions involved and the research, arguments, and tactics used. We also examine the specific harms these coalitions are resisting, from AI-enabled management of workers, to algorithmic determinations of benefits and social services, to surveillance and tracking of immigrants and underrepresented communities. What becomes clear is that across diverse domains and contexts, AI is widening inequality, placing information and control in the hands of those who already have power and further disempowering those who don’t. The way in which AI is increasing existing power asymmetries forms the core of our analysis, and from this perspective we examine what researchers, advocates, and policymakers can do to meaningfully address this imbalance.”
Within the section on emerging and urgent concerns in 2019 there is a subsection called “From ‘Data Colonialism’ to Colonial Data,” which discusses context erasure.
“‘Data colonialism’ and ‘digital colonialism’ have become popular metaphors for academics, policymakers, and advocacy organizations looking to critique harmful AI practices. In these accounts, colonialism is generally used to explain the extractive and exploitative nature of the relationship between technology companies and people, deployed toward varying political ends.”
In particular, the question of “digital sovereignty” is important in relation to the erasure of context. In the European context it is described as a:
“digital sovereignty” that encourages decentralized and community-owned data-governance mechanisms.
In India, the report describes an opposition between domestic industrialists and policymakers on one side and the large Silicon Valley tech giants, described as “data colonizers,” on the other.
The domestic argument is that national companies, rather than foreign ones, should get first priority in accessing Indians’ data.
Within this section the argument is that these metaphors overlook specific “…historical, cultural, and political contexts and obscures the fact that present-day…”
Histories of colonisation are important to the question of AI labor and economic structures.
“Growing research on the locally specific real-world impact of the AI industry on countries in the global South makes visible these contexts and the lived human conditions behind the technology and data.”
It is argued that abstracting “colonialism” allows the term to be co-opted by narrow economic interests: actors who adopt the rhetoric of decolonial struggles while replicating the same extractivist logics as their Silicon Valley counterparts.
In describing this erasure of context, the text explores how quantitative information is used as colonial data, and how Indigenous people have been combating what the report terms data abstraction.
“Indigenous communities have been at the forefront of resisting harms caused by data abstraction.”
Census information and population counts have been used against local populations: they “…function as a core feature of settler-colonial governance, feeding massive amounts of abstracted data into digital systems.”
The report links “Indigenous statistics” in census administration to underrepresentation and a lack of resources.
The term “Indigenous data sovereignty” (ID-Sov) is generally defined as “the right of a nation to govern the collection, ownership, and application of its own data.”
The term data sovereignty is invoked by several different sides of the argument.
The text describes how several Indigenous networks (in the US, Māori, and Aboriginal) have been advocating together to share: “…stories about data initiatives, successes, and challenges, and resources.”
The report mentions the book Indigenous Data Sovereignty: Toward an Agenda, written in response to oversights in the United Nations Declaration on the Rights of Indigenous Peoples (UNDRIP). It addresses:
“the twin problems of a lack of reliable data and information on indigenous peoples and biopiracy and misuse of their traditional knowledge and cultural heritage.”
One important aspect is that advocacy groups are establishing sovereignty and ownership protocols at the level of data and analysis.
“For example, the Local Contexts initiative aims to support Native, First Nations, Aboriginal, Inuit, Metis, and Indigenous communities in the management of their intellectual property and cultural heritage in the growing digital environment.”
This includes establishing frameworks for labeling data.
There are also moves toward reconfiguring entire information systems, such as that of the US Library of Congress.
Beyond these concerns it may be important to think harder about technology companies’ responsibility for the local areas in which they operate.
It must be said, as the report mentions, that large technology companies have been pushing towards renewable energy, yet many social concerns are ignored or forgotten along the way.
Perhaps by thinking hard about the hardware and the physical aspects of the distributed technologies we implement, we may start to make the context of technology more visible: its convergence with the global and its disconnection from the local. Each technology, despite its global claims, is situated in a series of intertwined, complex relationships at each location in which it operates or is involved.
Crawford, Kate, Roel Dobbe, Theodora Dryer, Genevieve Fried, Ben Green, Elizabeth Kaziunas, Amba Kak, Varoon Mathur, Erin McElroy, Andrea Nill Sánchez, Deborah Raji, Joy Lisi Rankin, Rashida Richardson, Jason Schultz, Sarah Myers West, and Meredith Whittaker. AI Now 2019 Report. New York: AI Now Institute, 2019, https://ainowinstitute.org/AI_Now_2019_Report.html.
This is #500daysofAI and you are reading article 309. I am writing one new article about or related to artificial intelligence every day for 500 days. My focus for day 300–400 is about AI, hardware and the climate crisis.