How do anthropologists approach questions of objectivity, in theory as well as in practice?
The question that first drove me to study artificial intelligence (AI) was why everyone was so obsessed with it. There seemed to be no end to the number of newspaper stories about the topic. However, with my background of having worked for several years in a nonprofit oriented towards sustainability, a looming question quickly became: do developers think about the climate crisis? Caught between the question of ethics; the construction of ‘clean’ data; how objective or ‘biased’ an algorithm is; and the rapid growth of the field in terms of both public and private investment — you find the ethnographer. Yet I am in no way a protagonist, only a concerned and engaged citizen with a small voice and small data (as opposed to big data). At the same time I have started studying computer science at university level, so I am increasingly interested in participating in the process of developing or programming code.
Get With the Programme — Learn Programming
To get with the programme means to accept new ideas and pay more attention to what is happening now. In one sense it carries connotations of speed, of being left behind. Another definition is the objective of a shared enterprise, especially after the objective or the environment has changed. If objectivity has a shifting object with a permeating subject, it is conceivable that a human is the same — while the object is dynamic, technology if you will. Although this is false, it is tempting to think that humans are static and that the new evolution will happen as we integrate with devices. The image of the human transformed through development — transhumanist tendencies, as it were — is clear in the variety of images reaching for the human machine, at least at the conferences on AI where I have been present.
“There is no need for conflict between science and art, between fact and story.” (Madden, 2010, p.24)
There is a strange, eerie nowness that arrives, like a flying stork carrying a baby, a mythical journey of sorts. “You need to learn programming, learn how to code! Become a programming hero!” Advertisements gleam in the corners of my platforms as I scroll along, with clear signals: on Facebook, Instagram, Twitter, Snapchat and Medium. Mediated as I am in all these ways, how could I not heed the call? I am a weak human, or maybe just a working-class white kid, ‘he/him’ from a white neighbourhood, looking to stay relevant in the job market.
“…methodological reflexivity. There is a need to account for the inevitability of the ethnographer’s influence on the research process and to manage the tension between objectivity and subjectivity in order to produce better portraits of the human condition.” (Madden, 2010, p.2).
‘Platform’ in this context often refers to a website, while ‘templates’ refers to pre-made sites with building blocks already coded, which only needed to be ‘designed’, or placed in the right order and filled with ‘content’ (text, images etc.). There was never an expressed need to ‘hard-code’, as most of the time we needed every website to be easy to use for those who bought each solution. Each website additionally had to be delivered fast and be inexpensive.
After discovering anthropology as a field I decided to sell my shares and enter university. When entering, I took the first ‘purely’ programming course at the undergraduate level at the Social Sciences Faculty. It was called Machine Learning and Programming for the Social Sciences, and I programmed in a language simply called ‘R’.
I began realising that developers, or people who code, were an understudied area in anthropology as well as an interest of mine, so I wanted to pursue this direction further. Before this, in late winter 2018, I had sent out emails to twenty technology companies. They all declined my request to do ethnography studying artificial intelligence, and I should have realised it would not be that easy. I realised I would have to make more of an effort to ‘enter the field’ as an ethnographer. When I entered the Computer Science faculty my focus in terms of language changed to ‘Python’, doing object-oriented programming.
There is an important relationship between method and theory in anthropology as we often define the field of anthropology through the various work that is undertaken in the ‘field’. Yes, the ‘field’ as a site is changing (Gupta & Ferguson, 1997), yet so is the way that anthropologists can participate.
Objections to Object-Orientation and Objectivity
“Getting into a computer science subject felt a bit like buying concert tickets on very high demand. When I went to the information help desks this was iterated by students where I additionally found out that the University of Oslo has a system for the rest of the university called Canvas, however the faculty of informatics have individual choices for every module chosen by the specific lecturers responsible for each subject — very different! This system was of course as the students at the help desk assured me far superior to the system at the rest of the university.” [sic]
- notes from personal diary on the 22nd of August
In a performative sense I have no ill will towards object-oriented programming. It is perhaps somewhat amusing that I am now writing about objectivity, because object-oriented programming is to a large degree about defining objects, and about perception, in order to make decisions. Amusing intellectually because data makers view this creation and organising of data differently from what I experienced at the social science faculty. Objectivism and subjectivism are partners (Madden, 2010, p. 26), but not necessarily in programming practice.
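Since object-oriented programming keeps coming up, a minimal sketch may help readers from outside computing. The class below is a hypothetical example of my own, not taken from any course material: in Python, defining an ‘object’ means deciding in advance which attributes of the world count as data, and even the rules an object applies are human judgements written down.

```python
# A minimal sketch of object-oriented programming in Python.
# Defining a class means deciding, in advance, which attributes
# of the world 'count' as data: a series of small subjective choices.

class Person:
    def __init__(self, name, age):
        self.name = name  # the programmer chose these two attributes;
        self.age = age    # everything else about a person is left out

    def is_adult(self):
        # even the threshold for 'adult' is a value judgement encoded as a rule
        return self.age >= 18

p = Person("Ada", 36)
print(p.is_adult())  # True
```

Every attribute and threshold here is a small decision made by a subject, which is precisely why the partnership between objectivism and subjectivism feels different in programming practice.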
I became interested in the connection between computing and the climate crisis, as the ecological impact of technology was increasingly becoming known to me. This happened gradually after following parts of the Anthrotox projects at the Department of Anthropology in Oslo, and after learning about the heavy environmental cost of cooling systems in data centres, the lack of recycling of electronics, and the funding of ethics by large technology companies.
I had worked to build a charity oriented towards sustainability for three years prior to beginning my studies, so I think I had this in the back of my mind when I entered computing. During summer 2019 I decided to start a project called 500 days of AI — writing one new article about artificial intelligence every day for 500 days. I think this came to mind after reading Samantha Breslin, who studied the making of computer scientists in Singapore. In her PhD thesis she wrote about how programmers were interested in ‘passion projects’: you had to show your dedication through personal projects.
Breslin, studying developers in an academic context, had needed permission from the university; in talks with my supervisor I realised this would be hard to achieve in a short time. Therefore I decided to consider the work of the anthropologist Nick Seaver, who studied developers and algorithms within music; his pragmatic ethnographic tactics helped shape my own approach to ethnography amongst developers. My aim is still to become proficient enough to have an embodied experience of participation as a programmer, getting to the level where I can work on a programming project, physically programming for several months. I have not yet read much about an ethnographer learning the ‘language’, so to speak, actually programming with the developers, participating in the development of code as a programmer. Therefore I had a growing desire to explore this area in anthropology.
Becoming a Learning Voice on AI
Around day 100 of my project writing about artificial intelligence every day for 500 days, I hit the top 50 writers on artificial intelligence on an international blogging site with 60 million users, whereafter I started to get invited to speak at events about artificial intelligence. First I was invited to talk at Microsoft’s head office in Norway during Oslo Innovation Week. Second, I talked at a conference on anthropology and technology in Bristol. The third conference was Norway’s largest science and tech festival, called Cutting Edge, where I participated in the last panel debate of the evening. I have since been booked at several events, and this has shaped my experience of the field of AI.
This series of events has been incredibly helpful in beginning my journey into the ethnographic field as a participant rather than an observer. Yes, of course I was participating in the events previously, hanging out with the developers and talking about technology. However, being an active voice in a community, starting initiatives and contributing has given me far more insight into this area than I had previously. It has also enabled me to enter events that I would not otherwise have been able to afford, as ticket prices are often high. In addition to participating at events I joined a series of Facebook groups focused on artificial intelligence. Another notable occasion was joining a programming event (hackathon) of more than 100 people with a team of developers from different backgrounds, and winning the competition.
I have tried not to point my finger at any informant in particular in order to keep my account anonymous; however, I have not been able to come close enough to bring in a textured perspective of this community. That is the failure of the ethnographer: I am still gaining access to the field, and although I have gained more entry into conferences, there is still a while to go until I get close access to the daily life of developers as a peer or participant. So far, educating myself, or getting with the programme, is a beginning.
“Everyone are sitting in front of their computers solving problems. We are currently trying to understand how to make a bingo-board. I read about bingo on a gambling site (innocent gaming), I guess you could make a lot of money by launching a bingo site back in the day (perhaps nowadays too). If I cannot pass the next Monday course and get the required points I will get kicked out of the programming module. I do not know yet what type of score that I will get.”
- notes from personal diary on the 6th of September
Selling Ethics in a Revolution of Everything
There was talk of revolution through technology in many forms. A clean energy revolution is one possibility; a revolution in the way we work is another. There are revolutions too numerous to mention, often accompanied by the word ‘exponential’ or ‘growth’. Philippe Bourgois, in his fieldwork in Central America, found anthropological ethics lacking: the engagement of an anthropologist had to continue beyond the ‘writing up’ of fieldwork. This can be challenging to engage with also when studying ‘elites’ (Marcus, 1983).
“The eminently political orientation of a supposed apolitical commitment to empirical research must be appreciated for its internal inconsistencies and ultimate ethical poverty.” (Bourgois, 1990)
One event I attended was a whole day of talks about ethics without a mention of the climate crisis, inequality, gender, or failed AI. Selling ethics functionality seemed a specific and clean practice in itself. If the climate crisis was mentioned, it was often baked into a sales pitch for a product or a service, although this may be expected from conferences, as the goal for some is to attract clients or acknowledgement from peers.
Ethics are the moral principles that govern a person’s behaviour or the conducting of an activity; it is, in addition, the branch of moral philosophy that systematises, defends, and recommends concepts of right and wrong conduct. AI involves learning rules for using information; reasoning from rules to reach approximate or definite conclusions; and self-correction. Possibly because of this, I have experienced that the topic of ethics comes up with high frequency in debates about AI.
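To make that definition concrete, here is a toy illustration in Python of ‘learning rules’ and ‘self-correction’: a simplified perceptron-style learner of my own construction, not a description of any real system I observed at the conferences.

```python
# Toy illustration of 'reasoning from rules' and 'self-correction':
# a one-weight model adjusts itself whenever its prediction is wrong.

def predict(weight, bias, x):
    # the learned rule: classify x as 1 if the weighted sum is positive
    return 1 if weight * x + bias > 0 else 0

def train(examples, epochs=20, lr=0.1):
    """examples: list of (x, label) pairs with label 0 or 1."""
    weight, bias = 0.0, 0.0
    for _ in range(epochs):
        for x, label in examples:
            error = label - predict(weight, bias, x)  # reasoning from the rule
            weight += lr * error * x                  # self-correction
            bias += lr * error
    return weight, bias

# learn the rule 'positive input means class 1'
w, b = train([(1, 1), (-1, 0)])
print(predict(w, b, 3), predict(w, b, -3))  # 1 0
```

Nothing magical happens here, and yet the vocabulary of ‘learning’ and ‘correcting’ already invites the anthropomorphic readings discussed later in this text.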
“The rule is perfect: in all matters of opinion our adversaries are insane.”
AI was viewed by some as magic, and by others as a tool — both are correct in conjunction. Indeed we could call it a magical tool, an unlikely oxymoron in the eye of any developer who believes strongly in rationality. Then again, you have to enter the ontological perspective or weltanschauung of your informants, in which magic can be used to explain a wide variety of outcomes (Evans-Pritchard, 1937), much like applied artificial intelligence in this context. Its announced science is not always scientific or stringently mono-disciplinary. These dominant ontological approaches can have legal and economic consequences (Povinelli, 1995).
Ethnographic method is challenging, and I was unable to get any real continuity except perhaps in conversations with a few AI researchers who took an interest in my thoughts. On the one hand, for most developers my enquiry seemed too strange or alien to indulge in, slowing them down. On the other hand, I was a stranger in a clique of elites: most programming jobs are well paid and the perceived value in society is high. The win-win-ism and philanthropy of the technology elite again gives us the perception that technology is good despite the adverse actions taken by these large companies (Giridharadas, 2019). The richest man in the world was Jeff Bezos (amazon.com), followed by Bill Gates (Microsoft), while Mark Zuckerberg (Facebook) was in eighth place and Larry Page (Google) in tenth (Forbes, 2019).
People who work in code are perceived to be smart, and I noticed how perceptions changed when I mentioned I was doing programming courses at university. “The ethnographer’s body, and the sensation it records, are part of the ethnographic script” (Madden, 2010, p. 19). Technical prowess and ingenuity were far more valued during the first formative semester than critical thinking; programming requires performative dedication towards this objective.
“This seems like the central course in the programming curriculum, and it is acknowledged by the students here. It is raining outside. The other course is almost not relevant, this is the central course. I could spend my entire week doing this, but I have filled my week with other subjects, other activities. You have to be dedicated to do programming and I am not dedicated enough.” — notes from personal diary on the 6th of September
As such, developers’ ethical judgement is seen as technically good, and information technology had a sheen of good, and I think it still does. Few would say their smartphone is unethical, their television is unethical, or their Internet is unethical.
Human or Non-human AI
“As anthropologists discover new subjects — either in established visual cultural forms or in evolving uses of the visual media — they may well redefine the terrain of anthropology.” (MacDougall, 1999)
What I often found strange was the religious mood that seemed pervasive at most ‘pure’ technology conferences, at least those where most speakers were from a computer science background. Pictures of humans touching or interacting with robots were rampant, as were images of some type of mechanical human being. Artificial intelligence was often represented by a human-like figure that happily interacted or was a threatening presence, depending on whichever message the speaker wanted to get across. Anthropomorphism, the attribution of human characteristics or behaviour to a god, animal, or object, seems tempting in this case. This has been popularised in films such as The Terminator.
“Humanity might once have been defined as beings that could not fly, but then came the aeroplane. Instead of using terms such as post-human or trans-human, we might want to define humanity as including a latency that is achieved by each new technology. The concluding point is that digital anthropology, which can include the study of both use and consequence, is thereby as much a study of what people are becoming as what technologies are becoming.” (Miller, 2018)
As such the changing human is a tantalising image, yet it could be argued that by taking this viewpoint we fall into an overly anthropocentric modus operandi. Still, by looking at the notion of human intentions we may see interesting perspectives: “Critical description of living things maps those designs, intentional or unintentional, that gesture towards the future, making worlds for the yet-to-come as well as for the present.” (Tsing, 2013).
I would much prefer the perspective on nature as more-than-human sociality, as described by Anna Tsing, to this perspective on technology-human, but then again, who says they cannot be reconciled? The way we are destroying nature and most of the species on the planet is at least a fair argument to the contrary. We could add to this the worrying trend of viewing AI as objective and able to predict social outcomes in a way superior to humans, although human programmers code this enunciated objectivity.
“Theory should not be treated as a rule to which we find people to tightly conform, it is a guide to help us understand why humans do and think the things they do. Theory is our tool to master; it should not master us.” (Madden, 2010, p.18)
There are a variety of ethnographic approaches to digital media. It is possible to study the cultural politics of digital media, the vernacular cultures of digital media and the prosaics of digital media (Coleman, 2010). Within this area, the focus on power or discrimination has been part of an anthropological engagement with technology. Looking at the multiplicity of artificial intelligence could perhaps be done through what has been described as polymedia, by examining convergence and fragmentation. Polymedia refers to the fact that mediated communication does not take place over a single technology (Madianou & Miller, 2012). The relational autonomy of our new selves requires new privacy practices (Ess, 2015), not that we ever were so individualistic everywhere as the eurocentric notion of personhood seems to prescribe (Comaroff & Comaroff, 2001). The relational autonomy written into code by certain programmers must be questioned.
Big Data Small Stories — The Coded Gaze
“The point I want to make about the ethnographic gaze is that all ethnographers develop one; our vision is inevitably shaped by our theoretical climate, the people and questions that interest us, and our own experiences, predispositions and foibles.” (Madden, 2010, p. 100)
Much as ethnographers have a gaze, we could argue with confidence that so do programmers. One of the most striking stories I encountered online was that of Joy Buolamwini. As an MIT master’s candidate in Media Arts and Science, she was sitting in front of face recognition software that did not recognise her face. As a woman of colour, it was as if, according to the technology companies’ seemingly ‘neutral machine’, she did not exist. The algorithms simply failed to detect her face. She was on the wrong side of computational decisions that could lead to exclusionary and discriminatory practices and behaviours.
She called the phenomenon the ‘coded gaze’, and it motivated her to launch the Algorithmic Justice League (AJL) to highlight such bias through provocative media and interactive exhibitions. In addition, the AJL contributes to writing ‘shadow reports’, speaking engagements and more about coded discrimination. For this they develop practices for the “…accountability, design, development, and deployment phases of coded systems.” (MIT Media Lab, 2017).
From numbers on a screen to the person on the bus: storied realities lie beyond the data sets, and I think one of the most important takeaways is looking to the data makers in a broad sense and finding the unexpected stories. If you analyse multi-stakeholder mobility patterns you may not see the Rosa Parks. The small stories can change the system in the most constructive way.
“… one representation of the ‘other’ is just that; only one way of seeing things; this attitude comes from the idea that truth is partial, not absolute.” (Madden, 2010, p. 22; Marcus, 1988, p. 197).
Engagement with partial truths, and in particular with where these are situated, has been a longstanding concern in anthropology, both in positionality and in planned actions (Haraway, 1988). There is a strong argument to be made that algorithms are culture, shaped by people and thus often dynamically changing (Seaver, 2017). In researching this, the ethnographer is indeed only covering partial truths; however, this could be a strength rather than a weakness (Clifford & Marcus, 1986). As ethnographers, we do not have to settle for only one or the other and can draw on both, especially if we attempt collaborations with other fields. Anthropologists can, for example, collaborate with computer scientists to combine ethnographic fieldwork with an understanding of networks. One example is the study of ‘ghost work’, the exploitation of workers surrounding Silicon Valley’s development of AI (Gray & Suri, 2019). I have myself seen practices of indentured labour in the field of artificial intelligence that I would like to explore further at some point in the future.
Rune Flikke studied the material history of racist advertising and how soap as a technology became so popular in Africa through the promotion of large soap manufacturers meeting local religious notions (Flikke, 2005). We could certainly attempt to approach questions of materiality in these methodological enquiries, since artificial intelligence can in all likelihood be studied as a political and religious process, or as faith. AI is not only immaterial: it is expressed in a variety of ways in society while being run on physical servers. Still, the ethnographer has to be careful about navigating the ideas of the intersubjective future (Jackson, 1998), the ethnographic present (Hastrup, 1995) and the memorialised past (Harvey, 2005).
My personal failure as an ethnographer is likely more than twofold, yet two specific topics come to mind: failing at thick description in observation (Geertz, 2008), and losing sight of important anthropological studies in other areas due to my interest in digital anthropology. I think I was too influenced by my informants into seeing the broader trends, the hype or the big data instead of the small data. Ethnographic descriptions of AI and the developers and their actions or plans are hard to come by in my work because I failed to diligently note down the stories that I saw and participated in, likely due to my inexperience as an ethnographer. My supervisor told me not to be too hard on myself, yet I want to be appropriately critical so that I can improve going forward.
If objectivity dominates, then subjectivity subjects. User/used, and the objectivity that ascribes an intended meaning: neither is correct. Goodhart’s Law, named after the economist Charles Goodhart, was phrased by the anthropologist Marilyn Strathern: “When a measure becomes a target, it ceases to be a good measure.” It means that the many small subjective decisions behind an ‘objective’ truth matter. Objective truth: conditions are met without bias caused by a sentient subject. Yet how can ‘sentient beings’ not have been involved in the decision-making? This, as you may now understand, is both impossible and possible in practice, depending on the computational ontology. I would say that algorithms operate in the space between subjectivities and objects. I could say they fall within an inter-subjective objectivity. However, that may be more confusing than simply saying it is still subjective, or to repeat and rephrase: we embed our values — or developers, teams and owners embed their values — into an object that makes decisions. That does not make the object into a subject, but the decisions can be subjective, as enacted by the object on behalf of the subjects inscribing the actions.
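This point about values inscribed into objects can be illustrated with a deliberately contrived sketch. All the names, weights and thresholds below are hypothetical, invented for illustration only: the code reads like a neutral calculation, yet every number in it is a decision made by a subject.

```python
# Hypothetical example: a 'neutral' scoring rule whose every number
# is a human decision. The chosen features, the weights, and the
# cut-off are all values inscribed by the developers.

def credit_score(income, years_employed):
    # who decided that income counts for three and employment for two?
    # a subject did, long before the object ever 'decided' anything.
    return 3 * income / 10_000 + 2 * years_employed

def approve(applicant):
    THRESHOLD = 15  # an arbitrary, consequential, human-chosen line
    return credit_score(applicant["income"], applicant["years_employed"]) >= THRESHOLD

print(approve({"income": 40_000, "years_employed": 2}))  # True: 12 + 4 = 16
```

Change the threshold or a weight and different people are approved; the object enacts the decision, but the subjects inscribed it.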
- Bourgois, P. (1990). Confronting anthropological ethics: Ethnographic lessons from Central America. Journal of Peace Research, 27(1): 43–54.
- Clifford, J., & Marcus, G. E. (Eds.). (1986). Writing Culture: The Poetics and Politics of Ethnography. University of California Press.
- Coleman, E. Gabriella. (2010). Ethnographic approaches to digital media. Annual Review of Anthropology, 39: 487–505.
- Comaroff, J. and Comaroff, J. (2001). On personhood: An anthropological perspective from Africa. Social Identities, 7(2): 267–283.
- Ess, Charles. (2015). ”New selves, new research ethics?”, in Internet Research Ethics, edited by H. Fossheim and H. Ingierd. Oslo: Cappelen Damm Akademisk, pp. 48–76.
- Evans-Pritchard, E. E. (1937). Witchcraft, Oracles and Magic among the Azande (Vol. 12). London: Oxford.
- Flikke, Rune. (2005). Såpe som politisk praksis og religiøs prosess i Sør-Afrika. Norsk Antropologisk Tidsskrift, 16(2): 142–152.
- Forbes. (2019, March). Billionaires 2019. Retrieved December 11, 2019, from https://www.forbes.com/billionaires/#208798bb251c.
- Geertz, C. (2008). Thick description: Toward an interpretive theory of culture. In The Cultural Geography Reader (pp. 41–51). Routledge.
- Giridharadas, A. (2019). Winners Take All: The Elite Charade of Changing the World. Vintage.
- Gray, M. L., & Suri, S. (2019). Ghost Work: How to Stop Silicon Valley from Building a New Global Underclass. Houghton Mifflin Harcourt.
- Gupta, Akhil, and James Ferguson. (1997). ”Discipline and practice: ’The field’ as site, method, and location in anthropology”, in Anthropological Locations: Boundaries and Grounds of a Field Science, edited by A. Gupta and J. Ferguson. Berkeley: University of California Press.
- Haraway, Donna J. (1988). Situated knowledges: The science question in feminism and the privilege of partial perspective. Feminist Studies, 14(3): 575–599.
- Harvey, Penelope. (2005). ”Memorialising the future. The museum of science and industry in Manchester”, in Science, Magic and Religion: The Ritual Process of Museum Magic, edited by M. Bouquet and N. Porto. Pp. 29–50.
- Hastrup, Kirsten. (1995). The ethnographic present: On starting in time. In A Passage to Anthropology: Between Experience and Theory. London: Routledge.
- Jackson, Michael. (1998). ”Digressions”, in Minima Ethnographica: Intersubjectivity and the Anthropological Project. Chicago: The University of Chicago Press. Pp. 88–124.
- MacDougall, David. (1999). The visual in anthropology. In Rethinking Visual Anthropology, edited by M. Banks and H. Murphy. New Haven: Yale University Press.
- Madden, R. (2010). Being Ethnographic: A Guide to the Theory and Practice of Ethnography. London: Sage.
- Madianou, Mirca and Daniel Miller. (2012). Polymedia: Towards a new theory of digital media in interpersonal communication. International Journal of Cultural Studies, 16(2): 169–187.
- Marcus, G. E. (1983). Elites: Ethnographic Issues. University of New Mexico Press.
- Miller, Daniel. (2018). Digital anthropology. In The Cambridge Encyclopedia of Anthropology, edited by F. Stein, S. Lazar, M. Candea, H. Diemberger, J. Robbins, A. Sanchez & R. Stasch. Pp. 1–16.
- MIT Media Lab. (2017, January 17). Joy Buolamwini wins national contest for her work fighting bias in machine learning. Retrieved December 12, 2019, from http://news.mit.edu/2017/joy-buolamwini-wins-hidden-figures-contest-for-fighting-machine-learning-bias-0117.
- Povinelli, E. (1995). Do rocks listen? The cultural politics of apprehending Aboriginal Australian labour. American Anthropologist, 97(3): 505–518.
- Seaver, Nick. (2017). Algorithms as culture: Some tactics for the ethnography of algorithmic systems. Big Data & Society, 4(2): 1–12.
- Tsing, Anna. (2013). ”More-than-human sociality: A call for critical description”, in Anthropology and Nature, edited by K. Hastrup. New York and London: Routledge. Pp. 27–43.
This is #500daysofAI and you are reading article 358. I am writing one new article about or related to artificial intelligence every day for 500 days. My focus for day 300–400 is about AI, hardware and the climate crisis, but this text is more about methods.