What can anthropology tell us about data in the 21st century?

This piece was published in January 2018 on the Observatory for a Connected Society app:

https://connectedobservatory.org/

What can anthropology tell us about data in the 21st century?

17 January 2018
Data is no longer just the outcome of scientific research or the administrative functions of government but is now created as a by-product of every person’s interactions with the internet, infrastructures, institutions, news media, supermarkets, banks, the built environment and so on. Building on the work of anthropologists who have been trying to make sense of data and its social implications, Dr Hannah Knox makes the case for the crucial role that anthropology can play in wading through this data-saturated landscape.
It is a truth commonly expressed that we live in a world saturated by digital data. Data is no longer just the outcome of scientific research or the administrative functions of government but is now created as a by-product of every person’s interactions with the internet, transport infrastructures, institutions, news media, supermarkets, banks and the built environment. Confronted with this ever-increasing mass of digital data, there is both excitement and consternation about how this data should be analysed and what the implications of its use will be for the future of work, knowledge and social relations.

My academic focus is on data and its social implications, and I see a crucial role for anthropologists in helping others make sense of this data-saturated landscape. Anthropologists are uniquely equipped to explore the promises and expectations of data and to understand their effects. Far too often, commentaries on the social promise or cultural dangers of data are dominated by technicist accounts that fail to appreciate the way in which digital data, even in its most posthuman manifestations (e.g. general artificial intelligence or advanced robotics), remains a deeply human endeavour.

When we approach data from the perspective of technical systems, we are confronted with what looks like enormous complexity: algorithms working invisibly in the background, using things like Bayesian techniques for determining probabilistic relations, Gaussian prediction and GitHub repositories of code running to hundreds of pages, to produce links and insights that encourage us to buy, click and skim (1). But what if we were to try to understand data from the position of the people who work with it and manage it? What other kinds of understandings of this data landscape would this elicit?

This is what anthropologists are beginning to do. Anthropologists are sometimes criticised for pointing out that things are simply ‘more complex’ than they seem at first sight. But in the case of digital data, I would argue that anthropology offers a way of re-describing data, through an attention to human practices and ideas, so as to make it less complex and more understandable for those who are not steeped in the technical languages of coding, mathematics or computer engineering.

Anthropologists are experts in translation. The classic image of the anthropologist is that of the intrepid explorer visiting far-off cultures to bring back tales of the exotic rendered comprehensible through social analysis. In fact, today you are as likely to find an anthropologist in a science laboratory, a government office, a protest march or a community allotment as you are to find them hanging out in a Papua New Guinean village. But whether doing research in far-off places or in social situations that seem closer to home, anthropologists are always cultural translators, turning the seemingly incomprehensible dimensions of the worlds they study into terms that other anthropologists, and hopefully others who are not anthropologists, can begin to understand.

Being an anthropologist invariably involves learning another language. To do our research we must become competent members of a community, learning the terminologies, rituals and practices of the group of people we are studying. This process of gaining intimacy with a community, learning social and linguistic cues and becoming versed in techniques that are often taken for granted can be an awkward experience, full of surprises and mistakes. Rather than papering over these failures and mistakes, or erasing the aspects of people’s activities that don’t fit preconceived ideas, anthropologists use these experiences as a way of interrogating the difference between their own assumptions about the world and the assumptions of those with whom they are doing research.

This allows anthropologists to unravel and unpack what is often taken for granted. When people say digital technologies will lead to the end of work, the perplexed anthropologist, who might understand work as a social contract, will wonder how such a social contract could be imagined as disappearing. This might entail asking whether work, for the person making such a statement, is indeed a social contract, and it is that which is disappearing, or whether it is something else that is seen to be disappearing, in which case the question becomes: what? With these kinds of questions, we begin to unravel what’s taken for granted in the everyday, allowing us to better understand just what it is that people fear or desire about digital technologies (skills, identity, continuity, community, safety, security?) and where those worries and hopes come from (a sense of self, an ethical stance, a moral interpretation?).

The proliferation of digital data, and the challenges it poses, offers a fertile terrain for this kind of anthropological work. The current enthusiasm for blockchain, machine learning and predictive analytics raises questions about precisely what it is that is driving this interest and what the effects of this enthusiasm are. In relation to blockchain, the media studies scholar Lana Swartz (2) has shown, using precisely the analytic approach described above, that interest in blockchain is sustained not only by the technical capacities of the distributed ledger but also by what she calls an ‘inventory of desire’. Focusing on what those working with blockchain actually say, rather than on an idealised version of what blockchain is supposed to do, Swartz shows the importance of the liberal values of freedom, decentralisation and privacy that underpin enthusiasm for blockchain, fuelling investment and development in the technology.

Similar analyses have been done on algorithms and the imaginaries that sustain them. Susanne Thompson and colleagues have recently gone so far as to suggest that algorithms might be usefully understood as ‘fetish’ objects. As they make clear, within anthropology, fetishes are understood not as ‘indices of false thinking’ but rather as ‘material objects that stabilise ongoing social relations because people invest them with [an] effect [of simultaneous belief and disbelief]’. Algorithms, they show, gain part of their power from their ability both to confirm people’s understanding of how the world should be and to produce awe and wonder when they actually work.

This combination of belief in, and disbelief of, technology is perhaps most clearly evident in forms of data analysis that are oriented to the replication or improvement of human-like abilities. Machine learning, artificial intelligence and humanoid robots all entail a fetish-like form of engagement. Developers of intelligent machines draw explicitly on ideas about abilities that are derived from particular understandings of human beings, such as cognitive and rational capacities, haptic interaction, environmental awareness and logical deduction. When aspects of these qualities are replicated in computational machines, there is often a certain disbelief, awe and wonder expressed at the spectacle of a machine acting like a human.

The interest in finding ways of making machines act like humans has a long historical precedent, which helps remind us that not everything about advanced data technologies is necessarily new. Current dreams of automation can be traced back to the intricate automata made by Viennese clockmakers in the 17th century, through to Wolfgang von Kempelen’s chess-playing Mechanical Turk, which turned out to be what Simon Schaffer has insightfully described as a device in which a human pretended to be a machine that was pretending to be human (3). It is no coincidence that Amazon’s Mechanical Turk rests on a similar idea, whereby human beings stand in for machines that are themselves meant to replace humans.

For anthropologists, one of the most fascinating things about digital data is that the work to manage and manipulate it uncovers taken-for-granted ideas about human capacities. As well as being based on ideas about human capacities, computational machines are also shaping what we expect it means to be human. Industrial manufacturing led to people being reimagined as units of productive labour. Now algorithms are leading to a rethinking of identity as a composition of experiences and preferences, and the gig economy is making people rethink themselves not as units of labour but as marketable commodities in a shifting and unstable landscape of work. Whilst futurist predictions about digital data often worry about how computers are set to replace human beings, anthropological studies show how what makes us human is already being shaped by digital data and the machines that analyse it. The question is not whether robots or artificial intelligence will replace human beings but which kind of human they will replace, and with what implications.

In a report published by the Royal Society and the British Academy last year, a call was made to ensure that the governance of data puts human flourishing at its core. For anthropologists this is crucial, not just because it brings human beings into discussions about the benefits and dangers of digital technologies, but also because it allows us to talk about how digital technologies are framing what is valuable – about data, about machines, but also, crucially, about what it is to be human in the world today.

Footnotes:

  1. Adrian Mackenzie. 2017. “Infrastructures in Name Only: Identifying Effects of Depth and Scale.” In Infrastructures and Social Complexity, edited by Penny Harvey, Casper Bruun Jensen and Atsuro Morita. Routledge.
  2. Lana Swartz. 2017. “Blockchain Dreams: Imagining Techno-Economic Alternatives after Bitcoin.” In Another Economy is Possible, edited by Manuel Castells. Polity Press.
  3. Mechanical Marvels: Clockwork Dreams, BBC Four documentary; Simon Schaffer. 1999. “Enlightened Automata.” In The Sciences in Enlightened Europe, edited by William Clark, Jan Golinski and Simon Schaffer. University of Chicago Press.

 
