[Book recommendation: We Are Data by John Cheney-Lippold]
This book has two intersecting purposes. First is to understand how algorithms transcode concepts like gender, race, class, and even citizenship into quantitative, measurable-type forms. Second is to recognize how these measurable types reconfigure our conceptions of control and power in a digitally networked world. The political battles that surround structures of patriarchy, white supremacy, and capitalism writ large must necessarily attend to the terms of the algorithm.
In other words, while HP’s facial-recognition cameras might promote asymmetrical usages across racial lines, an algorithmic ‘race’ complicates this asymmetry. HP’s response is indicative of this complication, in that HSV contrast, not skin color, shared history, or even DNA, stands in for the concept of race. We’re talking about HP’s construction of a ‘white’ ‘face,’ not White Wanda’s white body. But we’re talking about whiteness nonetheless. Race is incessantly being made and remade, and we must focus on how digital technology also makes and remakes ‘race’ on algorithmic terms.
Explicating the terms that underpin this making/remaking is the ultimate goal of the book. In the following pages, we’ll talk about jazz, terrorists, HP being racist (again), marketing, the NSA, citizenship, and even Santa Claus. The shift to the data/algorithm ontology of the computer conceptually moves identity past explicit, policed boundaries that require negation and exclusivity (either male or female, at risk or not, black or white).
This move lays the foundations for a plane of smoothness, an open set of possibilities where we play on the limits of established truth. Algorithmic identity doesn’t declare that you are just ‘male’ or ‘female.’ Statistical confidence and probability, even the chance that this book will spontaneously combust, can never be 100 percent anything. Rather, you’re likely to be 92 percent confidently ‘male’ and 32 percent confidently ‘female.’ Algorithmic ‘race’ and ‘gender’ aren’t about being a white man. It’s about being a ‘Caucasian’ ‘man’ with a confidence measure of 87 percent. In algorithmic identity, we confirm the inorganic realities of Donna Haraway’s cyborg, one who is “not afraid of permanently partial identities and contradictory standpoints.”
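The point about non-exclusive confidence measures can be made concrete with a small sketch (not from the book; the categories, scores, and threshold below are hypothetical). Unlike an either/or classification, independent confidence scores need not sum to 1, so a single profile can be assigned several measurable types at once, or none:

```python
# Toy illustration: algorithmic identity as independent, non-exclusive
# confidence scores rather than binary labels. All labels, scores, and
# the threshold are invented for illustration.

def measurable_types(confidences, threshold=0.5):
    """Return every category whose confidence clears the threshold.

    Nothing forces the scores to sum to 1, and nothing forces the
    categories to exclude one another: a profile can be 'male' with
    0.92 confidence and 'female' with 0.32 confidence simultaneously.
    """
    return {label for label, score in confidences.items() if score >= threshold}

profile = {"male": 0.92, "female": 0.32, "Caucasian": 0.87}

print(measurable_types(profile))        # -> {'male', 'Caucasian'}
print(measurable_types(profile, 0.95))  # -> set(): raise the bar, the identity shifts
```

Note that changing the threshold changes who the algorithm says you are, which is precisely the modulating, soft-coded quality the book ascribes to measurable types.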
Chapter 1. Categorization: Making Data Useful
In order to compute something like ‘woman’ or ‘smiling,’ one needs to first make data useful. In chapter 1, I describe the how-to of algorithmic knowledge production. This how-to centers on how computers create categories through patterns in data, which then construct algorithmically transcoded ideas about the world that I call measurable types. Algorithms are neither magical nor mysterious. Instead, they make data useful through a very intricate but, I promise, also very interesting constellation of different technologies (like metadata or marimbas) that then create different algorithmic identifications (like ‘terrorist’ or ‘John Coltrane’).
Chapter 2. Control: Algorithm Is Gonna Get You
Measurable types are much more than descriptive containers for algorithmic meaning. They also play formative roles in how life and knowledge are controlled. With the aid of Gilles Deleuze’s concept of modulation, I theorize how the deluges of data we produce online help enact a form of control. This type of control replaces the strongly worded, hard-coded prohibitory “no!” of traditional modes of power with what some scholars have called “control without control”—and what I call soft biopolitics. These soft biopolitics describe how our algorithmic identities can regulate life without our direct participation or realization.
Chapter 3. Subjectivity: Who Do They Think You Are?
Soft-biopolitical measurable types structure our lives’ conditions of possibility every time we buy a plane ticket, cross a border, or translate a document on Google Translate. While we are ceaselessly made subject to different arrangements of algorithmic knowledges, these datafied subject relations are foreign to our most immediate experiences. We are not individuals online; we are dividuals. And without the philosophical anchor of the individual to think alongside, we are often at a loss as to how to interpret ourselves as users. This chapter explores how algorithms make us subject in ways unique to online, algorithmic life.
Chapter 4. Privacy: Wanted Dead or Alive
How does one practice privacy in a world where not only is almost everything surveilled but that surveillance is rarely, if ever, felt? I evaluate privacy’s legacy and outline its origins in the nineteenth-century phrase “right to be let alone,” in order to bring that history into conversation with the exigencies of our contemporary era. I argue that privacy cannot just be about whether you have a password on your email or whether there are doors on a bathroom stall. Privacy must be a practical response to the lived restriction and control implicit in ubiquitous surveillance. In this way, I theorize a dividual privacy that focuses especially on how the freedom in being “let alone” might translate to a datafied, algorithmic world.
Conclusion: Ghosts in the Machine
At the end of the book, I return to my central arguments: online we are made, read, interpreted, and intelligible according to data. Our world, and the knowledge that gives it its meaning, is increasingly a datafied world. We are subsequently understood in the datafied terms of dynamic, soft-coded, and modulating measurable types. The contemporary encounters we have with ubiquitous surveillance suggest a new relationship to power that I term soft biopolitics. And the resulting ubiquity and emergent configurations of these different types of knowledge force us to rethink how subjectivity functions and what it is that privacy can practically defend.
Source: John Cheney-Lippold. “We Are Data: Algorithms and the Making of Our Digital Selves.”