We Keep Our Data Safe
Resisting biometric surveillance requires more than opting out.
Having data about our bodies tracked is so much a part of our daily lives that we sometimes forget it can be used against us.
When the Supreme Court overturned Roe v. Wade in 2022, a deluge of articles warned people to delete their menstrual tracking apps. While these technologies can be useful for those seeking (or avoiding) pregnancy, such apps have come under fire in the past for selling intimate data to social media sites. With the changing legal landscape, privacy experts pointed out that this data could be used to criminalize people seeking abortions. Proponents of forced birth also recognize this potential and are working to ensure that menstrual data remains open to search warrants. Turning this data into a site of political contestation raises questions of collective response beyond the limits of bodily autonomy.
It’s become normalized for data about our bodies and actions to be collected, quantified, and sold on the market. Facial recognition and body scans have become necessary to travel. Employer health insurance can require employees to divulge health information, from blood pressure to sexual activity, in order to access care. Colleges and universities are subjecting students to e-proctoring and glorified spyware, while Amazon requires delivery drivers to sign “biometric consent” forms. Even attending a sports event or concert can subject us to facial scanning and data mining. Given this landscape, the notion of “opting out” is starting to feel impossible.
So many of these systems have become necessary, and it can be useful to have this type of information about ourselves. At some point, the trade-off of having our behaviors predicted by an algorithm can start to feel worth it. For someone struggling to get pregnant or desperate to avoid it, exchanging privacy for the information needed to manage their reproduction may make a lot of sense. During these sharp moments of political crisis, however, we start to feel how thin the membrane can be between the benefits of these technologies and their criminalizing potential.
New technologies aren’t simply neutral tools; they’re created by and for existing power structures. Predictive policing, for example, uses existing data sets of past crimes to predict where crime will occur and who is likely to be a perpetrator. The existing data is extracted from those already criminalized, creating a self-fulfilling feedback loop. In this way, as poet and scholar Jackie Wang discusses in her 2018 book, Carceral Capitalism, the datafication of bodily control meant that “biological and cultural racism was eventually supplanted by statistical racism,” converting white supremacist social structures into algorithms.
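To make that feedback loop concrete, here is a minimal, purely illustrative simulation in Python. Every name and number is invented, not drawn from any real policing system: two neighborhoods have identical underlying offense rates, but one starts with more recorded arrests, and patrols are allocated wherever past data says crime “is.”

```python
import random

# Hypothetical sketch of a predictive-policing feedback loop.
# Both neighborhoods have the SAME underlying offense rate, but
# neighborhood 0 begins with more recorded arrests due to historical
# over-policing. Patrols follow the data; the data follows the patrols.

random.seed(0)

arrests = [30, 10]        # biased historical record; true rates are equal
TRUE_OFFENSE_RATE = 0.1   # identical in both neighborhoods
PATROLS_PER_ROUND = 100

for _ in range(20):
    total = sum(arrests)
    # The "prediction": allocate patrols in proportion to recorded arrests.
    patrols = [round(PATROLS_PER_ROUND * a / total) for a in arrests]
    for hood in (0, 1):
        # More patrols mean more offenses observed and recorded,
        # even though the underlying rate never differed.
        arrests[hood] += sum(
            random.random() < TRUE_OFFENSE_RATE
            for _ in range(patrols[hood])
        )

print(arrests)  # the initial 3:1 skew persists; the data "confirms" itself
```

The model never sees the true offense rate, only the arrest record it helped produce, so the initial skew is reproduced round after round.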
The question here is what to do with these technologies when they’re used to criminalize, coerce labor, and extract profit, and whether they can be repurposed or should be dismantled. For many, this can mean relying on individual hacks to evade or confuse surveillance. There are limits, though, to relying solely on these autonomous strategies if we’re thinking about building the conditions for collective liberation. Taking a longer view of the logics and structures of our data-saturated society can help us better understand what got us here—and which strategies can be used to avoid reifying these same systems.
Tracking Bodies All the Time
The specific possibilities of AI-powered surveillance and mobile health apps may not have existed before now, but the logics and parameters on which these technologies rely have a lengthy history. Many of the concepts used to build our current system were codified in the early 20th century as part of the massive collection of biometric data that fueled the eugenics movement. Eugenics rests on collecting extensive data about the body and measuring it against an ideal “norm.” Tracing a critical history from the eugenics movement to our current moment reveals a continuity in two major goals of biometric data collection: measuring who is dangerous and determining how labor or profit can be extracted from people.
The academic branch of eugenics, often tied to Francis Galton, who coined the term, heavily shaped emerging fields like psychology, social work, and criminology. Eugenicists invented concepts such as the body mass index and IQ in an attempt to determine the ideal social body, producing statistical judgments that ranked some individuals as better, “healthier,” or more “normal” than others. This included taking the social construct of race and trying to turn it into objective biological or evolutionary fact through empirically dubious studies.
Alongside this project of scientific white supremacy, eugenicists were intensely concerned with whether people would conform to many different constructs of “normal.” These included expressions of sexuality and gender performance, and framings of disability and “fitness” around who could be a “productive worker.”
Measuring people for productivity and surveilling them for control predate the rise of the eugenics movement, with roots in slavery and colonization. In her book Decolonizing Methodologies: Research and Indigenous Peoples, scholar Linda Tuhiwai Smith discusses how these notions of scientific measurement required “pre-existing views of the ‘other’ around science, philosophy, imperialism, classification, and ‘regimes of truth’” that could be adapted to “new conceptions of rationalism, individualism and capitalism.” Regarding these deeper histories, in her 2015 book, Dark Matters: On the Surveillance of Blackness, Simone Browne writes that the conceptual and power structures of slavery solidified around the accumulation of people as countable, surveillable subjects, and of Blackness as a “saleable commodity.”
Even the specific technologies of the Nazi Holocaust, often remembered as the most famous and horrific example of how a regime can mobilize eugenics, were workshopped and fine-tuned in colonial projects in Jamaica and the Pacific. After World War II, it generally fell out of fashion for academics and public figures to openly name eugenics as their explicit research apparatus or political logic. But these measurements, value judgments, and systems of control continue to texture our world, from restricting access to gender-affirming health care to forced sterilization in ICE detention centers.
Algorithmic Justice Beyond Autonomy
For those of us committed to challenging these legacies, the question becomes tactical: How do we respond to the criminalizing potential of biometric surveillance without fleeing into the fiction of personal responsibility? What systems and technologies can we build that don’t merely protect us individually from the worst effects of this extraction and criminalization but also build a different world together?
We can’t fully retreat to bodily autonomy. Although this can be a powerful and important political frame in many contexts, we are never individual units moving as autonomous bubbles through the world. We are deeply entangled with each other, our histories, our environments, and the more-than-human world; we affect and are affected by each other beyond the scope of what is legible as variables in tracking software. And although the criminalizing strains of these systems might come for us differently, they are coming for all of us, whether we have a uterus, a precarious legal status, a nonnormative gender expression, or any other trait that can be criminalized.
There’s great work being done by groups like the Algorithmic Justice League and the Center for Critical Race and Digital Studies to address the harms caused by biased biometric algorithms, such as facial recognition software’s notorious failure to detect darker-skinned faces, or the use of AI to deepen medical discrimination. But we also need to keep pushing beyond accuracy as the horizon of our demands. Even if we had a universal data set fully quantifying and tracking every human on the planet, would that be algorithmic justice? If the systems that created these technologies and the purposes for which they were designed are fundamentally oppressive, more efficient or accurate tech does not lead to a more just world.
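To make “bias” concrete: disparities like these are typically measured as per-group error rates rather than overall accuracy. A minimal sketch, with entirely invented numbers (not taken from any real audit), shows how a tolerable-looking headline figure can hide a group-level failure:

```python
# Illustrative per-group error-rate audit; all numbers are invented.
# (group label, faces tested, faces the system failed to detect)
results = [
    ("lighter-skinned", 1000, 8),
    ("darker-skinned", 1000, 210),
]

for group, tested, missed in results:
    print(f"{group}: miss rate {missed / tested:.1%}")

# The aggregate figure can look acceptable while one group bears the errors:
total_tested = sum(tested for _, tested, _ in results)
total_missed = sum(missed for _, _, missed in results)
print(f"overall miss rate: {total_missed / total_tested:.1%}")
```

Closing that gap, though, only answers the accuracy question; it says nothing about what the system is then used to do.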
After all, August Vollmer, the eugenicist “father of American policing” who founded the first U.S. police academies with proposed courses like “Race Degeneration,” was obsessed with gathering perfect data in order to determine a criminal “type.” His vision of social control dreamed of a moment “[w]hen statistics reach a plane where their accuracy is no longer doubtful.” If we are to focus on pushing for better algorithms, we must ask how this is different from what the eugenics movement wanted. Perfecting statistics does not necessarily intervene in the fundamental variables against which the “normal” is judged, or the power structures in which this biometric data is being designed and deployed.
So much of the data for these systems is crowdsourced, which allows complicity to be dispersed. Given this, we bear responsibility as the building blocks of these algorithms. As artist and poet Manuel Abreu asks in a 2014 article for The New Inquiry, how do we engage with these structures when “[o]ur banal activities are the source from which algorithms automatically generate kill lists made up of nodes that deviate from the cluster of normal activity patterns” and “algorithms make complicity incalculable”? When we are embedded in these systems, our answer has to be something beyond opting out.
One response could be to treat data security as community care. The legacy of COINTELPRO and other domestic surveillance programs teaches us that taking sensible precautions with our own communication practices can reduce harm to people in our community. There can be deep solidarity, along the lines of “we keep us safe,” in small-scale measures such as using an encrypted messaging app or browser, or having politically or legally sensitive conversations face-to-face, at a physical distance from your phone.
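As a purely illustrative sketch of what an encrypted messenger does on our behalf, the following uses the open-source PyNaCl library’s public-key Box. It is a toy for building intuition, not a substitute for an audited tool like Signal:

```python
from nacl.public import PrivateKey, Box  # pip install pynacl

# Each person generates a key pair; only public keys are ever shared.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts with her private key and Bob's public key.
alice_box = Box(alice_key, bob_key.public_key)
ciphertext = alice_box.encrypt(b"see you at the meeting tonight")

# Anyone intercepting the message sees only ciphertext bytes.
# Only Bob, holding his private key, can decrypt and authenticate it.
bob_box = Box(bob_key, alice_key.public_key)
print(bob_box.decrypt(ciphertext).decode())
```

Real tools layer forward secrecy and metadata protections on top of primitives like this, which is exactly why the guides above recommend them over do-it-yourself cryptography.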
A contradiction of political organizing is that it often requires a level of visibility to build community and draw more people into the work, but visibility attracts repression. Some projects that offer tools and resources to help people grapple with these conundrums include MediaJustice’s Defend Our Movements campaign, A People’s Guide to Tech, the Electronic Frontier Foundation’s guide to surveillance self-defense for abortion access activists, workers, and patients, Hack*Blossom’s DIY Guide to Feminist Cybersecurity, and Our Data Bodies’ Digital Defense Playbook.
Another response is to intervene through collective action. In 2018, members of Mijente, an immigrant justice organization, created the #NoTechForICE campaign to confront the surveillance firm Palantir, whose tendrils stretch from collaborating with U.S. Immigration and Customs Enforcement on deportations to supporting the Israel Defense Forces in surveilling and bombing Gaza. Mijente aimed to pressure companies and cities to break contracts with Palantir, reducing its ability to recruit and retain workers and bringing its surveillance under greater public scrutiny.
Groups such as the Tech Workers Coalition have also organized to create solidarity in workplaces. One of the grievances that kicked off the West Virginia teachers’ wildcat strike in 2018 was a requirement that teachers wear Fitbits and log a certain amount of movement or face higher health insurance costs. People could have individually opted out of this ableist, fatphobic, and generally invasive requirement. Instead, they collectively withdrew their labor to change the conditions of possibility through worker power.
Fundamentally, the lesson is that confronting these systems requires building something greater than individual autonomy. In Dark Matters, Browne writes about what she calls “dark sousveillance”: reversing surveillance and turning the gaze back on those in power as a tactic for keeping oneself out of sight in the flight from enslavement to freedom. That flight required networks and histories far in excess of the autonomous self, entailing the creation of entire social networks, languages, technologies, and relationships outside of and directly opposed to the systems of power doing the surveilling.
Our response to a world trending toward alienation from each other and data-driven social control cannot be based solely on personal privacy. Any engagement with these systems that takes solidarity seriously must remember that, whether we find ourselves shoulder to shoulder or embedded in an algorithm, our liberation is bound up with each other.
Juliet Kunkel is an independent writer, “scholar” (ambivalently), and general troublemaker who learned far more from her comrades in movement work than from the Ph.D.