In June of 2018, following a campaign initiated by activist employees within the company, Google announced its intention not to renew a US Defense Department contract for Project Maven, an initiative to automate the identification of military targets based on drone video footage. Defenders of the program argued that it would increase the efficiency and effectiveness of US drone operations, not least by enabling more accurate recognition of the program’s legitimate targets and, by implication, sparing the lives of noncombatants. But this promise raises a more fundamental question: What relations of reciprocal familiarity does recognition presuppose? And in the absence of those relations, what schemas of categorization inform our readings of the Other? The focus of a growing body of scholarship, this question haunts not only US military operations but an expanding array of technologies of social sorting. Understood as apparatuses of recognition (Barad 2007: 171), Project Maven and the US program of targeted killing are implicated in perpetuating the very architectures of enmity that they take as their necessitating conditions. Taking as problematic any apparatus for identifying legitimate targets for the use of violent force, this talk joins a growing body of scholarship on the technopolitical logics that underpin an increasingly violent landscape of institutions, infrastructures, and actions, promising protection to some but arguably contributing to our collective insecurity. Lucy Suchman’s concern is with the asymmetric distributions of sociotechnologies of (in)security, their deadly and injurious effects, and the legal, ethical, and moral questions that haunt their operations. She closes with some thoughts on how we might interrupt the workings of these apparatuses, in the service of wider movements for social justice.
Lucy Suchman is a Professor of Anthropology of Science and Technology in the Department of Sociology at Lancaster University, in the United Kingdom.