By STEPHEN GALOOB
Review of Predict and Surveil: Data, Discretion, and the Future of Policing, by Sarah Brayne
New York, NY: Oxford University Press, 2020
The Fourth Amendment to the U.S. Constitution prohibits unreasonable searches and seizures. The noble dream of this provision is that law can prevent arbitrary domination by state officials through unchecked and unprincipled exercises of power. The requirements of “individualized suspicion” support this noble dream: police may not stop, search, or arrest you without having a good (or, depending on the intrusiveness of the stop, a pretty good) reason to believe that you have recently committed a crime or are about to commit one.
If this is the dream, then one version of the nightmare is that these legal restrictions fail to restrict state officials, that everyone is subject to arbitrary domination that is also legitimated. An even worse version is that only some people are subject to arbitrary domination, their subordination both authorized by law and invisible to those who are unaffected.
Many believe that the nightmare is a reality in the contemporary United States. If so, then a number of factors drive the nightmare: broadly applicable criminal laws, long prison sentences authorized for minor offenses, ample discretion for law enforcement, entrenched political inequalities, racial segregation, and powerful obstacles to law enforcement accountability. Some posit technology as a countervailing force, a way to ensure that law enforcement power is used to promote the public good without exacerbating racialized subordination.
Sarah Brayne’s Predict and Surveil is an incisive exploration of the impact of data analytics on contemporary policing and a harbinger of how technology can be used to entrench the nightmare. Brayne illuminates the role of big data in policing through a long-term, in-depth ethnography of the Los Angeles Police Department. Because of Brayne’s extraordinary access to all levels of the LAPD and her analytic rigor, Predict and Surveil is the best on-the-ground account of how technology influences policing.
Brayne begins the book by noting a version of the nightmare and noble dream: although the use of “big data has the potential to reduce inequality and discrimination, as currently used it increases inequality while appearing to be objective” (5). “Big data,” for Brayne, is a “data environment made possible by the mass digitization of information and associated with the use of advanced analytics, including network analysis and machine learning algorithms” (3). In chapter 2, Brayne provides an accessible overview of the role of data in the evolution of policing generally (and the LAPD in particular), illustrating how “the embrace of big data by police is part of a broader trend toward quantification and algorithmic risk assessment in the criminal justice system” (18).
Chapter 3 shows how a Palantir Technologies platform allows for “dragnet” surveillance, or the “collection and analysis of information on everyone, rather than only people under suspicion, possible at an unprecedented scale” (39). Palantir is a now-notorious domestic intelligence technology company. According to Brayne, Palantir’s products aim to create a “full data ecosystem” for policing suspects and the suspicious. The Palantir platform systematizes data collected from the police (e.g., arrest and police contact information), from other official sources (automated license plate readers in particular), from third-party databases, and from queries to the Palantir system itself. The result: a surveillance reservoir that is accessible whenever law enforcement needs to draw from it. Palantir technology not only allows police to track those with whom they have had contact, but also to create what Brayne calls a “secondary surveillance network” of targets who are tracked solely by virtue of their connections to those who are suspects (111).
Chapter 4 examines the use of big data techniques to conduct “predictive” and “precision” policing efforts directed at specific persons and places. Brayne focuses on the LAPD’s “Operation LASER,” a federally funded pilot program that used Palantir technology to direct police attention toward “chronic offenders” (largely suspected gang members). Under Operation LASER, officers received “chronic offender” bulletins: “information-only” documents identifying “chronic offenders” who are not wanted for any particular crime but are assigned point values based on their algorithmically computed risk factors (62). Bulletins aimed to improve officers’ “situational awareness” by helping them identify “the worst of the worst.” Although Operation LASER was formally cancelled in 2019, Brayne notes that the LAPD continues to utilize both person-based and place-based surveillance strategies, as well as the Palantir platform (137).
Chapters 6 and 7 explore the implications of police use of big data technologies. In chapter 6, Brayne explains that the “unprecedentedly broad and deep police surveillance” enabled by “big data and associated technologies” can reproduce existing inequalities in policing (101)—in effect, providing a technocratic rationale for arbitrary domination. Predictive policing models can have a “ratchet effect” in which racialized patterns of surveillance are increased “absent any evidence that [they are] warranted” (107). The result is that both “individuals already in the criminal justice system” and those merely connected to them lack the ability to “avoid being drawn into the surveillance net” (108). Because the databases from which the Palantir technology draws its inferences are infected by racial bias, the system’s predictions about future danger are also biased. Moreover, Brayne notes, marking people as suspicious for the purposes of the criminal justice system encourages their avoidance of non-criminal social institutions, such as hospitals, schools, or employment (114).
Chapter 7 explores the legal implications of big data policing, arguing that “existing legal frameworks are anachronistic and inadequate for governing police work in the age of big data” because they do not “attend[] to the sociological processes that underpin basic legal principles” (119). Brayne makes four main points: first, the criteria for individualized suspicion do not reflect reality about how big data supports police decision-making on the ground; second, big data technologies reflect a difference in kind, rather than degree, from the paradigm of officer decision-making; third, the exclusionary rule (which prevents the state from utilizing unlawfully-obtained evidence in order to convict a defendant) is inadequate to regulate big data policing; fourth, big data presents unprecedented opportunities for “parallel construction,” a process of retrospectively justifying otherwise-unlawful police actions (118).
The first and fourth of these points are the most novel and unsettling. I would like to reformulate them slightly in order to explain how they implicate the nightmare of arbitrary domination.
Consider first a technique that in certain areas of philosophy is pejoratively called “bootstrapping.” Bootstrapping is a way of creating reasons for believing something or acting in a certain way out of nothing. In the case of policing, the Fourth Amendment requires that police have “probable cause” to believe that someone has committed a crime in order to arrest or search them. To briefly detain someone, the police must have “reasonable suspicion” that the person has committed a crime.
Imagine a case in which the relevant considerations do not add up to reasonable suspicion or probable cause with regard to a person—call them S. In this imaginary case, the police are not authorized to stop, arrest, or search S. Now amend this imaginary situation to include an imaginary Palantir-model “Reliabalator,” which analyzes all of the relevant considerations and establishes a conclusion, Iron Man style, about whether S is suspicious enough. After the Reliabalator spits out its verdict, do the police have enough suspicion to stop, search, or arrest S? Of course not. All the Reliabalator has done is to repurpose evidence that was already insufficient to suspect S. Police can’t use technology to generate reasons out of nothing. Yet under existing legal tests it is highly likely that a court would allow something like the Reliabalator to generate reasonable suspicion or probable cause. In this way, it is possible that technology could bootstrap individualized suspicion in a process akin to transubstantiation.
Consider another problem that Brayne identifies as “parallel construction” (132-33), a form of backstopping. Take a clear example of arbitrary domination: a police officer decides to stop a Black person at random. The officer stops S, searches S’s vehicle, and finds contraband. This stop is illegal under the Fourth Amendment because it is random. (Perhaps surprisingly, it would not be illegal under the Fourth Amendment because of the race of the person stopped.) Now amend this imaginary situation to include a function of the Reliabalator that allows police to go back and determine that, at the time S was stopped, the circumstances were (unbeknownst to the officer) sufficient to justify the stop and/or search of S. Under current law, the Reliabalator would almost certainly legitimate the unlawful police conduct by allowing use of the evidence of the contraband in a prosecution of S.
Bootstrapping and backstopping technologies move toward the nightmare of arbitrary domination. Brayne’s research indicates that the LAPD actually engaged in practices that look a lot like bootstrapping through the use of “chronic offender bulletins” in Operation LASER. [Although, as Brayne notes, the bulletins expressly deny that a person’s inclusion in a bulletin can be the “sole basis” for detaining them (62), it is unclear that this limitation is constitutionally required.] Moreover, Brayne shows that big data technology “increases the opportunity” for backstopping, although future research is needed to establish whether this practice actually takes place (134).
In chapters 7 and 8, Brayne offers sensible insights about how law and policy should change to adapt to the realities of big data policing. Brayne argues that legal tests should be revised to account for the racial bias of data and to allow people to understand and contest the data about them maintained by private companies that inform the Palantir system (135). She contends that law enforcement should be required to provide an independent justification for the use of big data and new surveillance tools prior to mass deployment (142).
Brayne also points out the ways that data technologies could be used to effect reform, for example by increasing police accountability (143) and directing non-punitive interventions (e.g., from public health workers) (144). More broadly, Brayne urges that departments use big data analytics to redefine successful policing away from simplistic metrics such as arrest rate and toward more equitable goals such as increasing clearance rates for the most serious crimes and reducing racial inequalities (145).
One refrain in Predict and Surveil is that data is “fundamentally social,” that it arises in social contexts and affects how people act within those contexts (4-5). Perhaps, in our future, big data analytics will further the noble dream of policing as a bulwark for freedom. It is also possible that these tools will bring about (or, perhaps, lock in) the nightmare of arbitrary domination. Which scenario is realized depends less on the tools than on those who make and use them.
Posted on 10 March 2021
STEPHEN GALOOB is Chapman Professor of Law at the University of Tulsa College of Law.