February 01, 2024

ACLU to the DOJ and DHS: Policing Technology Cannot Go Unchecked

Engelberg Center

By Eunice Park (‘24) and Aditya Trivedi (‘24)

The Technology Law and Policy Clinic at NYU Law recently teamed up with the ACLU to draft a comment in response to a request from the Department of Justice (DOJ) and the Department of Homeland Security (DHS). The DOJ and DHS sought public input to help inform a report for the White House on law enforcement use of facial recognition technology, other biometric technologies, and predictive policing algorithms.

Law enforcement use of these technologies has gone largely unchecked. Legislation is slow to keep pace with ever-changing technological advances. Local law enforcement agencies can spend hundreds of thousands of dollars on these tools; federal law enforcement spends millions.

The privacy and accuracy risks are high. Facial recognition technology enables the police to surveil and identify individuals walking down the street. DNA technology relies on complicated algorithms to make probabilistic determinations from tainted DNA samples. Predictive policing tools sweep in whole swaths of communities by relying on algorithms trained on historical crime data that has a history of racist policing built into it. And all of this is shrouded in secrecy: it is often unclear exactly when or how the police use these technologies.

Recognizing the possible harms associated with these technologies, the Biden administration issued Executive Order 14074 in May 2022. The order, entitled "Advancing Effective, Accountable Policing and Criminal Justice Practices To Enhance Public Trust and Public Safety," requires the federal government to reform policing through direct authority over federal law enforcement agencies and influence over state, tribal, local, and territorial agencies. Section 13 of the order directs the DOJ, DHS, and the Office of Science and Technology Policy to consult with civil rights organizations on how to safeguard privacy, civil rights, and civil liberties.

The Clinic and the ACLU partnered to draft the comment, which focuses on four biometric technologies (facial recognition, DNA, fingerprint, and iris scans) as well as predictive policing technology. For each, the comment examines how law enforcement uses the technology today and the risks that use creates. First, many of these technologies, though heralded as ushering in a new era of "smart" policing, do not function well enough to be deployed in the field, leading to wrongful arrests. Second, each technology carries privacy risks and raises constitutional issues. Third, law enforcement agencies often do not reveal the extent to which they rely on these technologies in the course of their investigations, and prosecutors deny defendants' requests for information in contravention of their Brady rights.