Predictive Policing as a Human Rights Issue

This summer, I have been conducting an independent research project that explores intelligence-led, predictive policing practices as a human rights issue. My research directly addresses structural inequalities that can be perpetuated through law enforcement tactics, especially through novel technologies that are rhetorically framed as unbiased when they are not. Although the idea that modern technologies can enact discrimination is nothing new, particularly in light of longstanding debates over expediency versus justice, this logic of efficiency and optimal resource allocation remains a commonly cited rationale for implementing predictive policing technologies.

This rationale takes me back to 2020, when major controversy over the predictive policing program in my hometown of Pasco County, Florida, broke out following a Tampa Bay Times investigation. The Pasco County Sheriff’s Office saw the technology as an opportunity to best allocate police resources by concentrating efforts based on “evidence-based,” algorithmic predictions about who was most likely to commit crimes. In practice, however, many of the people targeted by the algorithm described the treatment they and their families received as over-surveillance and blatant harassment.

According to the Tampa Bay Times, deputies, after being directed toward people the algorithmic system predicted to be criminals, would repeatedly show up at the homes and workplaces of those targeted, often questioning them and their family members without evidence of a specific crime, probable cause, or a search warrant. After facing this harassment, many people targeted by the program, along with their family members, said they dealt with anxiety and felt compelled to move to other counties.

Read more about Pasco County’s Intelligence-led policing program: https://projects.tampabay.com/projects/2020/investigations/police-pasco-sheriff-targeted/intelligence-led-policing/

Article 12 of the Universal Declaration of Human Rights states that “No one shall be subjected to arbitrary interference with his privacy, family, home or correspondence, nor to attacks upon his honour and reputation.” Many scholars, activists, and community members claimed that Pasco’s predictive policing program violated human and civil rights. The Institute for Justice currently represents a group of plaintiffs targeted by the program, though the trial is still pending. The intelligence-led policing program in my hometown was eventually phased out in 2021 and 2022 following community outcry, but the Sheriff’s Office has still not acknowledged any wrongdoing.

This discussion of how human rights issues are evolving in an ever more digitized and globalized society brings me to my Global Scholars Capstone Project. For this project, I plan to draw on my takeaways from my research and my upcoming experience studying human rights in the Czech Republic to theorize best practices for protecting human rights in the face of emerging technologies such as complex algorithmic systems and artificial intelligence. Pictured with this article is me reading about the White House’s suggested Algorithmic Discrimination Protections, part of its Blueprint for an AI Bill of Rights. This set of suggested guidelines demonstrates that protecting human and civil rights in an age of unprecedented technological advancement is an endeavor weighing heavily on the minds of many Americans and policymakers.