
Predictive policing algorithm perpetuates racial profiling by LAPD

By Lena Nguyen

May 2, 2019 10:47 p.m.

Los Angeles Police Department officers have done such a poor job of making our streets safer that they are now relying on an algorithm to tell them how to do their jobs.

A racist, culturally ignorant algorithm, that is.

LAPD has a history of using and abandoning predictive policing algorithms. Among the few still in use is PredPol, which generates 500-by-500-foot hot spots on maps predicting where crime is likely to happen within the next 12 hours. It uses three factors to determine where officers should patrol: crime type, crime location and crime time.
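PredPol's actual model is proprietary, but those three inputs are enough to sketch the general idea. The following Python snippet is a minimal, hypothetical illustration, not PredPol's code: crimes are binned into 500-by-500-foot grid cells, recent events are weighted more heavily, and the highest-scoring cells become the shift's hot spots. The decay rate and every name in the code are assumptions made for illustration.

```python
from collections import defaultdict

# Hypothetical sketch of grid-based hot-spot scoring; NOT PredPol's
# proprietary model. It uses only the three inputs the article names:
# crime type, crime location and crime time.

CELL_FEET = 500          # hot spots are drawn on a 500-foot grid
HALF_LIFE_HOURS = 24.0   # assumed decay: recent crimes count more

def cell_of(x_feet, y_feet):
    """Snap a coordinate to its 500-by-500-foot grid cell."""
    return (x_feet // CELL_FEET, y_feet // CELL_FEET)

def hot_spots(events, now_hours, top_k=10):
    """events: list of (crime_type, x_feet, y_feet, time_hours).
    Returns the top_k cells ranked by decayed crime count."""
    scores = defaultdict(float)
    for _crime_type, x, y, t in events:
        age = now_hours - t
        scores[cell_of(x, y)] += 0.5 ** (age / HALF_LIFE_HOURS)
    return sorted(scores, key=scores.get, reverse=True)[:top_k]

# Example: three recent burglaries cluster in one cell, so it ranks first.
events = [("burglary", 120, 480, 90.0),
          ("burglary", 300, 250, 95.0),
          ("burglary", 410, 100, 99.0),
          ("theft", 2600, 2700, 40.0)]
print(hot_spots(events, now_hours=100.0))  # -> [(0, 0), (5, 5)]
```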

This technology was created by one of UCLA’s very own: Jeffrey Brantingham, an anthropology professor. Brantingham developed the program with statisticians, backed by National Science Foundation funding, to address public safety concerns in LA.

Yet, the public doesn’t feel any safer.

Sixty-eight UCLA students and faculty members sent a letter to the LAPD commissioner April 2 denouncing the research for its ethical implications, arguing it naturalizes policies and practices that disparately impact black and brown communities.

And it’s not like the algorithm is even remotely successful. The Office of the Inspector General released a report in March stating that, after eight years, it could draw no conclusions on the software’s ability to reduce or prevent crime due to major inconsistencies in its oversight, implementation and criteria.

LAPD has a long record of racially disproportionate arrests, stops and searches, according to a study by the American Civil Liberties Union of Southern California. Its use of PredPol reveals a troubling reality about policing in Los Angeles: Modernization has only amplified the department’s discriminatory tendencies.

Brantingham said he built the algorithm as a crime-fighting machine to serve a fundamental responsibility to make the public safer.

“Some places have more crime than others, and the police have a responsibility to deal with that crime,” Brantingham said.

While machines may not see color, LAPD does. PredPol’s algorithm is designed to exclude personal information about people, like socioeconomic status or race, but that safeguard isn’t foolproof. Algorithms are often heralded as objective because they’re impersonal and data-driven, but that’s not true when the data driving them has roots in the historic overpolicing of black and brown communities.

“This data was being taken from existing records of policing, which is fraught with racial biases,” said Casey Dalager, a graduate student in public policy.

In a perfect world, crime data would be objective. But when African Americans make up only 9% of the population yet account for 46% of arrests by metropolitan police in LA, as Million Dollar Hoods’ report demonstrates, it is evident that objectivity has never been part of the justice system. PredPol normalizes the perception that anyone in a designated hot spot is a potential criminal.
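To see how that bias compounds, consider a toy simulation with invented numbers, not LAPD data. Two neighborhoods have identical underlying crime rates, but one starts with more recorded crime simply because it was patrolled more heavily. If patrols are then dispatched wherever the records point, crime is only recorded where officers are sent to look for it, and the recorded gap never corrects:

```python
import random
random.seed(0)

# Toy feedback-loop simulation; the numbers are invented, not LAPD data.
# Both neighborhoods share the SAME true crime rate, but neighborhood A
# starts with more recorded crime due to heavier historical patrolling.
TRUE_RATE = 0.3                # chance a patrol records a crime, same everywhere
recorded = {"A": 60, "B": 20}  # biased historical records

for day in range(200):
    # Dispatch 10 patrols in proportion to recorded crime counts.
    total = recorded["A"] + recorded["B"]
    patrols = {n: round(10 * recorded[n] / total) for n in recorded}
    # Crime can only be recorded where officers are actually sent.
    for n in recorded:
        recorded[n] += sum(random.random() < TRUE_RATE
                           for _ in range(patrols[n]))

# The gap widens even though the true rates are identical: the records
# confirm the very bias they were built on.
print(recorded)
```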

Alveena Shah, a UCLA law student and editor in chief of the UCLA Law Review who signed the letter, said the algorithm’s criteria for determining hot spots are themselves racist.

“The type of crimes coded into the database are already based on overpolicing of communities of color,” she said.

Brantingham disagrees, though, that PredPol contributes to overpolicing of certain communities.

“The presence of the algorithm doesn’t, in any way, change the number of officers designated to patrol the community that day,” he said.

Yet, the review released by the Office of the Inspector General states that patrol officers are given missions to “respond to a PredPol hotspot to provide high police visibility.”

That disconnect is just one example of the mixed messaging surrounding this algorithm. Brantingham is clearly in the business of promoting his product, which serves to benefit him more than it does Angelenos.

After all, PredPol is used by more than 50 police agencies, with contracts running from $30,000 to more than $100,000 apiece. Considering Brantingham is both the co-founder of PredPol and the researcher behind the algorithm, it’s obvious he is biased toward his own work.

Bias aside, the algorithm perpetuates stereotypes about crime and race when police descend on these hot spots, justifies suspicion based on nothing more than geographic cues and contributes to the larger pattern of minorities being hypercriminalized.

“It has the potential to lead more black and brown faces into mass incarceration because it creates a narrative that these communities are just more volatile and need more policing,” said Taylore Thomas, a third-year African American studies student and a student researcher for Million Dollar Hoods.

Crime prevention to improve public safety is obviously a goal any large city like LA would strive toward. After all, that is PredPol’s mission statement and sole purpose.

But predictive policing suggests some people commit more crimes than others just because they fall into a hot spot. Refusal to acknowledge LAPD’s problematic history of racism makes PredPol racially charged too, regardless of whether it intended to be.

LAPD shouldn’t have wasted thousands on problematic and ineffective software that tells its officers how to do their jobs – especially since they weren’t doing them right in the first place.

Lena Nguyen | Alumna
Nguyen was the 2020-2021 Social Media Director. She was previously the 2019-2020 assistant Opinion editor and editorial board member.