Wednesday, June 20

Tabatha Lewis: LAPD’s predictive algorithm is inaccurate, dehumanizes offenders

Steven Spielberg’s 2002 film “Minority Report” played out the hypothetical in which an organization can predict crimes before they happen. The Los Angeles Police Department is about 10 years late to the game, and it seems to be taking the department far longer than the film’s 145-minute runtime to see why trying to predict crime is an inherently problematic strategy.

Earlier this year, LAPD was forced to release court documents pertaining to a lawsuit filed against it by the Stop LAPD Spying Coalition, an organization that advocates for citizens’ right to privacy. The lawsuit centered on the department’s use of a predictive algorithm to identify potential offenders and fight crime proactively.

In 2012, the LAPD implemented the Los Angeles Strategic Extraction and Restoration, or LASER, program, in which analysts identify crime “hot spots” and compile a Chronic Offender Bulletin. The program originally looked at data from crimes, incidents and police service calls from 2006 to 2011 in Los Angeles’ Newton Division, with special attention to those involving firearms. The department then identified five hot spots in the division and increased the number of officers in those areas.

The culmination of this project: Police spent an extra 55.5 officer-hours per week in these hot spots.

LAPD’s use of LASER is dehumanizing. LASER and other predictive algorithms cast past offenders out of society and assume that those who have committed crimes will commit them again, instead of helping those individuals stop offending.

LASER was expanded into seven more zones in 2014. The program’s crime bulletin is now essentially a list of offenders meant to help officers do “proactive police work,” and officers can access it from their patrol cars. Analysts compile the list using data from past police encounters. Individuals on the list are assigned points for factors such as gang membership, being on parole, having been arrested while carrying a handgun and having had a police encounter within the past two years. The more points a person has, the higher they rank as a probable offender.
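The mechanics of such a point system are simple to sketch. The released documents do not spell out the exact weight assigned to each factor, so the values below are hypothetical, chosen only to show how a handful of yes/no criteria plus an encounter count roll up into a single “risk” score:

```python
# Illustrative sketch of a LASER-style point system. The specific
# weights are assumptions for demonstration, not LAPD's actual values.

def offender_score(person):
    """Sum points for risk factors named in the Chronic Offender Bulletin."""
    points = 0
    if person.get("gang_member"):
        points += 5  # hypothetical weight
    if person.get("on_parole_or_probation"):
        points += 5  # hypothetical weight
    if person.get("arrested_with_handgun"):
        points += 5  # hypothetical weight
    # One point per police encounter in the past two years (assumption).
    points += person.get("recent_police_encounters", 0)
    return points

suspect = {"gang_member": True, "recent_police_encounters": 3}
print(offender_score(suspect))  # 8 under these hypothetical weights
```

Note that under any weighting of this shape, every police encounter raises the score, regardless of whether the stop was justified or led to any charge.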

However, these algorithms may not even be that reliable in predicting crime.

According to ProPublica, predictive crime algorithms can have an inherent bias. The publication pointed out, for example, that when two individuals were arrested for comparable misdemeanors – one for petty theft and the other for shoplifting – a risk-scoring algorithm in Florida flagged the former, a black teenager, as high risk, and not the latter, a middle-aged white man. The algorithm seemed to overlook the fact that the white man had a criminal history that included armed robbery.

The study essentially found that the algorithm used in Florida was biased against African Americans.

LAPD did not disclose in its documents what method it used to compile the probable offender list for the LASER program. That omission calls into question whether the algorithm behind the list is similarly biased.

There is also the issue of algorithms being fed biased data. LASER’s offender-based strategy uses data from police incidents, but no algorithm determines whom the police decide to stop. That decision carries human bias that cannot be sifted out of the data.

A study in the journal Policing and Society pointed out that the use of an algorithm reduces the accountability of police: Officers could simply claim that an algorithm flagged a particular individual as high risk. This would significantly weaken movements against blatant police discrimination, such as Black Lives Matter, as officers would gain a tool to deflect accusations.

Additionally, the top-10 offenders on the 2012 crime bulletin each had more than 25 points. In one case, an individual accumulated nine points within two months because they were stopped twice a day on four separate occasions.

LAPD’s crime bulletin is cyclical: If police decide to stop an individual, that person ends up with more points on their record. And if they committed any crimes in the past, they have a higher chance of making it onto the Chronic Offender Bulletin. The bulletin also contains physical descriptions, which makes the people on it more likely to be stopped by police – and to accrue even more points.
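This feedback loop can be made concrete with a toy simulation. The stop probabilities and point increments below are assumptions, not figures from the LAPD documents; the point is only that when each stop adds points and points raise the chance of the next stop, two people with identical behavior diverge based on their starting score alone:

```python
import random

random.seed(42)

# Toy model of the cyclical bulletin: every stop adds a point, and
# more points raise the chance of being stopped again. All rates and
# increments here are hypothetical, chosen only for illustration.

def simulate(initial_points, weeks=52, base_rate=0.02, rate_per_point=0.01):
    """Return a person's point total after a year of weekly stop chances."""
    points = initial_points
    for _ in range(weeks):
        stop_chance = min(1.0, base_rate + rate_per_point * points)
        if random.random() < stop_chance:
            points += 1  # each stop adds a point (assumption)
    return points

# Someone starting with 10 points tends to gain stops, and thus more
# points, faster than someone starting at zero, despite identical behavior.
print(simulate(0), simulate(10))
```

On average the high-scoring person collects several times as many new points per year as the low-scoring one, which is exactly the runaway dynamic the column describes.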

Certainly, LAPD found in 2012 that the number of homicides per month in the Newton Division decreased by 22.59 percent, and that the number of Part I violent crimes – which include nonnegligent manslaughter, rape and aggravated assault – decreased by an average of 5.393 per month. LAPD tracked similar data in 18 other divisions, but the decrease in crime rates occurred only in the Newton Division, where Operation LASER was initially implemented.

But a basic lesson from high school statistics is that correlation does not imply causation. While crime rates may have decreased after Operation LASER was implemented, that does not mean the program was solely responsible. Increasing the number of patrolling officers in one area may simply push crime elsewhere, to places where there aren’t police.

LAPD did not immediately respond to a request for comment.

In the LASER mission statement, LAPD compared criminals to tumors to be removed or extracted with “laserlike precision.” Not only is LASER a problematic and inaccurate algorithm, it is incredibly biased. It suggests crime is innate rather than a byproduct of people’s socioeconomic conditions.

Proactive police work means giving residents of troubled neighborhoods intrinsic reasons not to commit crimes – not putting them in cuffs before they have the chance to choose a life outside of crime.
