Alex Caviness
Prof. Campbell
UWRT 1103
November 9, 2016
When Big Brother Comes Knocking: The Case Against Predictive Policing
Howard Marks came home one day to the sight of his wife sleeping with another man.
Blinded by anger, he grabbed a pair of scissors and was prepared to take his revenge against his
spouse when the police burst into his home and said, "you [are] under arrest for the future murder of Sarah Marks, that was to take place today." How did the police know to show up? All because of a clairvoyant who envisioned this crime taking place before it ever happened (qtd. in Mayer-Schönberger and Cukier). Now, this story is not actually real (it comes from the movie Minority Report), and while it is probably safe to say police will never base arrests on clairvoyants, substitute big data for the psychics and suddenly that anecdote becomes shockingly possible. In fact, it could be a reality much sooner than we may think. Faced with this imminent problem, we have to ask: are the benefits of predictive policing worth the threats to civil liberties?
To answer this question, we must first know what exactly predictive policing is, which in
turn requires a knowledge of the larger context of big data. Basically, big data is a term used to describe "the ability of society to harness information in novel ways to produce useful insights," as big data experts Viktor Mayer-Schönberger and Kenneth Cukier put it. In other words, big data is the use of algorithms to analyze huge amounts of information to find trends otherwise impossible to see. Its applications are as far-reaching as the mind can imagine, from medicine to astronomy to economics, but most relevantly, big data is increasingly being used in the field of policing. In this case, information on previous crimes in an area is collected and input into software, and the software predicts where future crimes will occur; hence, predictive policing. More specifically, according to the website of PredPol, the predominant predictive policing software, historical data is combined with information on day-to-day crimes as they come in, and the software predicts where crimes are most likely to happen over the next few hours.
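
PredPol does not publish its exact model, but the general mechanism its website describes (weighting recent crime reports by place and time) can be sketched in a few lines of Python. Everything below is an assumption for illustration: the grid size, the decay rate, and the sample coordinates are invented, and this is not PredPol's actual algorithm.

    # A minimal, hypothetical sketch of grid-based hot-spot prediction.
    # Assumptions: a fixed grid, exponential decay of old reports. Not PredPol.
    from collections import defaultdict

    CELL = 0.005   # assumed grid cell size in degrees (roughly 500 meters)
    DECAY = 0.9    # assumed weight: a report from d days ago counts DECAY**d

    def to_cell(lat, lon):
        # Bin a coordinate into a grid cell, as hot-spot maps typically do.
        return (round(lat / CELL), round(lon / CELL))

    def hot_spots(reports, today, top_n=3):
        # reports: list of (lat, lon, day) tuples of past crime reports.
        scores = defaultdict(float)
        for lat, lon, day in reports:
            scores[to_cell(lat, lon)] += DECAY ** (today - day)
        # The highest-scoring cells are where patrols would be sent next.
        return sorted(scores, key=scores.get, reverse=True)[:top_n]

    # Invented sample data: three recent reports cluster in one area.
    reports = [(35.2270, -80.8430, 9), (35.2271, -80.8431, 8),
               (35.2272, -80.8430, 9), (35.3000, -80.7000, 1)]
    print(hot_spots(reports, today=10))  # the clustered cell ranks first

Under these made-up inputs, the cell containing the three recent, clustered reports scores far above the cell with one old report, which is the essence of the prediction the software delivers.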
On the face of it, predictive policing sounds like a powerful tool in making policing more
efficient, and in some ways it is. For one, crime analysts can spend more time gathering intelligence instead of predicting "hot spots," or places where crimes are more likely to occur. This is especially useful because predictive policing software can determine hot spots immeasurably faster than people can and with far greater accuracy. Furthermore, PredPol claims drops in crime as high as 30% in various cities after its implementation. The theory is that police departments can send their officers to hot spots when they have extra time, and their
presence will prevent crimes that would otherwise have happened.
One of the other hopes with predictive policing is that it will help reduce bias in policing
and help heal communities divided by racial profiling. The general idea, as Santa Cruz Police
Department crime analyst Zach Friend reports, is that by relying solely on data and removing some of the human element from decisions about where to patrol, racial prejudice can be eliminated. Since computer
programs cannot be biased, the predictive policing software must not be biased, right? According
to Science writer Mara Hvistendahl, however, this is far from true. She fears that predictive
policing is a scientific curtain behind which racial prejudice will lie. At first it seems nigh on impossible for algorithms to perpetuate racial bias, but it makes sense once one realizes that all of the crime statistics a predictive policing program bases its predictions on come from a police force that is itself biased. The software then reflects a bias towards areas where more crimes are reported (likely skewed towards areas with many minorities). A Chicago Police Department
(CPD) program used a complex algorithm to provide officers with a list of likely offenders and
victims, usually people of color, and officers were basically told to keep an eye on those people.
While crime was not reduced as a result of this and supposedly at-risk victims were no more
likely to be victims than anyone else, people on the victim list were more likely to be arrested
than average citizens. A large part of the problem is that the list was generated from data skewed to show more minorities as criminals and victims. The problem is compounded when officers, who carry their own prejudices, are told to watch those people, with the whole arrangement justified by predictive policing. Thus, we are forced to conclude that predictive policing does not keep police officers unbiased so much as provide a convenient cover for systemic prejudices (Hvistendahl).
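
The feedback loop Hvistendahl worries about can be made concrete with a small thought experiment in code. The numbers below are entirely invented: two neighborhoods have the same true crime rate, but one starts out over-represented in the records, and patrols (which make crimes more likely to be recorded) are always sent wherever the records point.

    # Hypothetical simulation of the bias feedback loop. All rates are invented.
    import random

    random.seed(42)
    TRUE_RATE = 0.3           # identical underlying crime rate in both areas
    DETECT_PATROLLED = 0.9    # crimes in the patrolled area usually get recorded
    DETECT_UNPATROLLED = 0.3  # elsewhere, most crimes never enter the data

    reports = {"A": 10, "B": 5}  # biased starting data: A is over-reported

    for day in range(200):
        # The "software": patrol whichever area has the most recorded crime.
        patrolled = max(reports, key=reports.get)
        for hood in ("A", "B"):
            if random.random() < TRUE_RATE:  # a crime actually occurs
                seen = DETECT_PATROLLED if hood == patrolled else DETECT_UNPATROLLED
                if random.random() < seen:
                    reports[hood] += 1  # only recorded crimes feed the predictions

    print(reports)  # A's recorded lead keeps growing, so A stays patrolled

Under these made-up numbers, neighborhood A ends up with roughly three times the recorded crime of neighborhood B despite identical behavior: the data look objective, but they reproduce the initial skew.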
Another equally tricky issue with predictive policing arises when considering the concept
of reasonable suspicion. Reasonable suspicion, as NPR law enforcement correspondent Martin Kaste reports, is the legal standard an officer must meet before stopping somebody on the street. In other words, if a cop sees someone making furtive movements, he has grounds to stop the person, whereas he cannot stop someone just because he has a bad feeling about him. Why is this a problem? The issue arises when a police officer is sitting in his squad car, having driven to a hot spot the computer has marked for a likelihood of car theft. As he sits there, he sees a man go up to a car and struggle for a moment before getting inside. It could be car theft. But does that computer-drawn hot spot provide the reasonable suspicion necessary to stop him? Law professor Andrew Ferguson says in the NPR article that police departments have told officers not to use predictive policing as reasonable suspicion, but it is likely only a matter of time before hot spots are used as justification. There are no laws dictating whether predictive policing can be used to establish reasonable suspicion, so it is worth considering the civil rights we would be giving up by letting
a computer determine if we are suspicious or not. Really, this situation is not very different from
someone getting stopped because a cop thinks the neighborhood is suspect. It is profiling, just done with the aid of a computer program, and it demonstrates an unjust disregard for civil
liberties.
An even scarier thought towards the future goes back to the opening anecdote from
Minority Report. It may have sounded like science fiction at the time, but we could be quite close to it becoming a reality. Viktor Mayer-Schönberger and Kenneth Cukier report that most state parole boards already use big data to determine the likelihood that a prisoner will commit a crime if
released. It goes without saying that given the imperfections of any prediction, this could lead to
a lot of faulty decisions. In other words, no matter how good the prediction algorithms are, there will always be some prisoners denied parole who would never have committed another crime, and others set free who will commit one. This is only the beginning of the path, however.
Police departments are already seeking to go beyond the current state of predictive policing to
where algorithms can predict exact locations of crimes and who will commit them. Imagine how
perfect that would be, knowing who was going to commit a crime before they did and being able to show
up at the scene and intervene before anyone got hurt. That all works fine until society wants to
punish the would-be criminal for his would-be actions. No action could be more misguided.
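
How large can those faulty decisions be? A back-of-the-envelope calculation with invented numbers shows the scale of the parole problem raised above. Suppose a board screens 10,000 prisoners, only 10% of whom would actually reoffend, using an algorithm that is 90% accurate for both groups; these figures are assumptions for illustration, not data from Mayer-Schönberger and Cukier.

    # Hypothetical base-rate arithmetic; all numbers invented for illustration.
    prisoners = 10_000
    base_rate = 0.10   # assumed share who would actually reoffend
    accuracy = 0.90    # assumed accuracy for both groups

    would_reoffend = int(prisoners * base_rate)           # 1,000 future offenders
    would_not = prisoners - would_reoffend                # 9,000 who never would

    flagged_wrongly = int(would_not * (1 - accuracy))     # 900 harmless people flagged
    freed_wrongly = int(would_reoffend * (1 - accuracy))  # 100 offenders cleared

    print(flagged_wrongly, freed_wrongly)  # 900 of the 1,800 flagged are innocent

Even with this generously high accuracy, half of the people the algorithm flags would never have committed a crime, which is precisely the kind of faulty decision at issue.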
While assuredly society just wants to hold people accountable and prevent them
from committing crimes in the future, locking people up on the basis of crimes they have not yet
committed is a grave mistake. For starters, our entire system of justice is based on the idea that
one is innocent until it can be proven beyond a reasonable doubt that he committed the action of which he is accused. If the act was never performed, there is obviously no way to prove that he did it, because it never happened. Then, there is the fact that, as mentioned earlier, predictions
are never perfect (if they were, then there would be no choice because everything would occur as
predicted), and therefore, no matter how good a predictive policing program is, there will be
people jailed based on faulty predictions. On top of this, linking people to actions they have not
yet committed not only comes dangerously close to thought crime reminiscent of Big Brother,
but it also denies people the freedom of choice that is fundamental to the human condition
(Mayer-Schönberger and Cukier). People have all manner of thoughts of wrongdoing,
from fleeting curiosities to serious considerations. What sets them apart from criminals is that
they choose not to act on those thoughts. We, as humans, have the power to decide what thoughts
to make reality, and holding people guilty for crimes they have not yet chosen to commit is a
violation of that very principle.
Where does this leave us on predictive policing? It may reduce crime rates by a fair bit,
but embracing predictive policing in full force comes at the price of perpetuating racial bias,
opening the gateway to baseless police stops, and stripping away the freedom of choice. It is a
tool that can be utilized to society's advantage, but only as a supplement to guide police on where to patrol. Beyond that, predictive policing threatens to undermine our civil liberties as
we know them today.

Works Cited
Friend, Zach. "Predictive Policing: Using Technology to Reduce Crime." FBI. 9 Apr. 2013.
Web. Accessed 15 Oct. 2016.
Hvistendahl, Mara. "Crime Forecasters." Science, vol. 353, no. 6307, 2016, pp. 1484-1487. Web.
Accessed 18 Oct. 2016.
Kaste, Martin. "Can Software That Predicts Crime Pass Constitutional Muster?" NPR. 26 July
2013. Web. Accessed 15 Oct. 2016.
Mayer-Schönberger, Viktor, and Kenneth Cukier. Big Data: A Revolution That Will Transform
How We Live, Work and Think. London: John Murray, 2013, pp. 157-163. Print.
"Predictive Policing Software." PredPol, 2015. Web. Accessed 19 Oct. 2016.
