Other cops start restraining Marks, who screams, "I did not do anything!" The opening scene of the film Minority Report depicts a society in which predictions seem so accurate that the police arrest individuals for crimes before they are committed. People are imprisoned not for what they did, but for what they are foreseen to do, even though they never actually commit the crime. The movie attributes this prescient and preemptive law enforcement to the visions of three clairvoyants, not to data analysis. But the unsettling future Minority Report portrays is one that unchecked big-data analysis threatens to bring about, in which judgments of culpability are based on individualized predictions of future behavior.
Of course, big data is on track to bring countless benefits to society. It will be a cornerstone for improving everything from healthcare to education. We will count on it to address global challenges, be it climate change or poverty. And that is to say nothing about how business can tap big data, and the gains for our economies. The benefits are just as outsized as the datasets. Yet we need to be conscious of the dark side of big data too.
Already we see the seedlings of Minority Report-style predictions penalizing people. Parole boards in more than half of all U.S. states use predictions founded on data analysis as a factor in deciding whether to release somebody from prison or to keep him incarcerated. A growing number of places in the United States -- from precincts in Los Angeles to cities like Richmond, Virginia -- employ "predictive policing": using big-data analysis to select what streets, groups, and individuals to subject to extra scrutiny, simply because an algorithm pointed to them as more likely to commit crime.
But it certainly won't stop there. These systems will seek to prevent crimes by predicting, eventually down to the level of individuals, who might commit them. This points toward using big data for a novel purpose: to prevent crime from happening.
A research project under the U.S. Department of Homeland Security called FAST (Future Attribute Screening Technology) tries to identify potential terrorists by monitoring individuals' vital signs, body language, and other physiological patterns. The idea is that surveilling people's behavior may detect their intent to do harm. In tests, the system was 70 percent accurate, according to the DHS. (What this means is unclear; were research subjects instructed to pretend to be terrorists to see if their "malintent" was spotted?) Though these systems seem embryonic, the point is that law enforcement takes them very seriously.
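The article is right that a bare "70 percent accurate" figure is hard to interpret. One reason is the base-rate problem: when the behavior being screened for is rare, even a seemingly accurate test produces mostly false alarms. A minimal sketch, with all numbers invented for illustration (the DHS has published no such figures):

```python
# Hedged illustration of the base-rate problem behind a "70 percent accurate"
# screening claim. Every number below is an assumption for the arithmetic,
# not a fact about FAST.

def screening_outcomes(population, base_rate, sensitivity, specificity):
    """Return (true_positives, false_positives) for a screening test
    applied to a whole population."""
    actual_positives = population * base_rate
    actual_negatives = population - actual_positives
    true_positives = actual_positives * sensitivity          # threats caught
    false_positives = actual_negatives * (1 - specificity)   # innocents flagged
    return true_positives, false_positives

# Suppose 10 people per million harbor "malintent" (assumed base rate),
# and the system is 70 percent accurate in both directions.
tp, fp = screening_outcomes(1_000_000, 10 / 1_000_000, 0.70, 0.70)
print(f"true positives: {tp:.0f}, false positives: {fp:.0f}")
# Under these assumptions: 7 genuine threats caught, ~300,000 innocents flagged.
```

Under these invented assumptions, the system would flag roughly 300,000 innocent people for every 7 genuine threats, which is why accuracy claims without a base rate tell us so little.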
Stopping a crime from happening sounds like an enticing prospect. Isn't preventing infractions before they take place far better than penalizing the perpetrators afterwards? Wouldn't forestalling crimes benefit not just those who might have been victimized by them, but society as a whole?
But it's a perilous path to take. If through big data we predict who may commit a future crime, we may not be content with simply preventing the crime from happening; we are likely to want to punish the probable perpetrator as well. That is only logical. If we just step in and intervene to stop the illicit act from taking place, the putative perpetrator may try again with impunity. In contrast, by using big data to hold him responsible for his (future) acts, we may deter him and others.
Today's forecasts of likely behavior -- found in things like insurance premiums or credit scores -- usually rely on a handful of factors that are based on a mental model of the issue at hand (that is, previous health problems or loan repayment history). Basically, it's profiling -- deciding how to treat individuals based on a characteristic they share with a certain group. With big data we hope to identify specific individuals rather than groups; this liberates us from profiling's shortcoming of making every predicted suspect a case of guilt by association.
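Mechanically, a "handful of factors" forecast like a credit score is just a weighted sum of a few observed characteristics. A minimal sketch, with factor names and weights invented purely for illustration (no real scoring model works with exactly these inputs):

```python
# Hedged sketch of factor-based profiling, in the style the article describes
# for credit scores. Factor names and weights are invented examples.

def risk_score(factors, weights, baseline=0.0):
    """Classic profiling: a weighted sum of a handful of observed factors."""
    return baseline + sum(weights[name] * value for name, value in factors.items())

weights = {
    "missed_payments": 15.0,   # each missed payment raises predicted risk
    "years_of_history": -2.0,  # a longer track record lowers it
    "utilization_pct": 0.5,    # heavier credit use raises it
}

applicant = {"missed_payments": 2, "years_of_history": 8, "utilization_pct": 60}
print(risk_score(applicant, weights))  # 2*15 - 8*2 + 60*0.5 = 44.0
```

The point of the sketch is the article's point: such a score attaches the average behavior of a group ("people with two missed payments") to one individual, which is exactly the guilt-by-association shortcoming that big-data advocates claim finer-grained prediction will escape.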
The promise of big data is that we do what we've been doing all along -- profiling -- but make it better, less discriminatory, and more individualized. That sounds acceptable if the aim is simply to prevent unwanted actions. But it becomes very dangerous if we use big-data predictions to decide whether somebody is culpable and ought to be punished for behavior that has not yet happened.
The very idea of penalizing based on propensities is nauseating. To accuse a person of some possible future behavior is to undermine the very foundation of justice: that one must have done something before we can hold him accountable for it. After all, thinking bad things is not illegal, doing them is. It also negates the idea of the presumption of innocence, the principle upon which our legal system, as well as our sense of fairness, is based. And if we hold people responsible for predicted future acts, ones they may never commit, we also deny that humans have a capacity for moral choice.
The important point here is not simply one of policing. The danger is much broader than criminal justice; it covers all areas of society, all instances of human judgment in which big-data predictions are used to decide whether people are culpable for future acts or not. Those include everything from a company's decision to dismiss an employee, to a doctor denying a patient surgery, to a spouse filing for divorce.
Perhaps with such a system society would be safer or more efficient, but an essential part of what makes us human -- our ability to choose the actions we take and be held accountable for them -- would be destroyed. Big data would have become a tool to collectivize human choice and abandon free will in our society. And even if a person isn't thrown into a chic, nightclub-like standing prison as in the film Minority Report, the effect may look like a penalty nonetheless. A teenager visited by a social worker for having the propensity to shoplift will feel stigmatized in the eyes of others -- and his own.
In the big-data era we will have to expand our understanding of justice, and require that it include safeguards for human agency as much as we currently protect procedural fairness. Without such safeguards the very idea of justice may be utterly undermined.
By guaranteeing human agency, we ensure that government judgments of our behavior are based on real actions, not simply on big-data analysis. Thus government must only hold us responsible for our past actions, not for statistical predictions of future ones. And when the state judges previous actions, it should be prevented from relying solely on big data. And companies should open their big-data activities to scrutiny when those activities can lead to substantial harm to many.
A fundamental pillar of big-data governance must be a guarantee that we will continue to judge people by considering their personal responsibility and their actual behavior, not by "objectively" crunching data to determine whether they're likely wrongdoers. Only that way will we treat them as human beings: as people who have the freedom to choose their actions and the right to be judged by them.
This article was excerpted with permission from Big Data: A Revolution That Will Transform How We Live, Work, and Think (Houghton Mifflin Harcourt, 2013). Viktor Mayer-Schönberger is a professor of Internet governance and regulation at the Oxford Internet Institute in the UK. Kenneth Cukier is the data editor of The Economist.
Even with this people will still cry racism.
What? Please, go back to the jesus forums and leave the science to the free thinkers. Honestly, that's what you took from this article? A way to justify your bigotry? Not content with keeping the exclusionary rhetoric contained within the walls of your delusion chambers (church) you insist on spreading your caveman brain paranoia on Popsci. Thanks but no thanks.
Wrap this article with the article of NSA.
So yes, with the powers delegated to the NSA since 9/11, computers are monitoring all conversations in the world in real time and acting on them!
The link above is of an old article, but also google more recent articles about NSA as well.
Big data, big abuse. Doesn't anyone see how slippery this slope is? We can't trust eyewitnesses, and now people want to trust a heuristic. Whatever happened to the Constitution and the right to face your accuser (who incidentally is supposed to be a human being)? The current breed of politicians are hell-bent on creating hell right here on earth; this is just another step toward that goal. They are suspending kids from school because they use a bagle as a toy gun. I suppose they should just lock them up for life, because "bagle guns" mean you are definitely going to be a mass murderer; big blue said so.
I personally think federal laws should be passed to forbid the use of GPS or computer tracking from being used in trials. A computer should NEVER be allowed to testify.
"free thinkers" Really? They aren't good at science (nor free thinking). "Free thinkers" are mostly just a bunch of Darwin thumping fundamentalists who don't really understand what religion is (because they claim not to be religious) and don't have a good grasp on how the scientific method works.
But now to comment on the article. No! If anything just send some police in to deter the crime. But if someone hasn't committed a crime don't punish them.
How can you punish someone who "might" do something in the future? Silly. So what's next, involuntary abortions because someone's baby may grow up to be a criminal? Technology is good, but it will be our demise.
"Should We Use Big Data To Punish Crimes Before They're Committed?"
Punish crimes all you want. But you just can't punish people who are not yet criminals, and if "you" did that to me, I'm pretty sure I would feel strongly inclined to commit that crime after I've already paid for it.
What a ridiculous article.
@davek01521: I believe the word you're looking for is BAGEL.
Punishment is a RESPONSE to misbehavior. If you know somebody is likely to commit a crime then you can use proactive measures. Get the person some counseling, remove their motivation for the crime, increase the targets security, and similar tactics.
But you'll never see that as the plot of a movie because it's boring to watch. Instead they'll write about the improbable response because it's far more interesting. If it were probable (and therefore common) it wouldn't be interesting.
You people should really read the fine print of the powers of the government after 911 and the powers of the NSA.
You will really be shocked!
This will prevent countless acts of violence, theft, cheating, waste, and environmental infractions, and provide a safety umbrella for citizens who have invested their whole lives in spending their accumulated funds to retire in a peaceful, harmonious, post-Demolition Man (movie reference) type of lifestyle.
The people who HOLD the data can easily exempt themselves from civil-liberty persecution, and will not be held accountable for a glitch in their quantitative algorithms. The police departments will no longer have to patrol, only persecute. The judicial system will no longer matter. No longer will freedom of thought be a freedom. The Constitution will be only a memory. No one will ever be able to think about creating a bank-robbing movie without having to pay for the thought. Peer pressure will be a crime.
AND, key question: what happens if a crime is committed against a family member and is never addressed, but because you thought about retaliating, you get the sentence?
(Where do you start to implement this: Wall Street, banks, the military, elected officials, everyday civilians, or all at once?) The whole system that defines laws will be restructured. It will likely start from the bottom up and stop midway, exempting those who are immune due to fame, roles, idols, godly figures (all possible but unknown). We don't want to pass judgments; that's for the database to decide, right?
Ethics: the 4th Amendment, no longer. KEEP IN MIND the habits and algorithms already exist on what you buy, who your friends are, when you leave and take your phone with you, etc.
Blessings: Some can say this would be a blessing from God! To think no one will ever have to worry about going outside, or about getting into a wreck, or fear for the safety of their life, because it is truly safe. But people will now have to fear that ignorance of the law will trip them up, as new laws restrict creative thinking and human nature and their ability to make decisions, and they will have to keep aware of the changes daily as the laws are made.
These are just my thoughts, and they don't rest on a firm, definitive understanding of how this will work, other than as a new way to possibly sell you a new phone, car, toy, or partner, or make you buy a carbonated product. But what I've stated has already been addressed by many of the movies (yes, some movies are documentaries) representing the idea being presented.
@starfire42: Well said.
Yes, this is a sensationalist title wrapping an interesting topic. Of course we should be cautious of the power that we give to algorithms, but think of the power that we give to individual witnesses and experts now. The article buried the lede. Humans use algorithms to make judgments about people all the time. They are combinations of previous experience, prejudices, and weird perceptions that are opaque and nearly impossible to examine or question. A non-human algorithm that can be examined, made public, and tweaked is far more useful. The social sciences, gathering real data about occurrences and tendencies in society, have been the only thing ever to combat our miserably biased internal algorithms. Sure, if we let due process fade into the night because we trust machines so much, then we're in trouble, but the problem isn't the technology, it's the users. We've already given ourselves the power to punish and even kill with far less evidence than these algorithms would give us. Why not have another tool?
Only human brains using their outmoded, entirely statistic-less and biased equations would come up with a cause-effect relationship like your [bagel] gun scenario. The application of careful science is the only thing that can combat this idiocy. Yes, a politician or other authority figure could make claims based on bad science, and claim that the mighty algorithm supported those claims, but that happens now using any manner of justification. At least we'd be able to refute the analysis if real analysis had been done.
There are people at Google who are ALREADY abusing their power by looking into private emails and keeping track of idle hearsay.
They blacklist based only on idle speculation and rumor...not facts.
Google, in the wrong hands, is the greatest threat to personal liberty our nation has seen.
Unchecked, they can do a LOT of damage to a person's ability to get a job and make even an honest living.
Go-Ogle: We don't care if it is true, we just care that it can do you harm.
All a person has to do to blacklist someone is boost the
Go-Ogle page rank of the slander they want to appear FIRST in a Go-Ogle search of a victim's name.
Then, when an employer does a search of the victim's name, the slander pops up first...ensuring that the employer will see it before anything else.
Do a Go-Ogle search of Lindsey Stone's name, and you will see what I am talking about. She will never be able to escape her ONE thoughtless act as long as Go-Ogle continues to be indifferent to her fate.
Go-Ogle: The big advertising bucks come first...your puny reputation? Not even on our radar.
Those who propose ludicrous ideas such as this, are either ignorant or criminals. Using "average" info to draw a conclusion or prediction on a personal level is wrong to say the least. This approach is used by corrupt systems such as McCarthyism, Fascism, and dictatorship governments!
I would say that punishing people on the probability that they will commit a crime opens the legal system up to a lot of court cases against itself. This sounds like it would be a great boon for lawyers.
More realistic would be sending a message to an individual that their behavior has made them conspicuous and that they need to explain themselves…like a traffic stop for erratic driving.
A couple of years ago I read about a camera system in (I think) England where the police officer watching the camera was able to communicate through an audio system to those observed that their behavior is unacceptable…”The cup you just had in your hand belongs in the trash can and not on the ground…yes…you in the red jacket…pick the paper cup up that you threw on the ground and put it in the trash can.”
This is disturbing, especially when one considers our current administration's overreaching and blatant disregard for the Constitution. These kinds of "laws" or methods only blossom in an environment of tyranny, such as the track U.S. is currently on, moving into a Socialist Democracy. It is exactly the same as trying to institute more predictive gun laws as is the current fashion. You cannot take away rights because of what might happen or you end up in a society that is anything but free.
Reasonable thought is the basis of prediction, in computers it is math based algorithms which are created by human beings. Instead of working within a human based system would this be better as an autonomous robotic system that bears the capability of stopping crime mid-action (Instead of the precrime possibilities). Furthermore in such a long term mechanical interface does it become necessary to develop a standard "Three Laws"? What then of the "Zeroth Law" possibility in a machine system that will physically outlast the human with which it interacts.
With the use of this machine and its continued enhancement after deployment is there any way to morally and socially avoid an evolution to an Asimovic world?
The Liberals do this each time they blame the 99.9999% population that doesn't hurt anyone with their privately held firearms by trying to pass restrictive firearms laws. This is nothing new.
How thrilling, thought crime is just around the corner. I also love the irony that those who scream the loudest that Obama and his socialist friends are taking away their freedom are the same ones that are screaming the loudest to bring on the punishments for Thought Crime and the Thought Police to make them safer. These people deserve neither freedom nor security.
I also hear the chants of 'Witch, Witch, burn her' because an old spinster lady had a cat. How do you know if this lady is a witch, well we can just tell, burn her. If she comes out looking like burnt meat, then she was a witch, if she comes out unscathed, she is a witch, and we'll then hang her.
Crimes need to be physically committed in order for punishment to be necessary. We don't need a pre-crime division of the police, we don't need the Thought Police. Big data has no role in law enforcement.
I guess it pretty much goes hand in hand with O's Drone Program. If the data center says that an American sitting in a restaurant eating a hamburger is a possible terrorist, then one of O's drones will wipe him out - and everyone else in the vicinity including that restaurant's business. Then, as in "Equilibrium," every American keeps taking drugs, so they remain as emotionless and inhuman as possible about the whole idea of living in a berserk, homicidal police state.
Auroria posted the following 03/07/13 at 8:08 am:
"POPSCI bans and blocks people from their words. Yes, I can see the USA government doing the same thing in an automated computer program. The government is just reflected what is common from society now."
When you provide the forum and the bandwidth, Auroria, you will do the same.
In most of the US, individuals are routinely penalized before being convicted of any crime. Take a look at how governments use asset forfeiture to seize a person's private property before that person has been convicted of any crime. This essentially means a person bears the burden of proving their innocence in order to recover their property from the government.
And good luck with that.
get with the program popsci
where's the thumbs up/down icon?
This article has all the credibility of the movie. Zip. Nada. None. It's total drivel.
Note to PopSci: This is the kind of nonsense writing that caused me to take a pass on renewing my subscription to the print mag and that will keep me from subscribing to the digital edition. I won't pay to read this tripe. It's bad enough when it's free.
I once knew a Yugoslav who thought America was brimming with criminals, and that he might get jumped at any minute. He was boasting of the safety of his streets at home, and I explained our principle of innocent until proven guilty, which risks releasing a guilty person rather than imprisoning an innocent one.
He replied that he would be willing to go to prison for something he didn't do, rather than know the country was unsafe because criminals were set free if they could not be proven guilty.
He may have believed that at the moment, but would certainly change his mind if it would really happen.
Sounds great .. start with politicians!! Really if it can be done, why not just prevent the crime before it happens?
A crime must be committed before a law is broken. There is no such thing as crime prevention. This is a simple kind of fascist idea isn't it? Punishment may not be implemented before there is a crime or itself becomes the crime. What a DUMB IDEA....
This will be the end of small businesses, individual patents, entrepreneurship, open markets and the opportunity for a person to create a unique product.
The conglomerates will own the data before a person could even begin to get it to market.
Open markets are now CLOSED.
How about starting with ALL past (living) and current politicians, including the president, and I mean local: villages, towns, cities, counties, states, national. Include jury members, court justices (judges), members of the U.S. Supreme Court, company presidents, CEOs, board members, stockholders, and law enforcement (all branches). Everyone, no one to be exempted, including me. Then who would be left to judge who has the honorability or dishonorability to make final decisions in the future of anything for the good or best of the U.S.A.?