By: Jennifer Granick
Wired
A team of neuroscientists announced a scientific breakthrough last week in the use of brain scans to discover what's on someone's mind.
Researchers from the Max Planck Institute for Human Cognitive and Brain Sciences, along with scientists from London and Tokyo, asked subjects to secretly decide in advance whether to add or subtract two numbers they would later be shown. Using computer algorithms and functional magnetic resonance imaging, or fMRI, the scientists were able to determine with 70 percent accuracy what the participants' intentions were, even before they were shown the numbers.
The study used "multivariate pattern recognition" to identify oxygen flow in the brain that occurs in association with specific thoughts. The researchers trained a computer to recognize these flow patterns and to extrapolate from what it had learned to accurately read intentions.
The finding raises issues about the application of such tools for screening suspected terrorists -- as well as for predicting future dangerousness more generally. Are we closer than ever to the crime-prediction technology of "Minority Report"?
As I've argued in this space before, the popular press tends to over-dramatize scientific advances in mind reading. fMRI results have to account for heart rate, respiration, motion and a number of other factors that can all introduce variance into the signal. Also, individual brains differ, so scientists need to study a subject's patterns before they can train a computer to identify those patterns or make predictions.
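To make that concrete: one standard way fMRI analyses deal with such nuisance signals is to regress measured confound traces out of each voxel's time series and work with the residual. A toy sketch of that step follows, with synthetic data and hypothetical confound traces standing in for real motion and physiological recordings.

```python
# Toy version of nuisance regression: fit measured confound traces
# (motion, cardiac, respiratory) to a voxel's time series by least
# squares, then keep only the residual. All data below is synthetic.
import numpy as np

rng = np.random.default_rng(1)
T = 240  # hypothetical number of scan volumes

confounds = rng.normal(size=(T, 3))  # stand-ins for three confound traces
voxel = confounds @ np.array([0.8, -0.5, 0.3]) + rng.normal(size=T)

design = np.column_stack([np.ones(T), confounds])      # intercept + confounds
beta, *_ = np.linalg.lstsq(design, voxel, rcond=None)
cleaned = voxel - design @ beta                        # residual = "cleaned" signal
print(f"variance before: {voxel.var():.2f}, after: {cleaned.var():.2f}")
```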
While the details of this particular study are not yet published, the subjects had only two options -- adding or subtracting the numbers -- which means the computer already had a 50/50 chance of guessing correctly even without fMRI readings. The researchers indisputably made physiological findings that are significant for future experiments, but we're still a long way from mind reading.
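For a sense of scale: whether 70 percent on a binary task is meaningfully above that 50 percent floor depends on the number of trials, a figure the unpublished study doesn't give us. A back-of-the-envelope binomial tail calculation, using a purely hypothetical 100 trials:

```python
# How surprising is 70% accuracy on a binary task where guessing gives
# 50%? That depends entirely on n. The real trial count is unpublished,
# so n = 100 here is purely a hypothetical stand-in.
from math import comb

n, k = 100, 70  # hypothetical trials, correct predictions

# One-sided binomial tail: P(at least k correct by pure guessing)
p_value = sum(comb(n, i) for i in range(k, n + 1)) * 0.5 ** n
print(f"P(>= {k}/{n} correct by chance) = {p_value:.1e}")
```

At that assumed trial count, 70 percent correct would be vanishingly unlikely under pure guessing; with far fewer trials, the same percentage would be much less impressive.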
Still, the more we learn about how the brain operates, the more predictable human beings seem to become. In the Dec. 19, 2006, issue of The Economist, an article questioned the scientific validity of the notion of free will: Individuals with particular congenital genetic characteristics are predisposed, if not predestined, to violence.
Studies have shown that genes and organic factors like frontal lobe impairments, low serotonin levels and dopamine-receptor abnormalities are highly correlated with criminal behavior. Studies of twins show that heredity is a major factor in criminal conduct. While no one gene may make you a criminal, a mixture of biological factors, exacerbated by environmental conditions, may well do so.
Looking at scientific advances like these, legal scholars are beginning to question the foundational principles of our criminal justice system.
For example, University of Florida law professor Christopher Slobogin, who is visiting at Stanford this year, has set forth a compelling case for putting prevention before retribution in criminal justice.
Two weeks ago, Slobogin gave a talk based on his book, "Minding Justice." He pointed to the studies showing that our behavior is predetermined or strongly influenced by biology, and that if we can identify those biological factors, we can predict behavior. He argues that the justice system should provide treatment for potential wrongdoers based on predictions of dangerousness instead of settling for punishing them after the fact.
It's a tempting thought. If there is no such thing as free will, then a system that punishes transgressive behavior as a matter of moral condemnation does not make a lot of sense. It's compelling to contemplate a system that manages and reduces the risk of criminal behavior in the first place.
Yet, despite last week's announcement from the Max Planck Institute, neuroscience and bioscience are not at a point where we can reliably predict human behavior. To me, that's the most powerful objection to a preventative justice system -- if we aren't particularly good at
predicting future behavior, we risk criminalizing the innocent.
We aren't particularly good at rehabilitation, either, so even if we were sufficiently accurate in identifying future offenders, we wouldn't really know what to do with them.
Nor is society ready to deal with the ethical and practical problems posed by a system that classifies and categorizes people based on oxygen flow, genetics and environmental factors that are correlated as much with poverty as with future criminality.
"Minority Report," a short story by Philip K. Dick that became the 2002 Steven Spielberg blockbuster, portrays a society where investigators can learn in advance that someone will commit a murder, even before the offender himself knows what he will do. Gattaca, a
1997 film, tells of a society that discriminates against genetically "inferior" individuals.
Science fiction has long grappled with the question of how a society that can predict future behavior should act. The stories suggest that people are more varied, more free, than the computer models allow. They also suggest that a system based on predictions is subject not
only to innocent mistakes but also to malicious manipulation at a level far greater than our current punishment-oriented regime.
In time, neuroscience may produce reliable behavior predictions. But until then, we should take the lessons of science fiction to heart when deciding how to use new predictive techniques.