There are tests that have right answers, which are returned with a number on top in a red circle, and there are tests with open-ended questions, which provide insight into the test taker’s mind.
By NOAM COHEN
July 29, 2009
The Rorschach test, a series of 10 inkblot plates created by the Swiss psychiatrist Hermann Rorschach for his book “Psychodiagnostik,” published in 1921, is clearly in the second category.
Yet in the last few months, the online encyclopedia Wikipedia has been engulfed in a furious debate involving psychologists who are angry that the 10 original Rorschach plates are reproduced online, along with common responses for each. For them, the Wikipedia page is the equivalent of posting an answer sheet to next year’s SAT.
They are pitted against the overwhelming majority of Wikipedia’s users, who share the site’s “free culture” ethos, which opposes the suppression of information that it is legal to publish. (Because the Rorschach plates were created nearly 90 years ago, they have lost their copyright protection in the United States.)
“The only winners seem to be those for whom this issue has become personal, and who see this as a game in which victory means having their way,” one Wikipedia poster named Faustian wrote on Monday, adding, “Just don’t pretend you are doing anything other than harming scientific research.”
What had been a simmering dispute over the reproduction of a single plate reached new heights in June when James Heilman, an emergency-room doctor from Moose Jaw, Saskatchewan, posted images of all 10 plates to the bottom of the article about the test, along with what research had found to be the most popular responses for each.
“I just wanted to raise the bar — whether one should keep a single image on Wikipedia seemed absurd to me, so I put all 10 up,” Dr. Heilman said in an interview. “The debate has exploded from there.”
Psychologists have registered with Wikipedia to argue that the site is jeopardizing one of the oldest continuously used psychological assessment tests.
While the plates have appeared on other Web sites, it was not until they showed up on the popular Wikipedia site that psychologists became concerned.
“The more test materials are promulgated widely, the more possibility there is to game it,” said Bruce L. Smith, a psychologist and president of the International Society of the Rorschach and Projective Methods, who has posted under the user name SPAdoc. He quickly added that he did not mean that a coached subject could fool the person giving the test into making the wrong diagnosis, but rather “render the results meaningless.”
To psychologists, to render the Rorschach test meaningless would be a particularly painful development because there has been so much research conducted — tens of thousands of papers, by Dr. Smith’s estimate — to try to link a patient’s responses to certain psychological conditions. Yes, new inkblots could be used, these advocates concede, but those blots would not have had the research — “the normative data,” in the language of researchers — that allows the answers to be put into a larger context.
And, more fundamentally, the psychologists object whenever diagnostic tools fall into the hands of amateurs who haven’t been trained to administer them. “Our ethics code that governs the behavior of psychologists talks about maintaining test security,” Steve J. Breckler, the executive director for science at the American Psychological Association, said in an interview. “We wouldn’t be in favor of putting the plates out where anyone can get hold of them.”
Alvin G. Burstein, a professor emeritus of psychology at the University of Tennessee, Knoxville, wrote in an e-mail message that his preference was to have the images removed but that he did not think they would harm the psychological process.
“The process of making sense of one’s experience,” he wrote, “is gratifying. To take Rorschach’s test is to make sense of ambiguity in the context of someone who is interested in how you do that.”
Trudi Finger, a spokeswoman for Hogrefe & Huber Publishing, the German company that bought an early publisher of Hermann Rorschach’s book, said in an e-mail message last week: “We are assessing legal steps against Wikimedia,” referring to the foundation that runs the Wikipedia sites.
“It is therefore unbelievably reckless and even cynical of Wikipedia,” she said, “to on one hand point out the concerns and dangers voiced by recognized scientists and important professional associations and on the other hand — in the same article — publish the test material along with supposedly ‘expected responses.’ ”
Mike Godwin, the general counsel at Wikimedia, hardly sounded concerned, saying he “had to laugh a bit” at the legal and ethical arguments made in the statement from Hogrefe.
Hogrefe licenses a number of companies in the United States to sell the plates along with interpretative material. One such distributor, Western Psychological Services, sells the plates themselves for $110 and a larger kit for $185. Dr. Heilman, the man who originally posted the material, compared removing the plates to the Chinese government’s attempt to control information about the Tiananmen massacre. That is, it is mainly a dispute about control, he said.
“Restricting information for theoretical concerns is not what we are here to do,” Dr. Heilman said, adding that he was not impressed by the predictions of harm from those who sought to keep the Rorschach plates secret. “Show me the evidence,” he said. “I don’t care what a group of experts says.”
To illustrate his point, Dr. Heilman used the Snellen eye chart, which begins with a big letter E and is readily available on the Wikipedia site.
“If someone had previous knowledge of the eye chart,” he said, “you can go to the car people, and you could recount the chart from memory. You could get into an accident. Should we take it down from Wikipedia?”
And, Dr. Heilman added, “My dad fooled the doctor that way.”
A Scientific Study of the Human Mind and the Understanding of Human Behavior through the analysis and research of Meta Psychology.
Wednesday, July 29, 2009
Monday, July 27, 2009
Schizophrenia and Bipolar Disorder Share Genetic Roots
Chromosomal Hotspot of Immunity/Gene Expression Regulation Implicated
A person holding a gene chip, also known as a DNA microarray.
A trio of genome-wide studies – collectively the largest to date – has pinpointed a vast array of genetic variation that cumulatively may account for at least one third of the genetic risk for schizophrenia. One of the studies traced schizophrenia and bipolar disorder, in part, to the same chromosomal neighborhoods.
"These new results recommend a fresh look at our diagnostic categories," said Thomas R. Insel, M.D., director of the National Institute of Mental Health (NIMH), part of the National Institutes of Health. "If some of the same genetic risks underlie schizophrenia and bipolar disorder, perhaps these disorders originate from some common vulnerability in brain development."
Three schizophrenia genetics research consortia, each funded in part by NIMH, report separately on their genome-wide association studies online July 1, 2009, in the journal Nature. However, the SGENE, International Schizophrenia (ISC) and Molecular Genetics of Schizophrenia (MGS) consortia shared their results, making possible meta-analyses of a combined sample totaling 8,014 cases and 19,090 controls.
All three studies implicate an area of Chromosome 6 (6p22.1), which is known to harbor genes involved in immunity and controlling how and when genes turn on and off. This hotspot of association might help to explain how environmental factors affect risk for schizophrenia. For example, there are hints of autoimmune involvement in schizophrenia, such as evidence that offspring of mothers with influenza while pregnant have a higher risk of developing the illness.
"Our study was unique in employing a new way of detecting the molecular signatures of genetic variations with very small effects on potential schizophrenia risk," explained Pamela Sklar, M.D., Ph.D., of Harvard University and the Stanley Center for Psychiatric Research, who co-led the ISC team with Harvard's Shaun Purcell, Ph.D.
"Individually, these common variants' effects do not all rise to statistical significance, but cumulatively they play a major role, accounting for at least one third – and probably much more – of disease risk," said Purcell.
Among sites showing the strongest associations with schizophrenia were a suspect area on Chromosome 22 and more than 450 variations in the suspect area on Chromosome 6. Statistical simulations confirmed that the findings could not have been accounted for by a handful of common gene variants with large effect or just rare variants. This involvement of many common gene variants suggests that schizophrenia in different people might ultimately be traceable to distinct disease processes, say the researchers.
"There was substantial overlap in the genetic risk for schizophrenia and bipolar disorder that was specific to mental disorders," added Sklar. "We saw no association between the suspect gene variants and half a dozen common non-psychiatric disorders."
Still, most of the genetic contribution to schizophrenia, which is estimated to be at least 70 percent heritable, remains unknown.
"Until this discovery, we could explain just a few percent of this contribution; now we have more than 30 percent accounted for," said Thomas Lehner, Ph.D., MPH, chief of NIMH's Genomics Research Branch. "The new findings tell us that many of these secrets have been hidden in complex neural networks, providing hints about where to look for the still elusive – and substantial – remaining genetic contribution."
The MGS consortium pinpointed an association between schizophrenia and genes in the Chromosome 6 region that code for cellular components that control when genes turn on and off. For example, one of the strongest associations was seen in the vicinity of genes for proteins called histones that slap a molecular clamp on a gene's turning on in response to the environment. Genetically rooted variation in the functioning of such regulatory mechanisms could help to explain the environmental component repeatedly implicated in schizophrenia risk.
The MGS study also found an association between schizophrenia and a genetic variation on Chromosome 1 (1p22.1) which has been implicated in multiple sclerosis, an autoimmune disorder.
"Our study results spotlight the importance not only of genes, but also the little-known DNA sequences between genes that control their expression," said Pablo Gejman, M.D., of the NorthShore University HealthSystem Research Institute, of Evanston, ILL, who led the MGS consortium team. "Advances in biotechnology, statistics, population genetics, and psychiatry, in combination with the ability to recruit large samples, made the new findings possible."
The SGENE consortium study pinpointed a site of variation in the suspect Chromosome 6 region that could implicate processes related to immunity and infection. It also found significant evidence of association with variation on Chromosomes 11 and 18 that could help account for the thinking and memory deficits of schizophrenia.
The new findings could eventually lead to multi-gene signatures or biomarkers for severe mental disorders. As more is learned about the implicated gene pathways, it may be possible to sort out what's shared by, or unique to, schizophrenia and bipolar disorder, the researchers say.
Schizophrenia/bipolar disorder genetic overlap
Schizophrenia and bipolar disorder share genetic roots that appear to be specific to serious mental disorders, and are not shared by non-psychiatric illnesses. Bars representing different study samples show that the same genetic variations that account for risk in both mental disorders account for virtually none of the risk for coronary artery disease (CAD), Crohn's disease (CD), hypertension (HT), rheumatoid arthritis (RA), or Type 1 (T1D) or Type 2 (T2D) diabetes.
Source: Psychiatric and Neurodevelopmental Genetics Unit, Center for Human Genetic Research, Harvard University.
References
Shi J, et al. Common variants on chromosome 6p22.1 are associated with schizophrenia. July 1, 2009, Nature
Stefansson H, et al. Common variants conferring risk of schizophrenia. July 1, 2009, Nature
Purcell SM, et al. Common polygenic variation contributes to risk of schizophrenia that overlaps with bipolar disorder. July 1, 2009, Nature
Why is it hard to "unlearn" an incorrect fact?
Cognitive psychologist Gordon H. Bower of Stanford University answers
By The Editors
Why is it that once you learn something incorrectly (say, 7 X 9 = 65), it seems you never can correct your recall?
—J. Kruger, Cherry Hill, N.J.
Cognitive psychologist Gordon H. Bower of Stanford University answers:
Identifying, correcting and averting our memory errors are part of a cognitive process called memory monitoring. Incorrect associations can be tough to change, but we can use techniques to retrain our brain.
When strong habits impede our ability to acquire a desired new habit or association, we experience a common phenomenon known as proactive interference. Wrong associations appear in common spelling errors such as “wierd” for “weird” and “neice” for “niece.” Persistent mistaken connections also can cause embarrassing errors, such as calling a man’s second wife by the name of his first. Interference is stronger the more previous wives you’ve had to deal with, and it is more difficult to overcome the stronger the habits are.
Accurate memory monitoring requires a well-functioning prefrontal cortex (PFC). Young children, who have an immature PFC, and stroke patients with extensive PFC damage make more errors as a result of memory-monitoring failures. They are more likely to confuse the source of information they recall, and they are more susceptible to accepting as true an event they only imagined.
You can overcome proactive interference by consistent (even silent) correction, especially when you space rehearsals over time. But it takes some conscious practice. We have to identify (or be told) when we have just made an error so that we can correct it immediately. Our inability to do so is typically the cause of the error’s persistence.
Building on the correct information can help you learn new associations to it: add something to change how you retrieve the item from your memory. You might replace your question “Name of John’s wife?” with “Name of John’s second wife?”; or use an elaboration that contains the accurate information, such as “We are weird” or “My niece is nice”; or convert 7 X 9 into 7 X (10 – 1) = 70 – 7 = 63. As you practice the elaborated association, the simpler direct association (7 X 9 = 63) eventually replaces the earlier one, which weakens without rehearsals. Labeling and rehearsing the wrong association (for example, saying to yourself, “7 X 9 is not 63”), however, are distinctly counterproductive.
Sunday, July 19, 2009
VS Ramachandran on your mind
Vilayanur Ramachandran explains what brain damage can reveal about the connection between cerebral tissue and the mind, using three startling delusions as examples.
Wednesday, July 15, 2009
A POW's 'Tears in the Darkness'
Ben Steele recounts surviving the Bataan Death March in World War II
"Men died like flies," says Steele, now 91
Steele's story is recounted in new book "Tears in the Darkness"
Steele forced to confront hatred through chance meeting after war
By John Blake
CNN -- Ben Steele hated the young man as soon as he saw him.
The man's almond-shaped eyes, dark hair and olive skin -- Steele had seen those Asian facial features before.
He saw that face when he watched Japanese soldiers behead sick men begging for water, run over stumbling prisoners with tanks and split his comrades' skulls with rifle butts.
"Men died like flies," Steele says. "I thought for a while I would never make it."
Steele, now 91, is one of the last survivors of the Bataan Death March.
During World War II, the Japanese army forced American and Filipino prisoners of war on a march so horrific that the Japanese commander was later executed for war crimes.
Steele returned home to Montana after the war to teach, but he still had something to learn.
When he saw a young Japanese-American student seated in his class one day, he felt both anger and anguish.
What, he wondered, do I do with all of the hate I've brought home with me?
'The worst war story' he ever heard
Steele's answer to that question can be found in the new book "Tears in the Darkness," a searing depiction of the Bataan Death March.
The book details how Steele found help through an unlikely source. But he would first have to survive one of the worst defeats in U.S. military history.
In December 1941, Japanese forces attacked an army of American and Filipino soldiers in the Philippine Islands and forced them to surrender. They captured 76,000 prisoners, double what they had expected.
The Japanese forced the POWs to march 66 miles under a tropical sun to a railway station for transport. They shot, bayoneted and beat to death prisoners who couldn't keep pace.
At least 7,000 soldiers died during the march.
More died later. The brutal conditions of the march contributed to the subsequent deaths of an estimated 25,000 Filipinos and 1,500 Americans in Japanese prison camps, says Michael Norman, a Vietnam veteran who wrote "Tears in the Darkness" with his wife, Elizabeth.
"It's the worst war story I've ever heard," Norman says. "What they [the Japanese] did was monstrous."
Prisoners were forced to bury others alive and work as slave laborers; some were executed for sport. One Japanese soldier, who later became a Buddhist priest, told the authors that he is still haunted by what he did on Bataan.
Some Filipinos who live today near the march's route say that they, too, cannot forget what happened, Elizabeth Norman says.
"They would tell us that when they lay awake at night, they thought they could still hear the trampling of the men's feet on the death march," she says.
Why Steele survived
The death march was filled with villains, but the authors also found a hero: Steele. The march is told through his eyes and drawings.
Steele was a cowboy from Montana who could ride a horse, rope cattle and shoot by the time he was 8 years old.
"I thought that if anybody gets out of here, I'm going to be one of them," says Steele, who was a 22-year-old Army Air Corps private when he was captured.
At times, though, Steele wondered whether he was being too optimistic. He was bayoneted, starved and beaten. He was constantly ill, and his weight fell to 112 pounds.
Steele found a way to preserve his mind even as his body wasted away: He drew. He started sketching pictures of what he saw during his captivity.
"I felt an obligation to show people what went on there," he says.
Steele was released after three years of captivity when World War II ended. He returned to Billings, Montana, where he became an art professor at a state college.
"I had a lot of anger when I got home," Steele says. "We were beaten for so long. I hated [the Japanese]."
Steele meets his 'nemesis'
Steele's hatred smoldered for 15 years. It threatened to spill out into the open in 1960, when he walked into his classroom on the first day of the semester and saw a Japanese-American student.
In "Tears in the Darkness," Steele says that his "heart hardened and filled with hate." But he was so anguished by what he was feeling, he returned to his office after class to think.
He told himself that the war was over; he wasn't a prisoner anymore, and he had to treat the Japanese-American student like anybody else, because he was an American, too.
Then he did something else. He invited the student to his office for a talk.
The student's name was Harry Koyama, and he, too, had been marked by the war. His family had been imprisoned at a "relocation camp" in Arizona during the war.
Steele also discovered that he and Koyama had something else in common: a passion for drawing Montana's rural life. By the end of the semester, Koyama was one of Steele's best students.
Steele says that talking to Koyama helped his hatred evaporate.
"We had a discussion and finally came to an understanding that we liked each other," he says.
Today, Steele and Koyama remain in touch.
"We're the best of friends," Koyama tells CNN from his Montana art studio. "We see each other regularly."
Koyama says he can't remember exactly what he and Steele talked about first, only that Steele had always treated him well. Steele did tell him later that their relationship helped him recover from the war, he says.
"I was just there," Koyama says. "I just happened to be there for him to use my presence as a way to overcome his dark time."
Koyama says he is still amazed by Steele's survival story.
"Just to be a part of his life is an honor," Koyama says.
Steele's voice is still strong and his mind sharp. He's been married to his wife, Shirley, for 57 years, and they have three children and six grandchildren.
Steele says Bataan taught him to treasure small pleasures, like a drink of cool water and a warm bed at night.
"I'm thankful that I have a plateful of food," he says. "I can remember when that plate was empty."
He still remembers tiny details from the death march as well. He constantly draws pictures of his friends and tormentors on Bataan. Their faces fill his sketchbooks.
Steele's hate may be gone, but the death march lingers.
"I think about it every day," he says. "It's in my mind, and I'll never get it out."
"Men died like flies," says Steele, now 91
Steele's story is recounted in new book "Tears in the Darkness"
Steele forced to confront hatred through chance meeting after war
By John Blake
CNN -- Ben Steele hated the young man as soon as he saw him.
The man's almond-shaped eyes, dark hair and olive skin -- Steele had seen those Asian facial features before.
He saw that face when he watched Japanese soldiers behead sick men begging for water, run over stumbling prisoners with tanks and split his comrades' skulls with rifle butts.
"Men died like flies," Steele says. "I thought for a while I would never make it."
Steele, now 91, is one of the last survivors of the Bataan Death March.
During World War II, the Japanese army forced American and Filipino prisoners of war on a march so horrific that the Japanese commander was later executed for war crimes.
Steele returned home to Montana after the war to teach, but he still had something to learn.
When he saw a young Japanese-American student seated in his class one day, he felt both anger and anguish.
What, he wondered, do I do with all of the hate I've brought home with me?
'The worst war story' he ever heard
Steele's answer to that question can be found in the new book "Tears in the Darkness," a searing depiction of the Bataan Death March.
The book details how Steele found help through an unlikely source. But he would first have to survive one of the worst defeats in U.S. military history.
In December 1941, Japanese forces attacked an army of American and Filipino soldiers in the Philippine Islands and forced them to surrender. They captured 76,000 prisoners, double what they had expected.
The Japanese forced the POWs to march 66 miles under a tropical sun to a railway station for transport. They shot, bayoneted and beat to death prisoners who couldn't keep pace.
At least 7,000 soldiers died during the march.
More died later. The brutal conditions of the march contributed to the subsequent deaths of an estimated 25,000 Filipinos and 1500 Americans in Japanese prison camps, says Michael Norman, a Vietnam veteran who wrote "Tears in the Darkness," with his wife, Elizabeth.
"It's the worst war story I've ever heard," Norman says. "What they [the Japanese] did was monstrous."
Prisoners were forced to bury others alive and work as slave laborers; some were executed for sport. One Japanese soldier, who later became a Buddhist priest, told the authors that he is still haunted by what he did on Bataan.
Some Filipinos who live today near the march's route say that they, too, cannot forget what happened, Elizabeth Norman says.
"They would tell us that when they lay awake at night, they thought they could still hear the trampling of the men's feet on the death march," she says.
Why Steele survived
The death march was filled with villains, but the authors also found a hero: Steele. The march is told through his eyes and drawings.
Steele was a cowboy from Montana who could ride a horse, rope cattle and shoot by the time he was 8 years old.
"I thought that if anybody gets out of here, I'm going to be one of them," says Steele, who was a 22-year-old Army Air Corps private when he was captured.
At times, though, Steele wondered whether he was being too optimistic. He was bayoneted, starved and beaten. He was constantly ill, and his weight fell to 112 pounds.
Steele found a way to preserve his mind even as his body wasted away: He drew. He started sketching pictures of what he saw during his captivity.
"I felt an obligation to show people what went on there," he says.
Steele was released after three years of captivity when World War II ended. He returned to Billings, Montana, where he became an art professor at a state college.
"I had a lot of anger when I got home," Steele says. "We were beaten for so long. I hated [the Japanese]."
Steele meets his 'nemesis'
Steele's hatred smoldered for 15 years. It threatened to spill out into the open in 1960, when he walked into his classroom on the first day of the semester and saw a Japanese-American student.
In "Tears in the Darkness," Steele says that his "heart hardened and filled with hate." But he was so anguished by what he was feeling, he returned to his office after class to think.
He told himself that the war was over; he wasn't a prisoner anymore, and he had to treat the Japanese-American student like anybody else, because he was an American, too.
Then he did something else. He invited the student to his office for a talk.
The student's name was Harry Koyama, and he, too, had been marked by the war. His family had been imprisoned at a "relocation camp" in Arizona during the war.
Steele also discovered that he and Koyama had something else in common: a passion for drawing Montana's rural life. By the end of the semester, Koyama was one of Steele's best students.
Steele says that talking to Koyama helped his hatred evaporate.
"We had a discussion and finally came to an understanding that we liked each other," he says.
Today, Steele and Koyama remain in touch.
"We're the best of friends," Koyama tells CNN from his Montana art studio. "We see each other regularly."
Koyama says he can't remember exactly what he and Steele talked about first, only that Steele had always treated him well. Steele did tell him later that their relationship helped him recover from the war, he says.
"I was just there," Koyama says. "I just happened to be there for him to use my presence as a way to overcome his dark time."
Koyama says he is still amazed by Steele's survival story.
"Just to be a part of his life is an honor," Koyama says.
Steele's voice is still strong and his mind sharp. He's been married to his wife, Shirley, for 57 years, and they have three children and six grandchildren.
Steele says Bataan taught him to treasure small pleasures, like a drink of cool water and a warm bed at night.
"I'm thankful that I have a plateful of food," he says. "I can remember when that plate was empty."
He still remembers tiny details from the death march as well. He constantly draws pictures of his friends and tormentors on Bataan. Their faces fill his sketchbooks.
Steele's hate may be gone, but the death march lingers.
"I think about it every day," he says. "It's in my mind, and I'll never get it out."
Friday, July 10, 2009
Jani Schofield diagnosed with schizophrenia
For Jani Schofield, some progress -- and major setbacks
The 6-year-old, who has been diagnosed with schizophrenia, doesn't fare well after a change in her environment, and the stress of caring for her takes a severe toll on her family.
By Shari Roan
4:56 PM PDT, July 8, 2009
On June 29, The Times profiled Jani Schofield, a 6-year-old diagnosed with schizophrenia, and her parents in “Jani’s at the mercy of her mind.” The article examined Jani's bouts of rage, her make-believe world, and Michael and Susan Schofield's efforts to keep their family together while also safely raising Jani and her toddler brother, Bodhi. Here is an update on the Schofield family.
Michael and Susan Schofield's plan to keep two apartments and trade 14-hour shifts caring for their 6-year-old daughter, Jani, worked for a while.
And Jani returned, with modest success, to her elementary school.
But when school ended two weeks later, so did the Schofields' only real respite care, and their lives began to unravel.
The Schofields clashed with the team of workers from a nonprofit provider of mental health services the family was depending on for support. The social workers tried to help, says Michael, but didn't seem to understand that simple parenting techniques and behavioral therapies were irrelevant when caring for a psychotic child.
The cost of two apartments was crushing, and their money woes mounted. Michael and Susan began to argue more often.
When she was discharged June 1 from UCLA's Resnick Psychiatric Hospital, Jani had fewer hallucinations and was less violent. But within a week at home, she began spending more time in her imaginary world of rats and cats and searing temperatures.
Jani's psychiatrist raised her dose of Thorazine, one of three drugs she takes to control the psychosis, but it had little effect. Moreover, Thorazine causes severe photosensitivity, so time spent at the park and swimming pool, where Jani is most easily entertained, had to be dramatically curtailed. She was soon bored with her tiny apartment.
The detailed daily schedule the Schofields crafted to mimic Jani's schedule at UCLA went by the wayside. The point system the couple planned to use to track Jani's behavior and reward her for progress was forgotten.
Both Michael and Susan battle depression and see a therapist. But Michael grew especially despondent.
After three weeks, the couple's hopes of caring for their daughter at home slipped away. Jani was psychotic most of the time, talking about her imaginary friends, gesturing to them, running to the door to allow them access to the apartment. She threatened her baby brother, Bodhi, sometimes kicking him. The toddler grew more anxious and clingy, and the Schofields began to worry about his psychological health and development.
On June 24, while the family was eating breakfast at Denny's, a drop of orange juice spilled on Jani's slacks, a sensation she cannot stand. She began to remove her pants in the restaurant but had not put on underwear that morning. The couple wrestled with her to keep her dressed while she erupted with fury over her wet clothing.
It took an hour to calm her.
The next day, she screamed at the doctor's office where she was undergoing regular lab tests to check for side effects from the high doses of medications. Later, she carried an imaginary rat in the palm of her hand and cautioned onlookers, "Be careful around him. He squirts."
Susan lost her set of keys and Michael yelled at her. Also that day, the family (other than Jani) was ill with a respiratory virus, and Bodhi was diagnosed with asthma.
On Thursday, June 25, Jani's blood test results came back. Her thyroid levels were abnormal and there was blood in her urine. She complained of constant itching.
Susan stayed home in Bodhi's apartment while Michael turned off the lights in Jani's apartment and drove her to UCLA.
January Schofield was re-admitted to UCLA's Resnick Neuropsychiatric Hospital later that day.
"It was really hard to take her back," Michael says. "It feels like a failure. We really wanted to make it work."
Jani's doctors at UCLA have decided to wean her off her current medications and try Clozaril, a last-ditch anti-psychotic that carries the risk of severe side effects. In the meantime, the Schofields are completing paperwork seeking to have Jani admitted to a study on child schizophrenia at the National Institute of Mental Health in Bethesda, Md.
Michael acknowledges that he and Susan need time to regain their mental and physical health before beginning the next round of Jani's life.
The day Jani was readmitted, says Michael, "I felt such a profound sense of despair. We can't get the services that we need to keep her at home. It breaks my heart that the only way we can get a break is to put her back in the hospital."
The staff at UCLA was kind when they saw their little patient again. She has made some progress, one of the nurses reminded Michael; she is less violent.
Jani remembered the staff well and didn't seem to mind going back to the hospital. But after Michael hugged her and said good-night that Thursday, she began to cry softly -- something she rarely does except in anger.
Michael detected a little sob and paused at the door.
"I called back to her, 'Jani, are you OK?' "
"Yes," she said.
Saturday, July 4, 2009
Pedophiles, Hebephiles, and Ephebophiles, Oh My: Erotic Age Orientation
Why most “pedophiles” aren’t really pedophiles, technically speaking
By Jesse Bering
New Scientist: Mind
Michael Jackson probably wasn’t a pedophile—at least, not in the strict, biological sense of the word. It’s a morally loaded term, pedophile, that has become synonymous with the very basest of evils. (In fact it’s hard to even say it aloud without cringing, isn’t it?) But according to sex researchers, it’s also a grossly misused term.
If Jackson did fall outside the norm in his “erotic age orientation”—and we may never know if he did—he was almost certainly what’s called a hebephile, a newly proposed diagnostic classification in which people display a sexual preference for children at the cusp of puberty, roughly 11 to 14 years of age. Pedophiles, in contrast, show a sexual preference for clearly prepubescent children. There are also ephebophiles (from ephebos, meaning “one arrived at puberty” in Greek), who are mostly attracted to 15- to 16-year-olds; teleiophiles (from teleios, meaning “full grown” in Greek), who prefer those 17 years of age or older; and even the very rare gerontophile (from gerontos, meaning “old man” in Greek), someone whose sexual preference is for the elderly. So although child sex offenders are often lumped into the single classification of pedophilia, biologically speaking it’s a rather complicated affair. Some have even proposed an additional subcategory of pedophilia, “infantophilia,” to distinguish those individuals most intensely attracted to children below six years of age.
Based on this classification scheme of erotic age orientations, even the world’s best-known fictitious “pedophile,” Humbert Humbert from Nabokov’s masterpiece, Lolita, would more properly be considered a hebephile. (Likewise the protagonist from Thomas Mann’s Death in Venice, a work that I’ve always viewed as something of the “gay Lolita”). Consider Humbert’s telltale description of a “nymphet.” After a brief introduction to those “pale pubescent girls with matted eyelashes,” Humbert explains:
Between the age limits of nine and fourteen there occur maidens who, to certain bewitched travelers, twice or many times older than they, reveal their true nature which is not human, but nymphic (that is, demoniac); and these chosen creatures I propose to designate as “nymphets.”
Although Michael Jackson might have suffered more disgrace from his hebephilic orientation than most, and his name will probably forever be entangled darkly with the sinister phrase “little boys,” he wasn’t the first celebrity or famous figure that could be seen as falling into this hebephilic category. In fact, ironically, Michael Jackson’s first wife, Lisa Marie Presley, is the product of a hebephilic attraction. After all, let’s not forget that Priscilla caught Elvis’s very grownup eye when she was just fourteen, only a year or two older than the boys that Michael Jackson was accused of sexually molesting. Then there’s of course also the scandalous Jerry Lee Lewis incident in which the 23-year-old “Great Balls of Fire” singer married his 13-year-old first cousin.
In the psychiatric community, there’s recently been a hubbub of commotion concerning whether hebephilia should be designated as a medical disorder or, instead, seen simply as a normal variant of sexual orientation and not indicative of brain pathology. There are important policy implications of adding hebephilia to the checklist of mental illnesses, since doing so might allow people who sexually abuse pubescent children to invoke a mental illness defense.
One researcher who is arguing vociferously for the inclusion of hebephilia in the American Psychiatric Association's revised diagnostic manual (the DSM-V) is University of Toronto psychologist Ray Blanchard. In last month’s issue of Archives of Sexual Behavior, Blanchard and his colleagues provide new evidence that many people diagnosed under the traditional label of pedophilia are in fact not as interested in prepubescent children as they are early adolescents.
To tease apart these erotic age orientation differences, Blanchard and his colleagues studied 881 men (straight and gay) in his laboratory using phallometric testing (also known as penile plethysmography) while showing them visual images of differently aged nude models. Because this technique measures penile blood volume changes, it’s seen as being a fairly objective index of sexual arousal to what’s being shown on the screen—which, for those attracted to children and young adolescents, the participant might verbally deny being attracted to. In other words, the penis isn’t a very good liar. So, for example, in Blanchard’s study, the image of a naked 12-year-old girl (nothing prurient, but rather resembling a subject in a medical textbook) was accompanied by the following audiotaped narrative:
“You are watching a late movie on TV with your neighbors’ 12-year-old daughter. You have your arm around her shoulders, and your fingers brush against her chest. You realize that her breasts have begun to develop…”
Blanchard and his coauthors found that the men in their sample fell into somewhat discrete categories of erotic age orientation—some had the strongest penile response to the prepubescent children (the pedophiles), others to the pubescent children (the hebephiles), and the remainder to the adults shown on screen (the teleiophiles). These categories weren’t mutually exclusive. For example, some teleiophiles showed some arousal to pubescent children, some hebephiles showed some attraction to prepubescent children, and so on. But the authors did find that it’s possible to distinguish empirically between a “true pedophile” and a hebephile using this technique, in terms of the age ranges for which men exhibited their strongest arousal. They also conclude that, based on the findings from this study, hebephilia “is relatively common compared with other forms of erotic interest in children.”
In the second half of their article, Blanchard and his colleagues argue that hebephilia should be added to the newly revised DSM-V as a genuine paraphilic mental disorder—differentiating it from pedophilia. But many of his colleagues working in this area are strongly opposed to doing this.
Men who find themselves primarily attracted to young or middle-aged adolescents are clearly disadvantaged in today’s society, but historically (and evolutionarily) this almost certainly wasn’t the case. In fact, hebephiles—or at least ephebophiles—would have had a leg up over their competition. Evolutionary psychologists have found repeatedly that markers of youth correlate highly with perceptions of beauty and attractiveness. For straight men, this makes sense, since a woman’s reproductive value declines steadily after the age of about twenty. Obviously having sex with a prepubescent child would be fruitless—literally. But, whether we like it or not, this isn’t so for a teenage girl who has just come of age, who is reproductively viable and whose brand-new state of fertility can more or less ensure paternity for the male. These evolved motives were portrayed in the film Pretty Baby, in which a young Brooke Shields plays the role of twelve-year-old Violet Neil, a prostitute’s daughter in 1917 New Orleans whose coveted virginity goes up for auction to the highest bidder.
Understanding adult gay men’s attraction to young males is more of a puzzle. Evolutionary psychologist Frank Muscarella’s “alliance formation theory” is the only one that I’m aware of that attempts to do this. This theory holds that homoerotic behavior between older, high status men and teenage boys serves as a way for the latter to move up in ranks, a sort of power-for-sex bargaining chip. The most obvious example of this type of homosexual dynamic was found in ancient Greece, but male relationships in a handful of New Guinea tribes display these homoerotic patterns as well. There are also, ahem, plenty of present-day examples of this in Congress. Oscar Wilde probably would have signed on to this theoretical perspective. After all, his famous “love that dare not speak its name” wasn’t homosexuality, per se, but rather a “great affection of an elder for a younger man”:
...as there was between David and Jonathan, such as Plato made the very basis of his philosophy, and such as you find in the sonnets of Michelangelo and Shakespeare. It is that deep, spiritual affection that is as pure as it is perfect. It dictates and pervades great works of art like those of Shakespeare and Michelangelo… It is beautiful, it is fine, it is the noblest form of affection. There is nothing unnatural about it. It is intellectual, and it repeatedly exists between an elder and a younger man, when the elder man has intellect, and the younger man has all the joy, hope and glamour of life before him. That it should be so, the world does not understand. The world mocks at it and sometimes puts one in the pillory for it.
But, generally speaking, Muscarella’s theory doesn’t seem to pull a lot of weight. Not many teenage boys in any culture seem terribly interested in taking this particular route to success. Rather—and I may be wrong about this—I think most teenage boys would sooner scrub toilets for the rest of their lives or sell soft bagels at the mall than become the sexual plaything of an “older gentleman.”
In any event, given the biological (even adaptive) verities of being attracted to adolescents, most experts in this area find it completely illogical for Blanchard to recommend adding hebephilia to the revised DSM-V. (Especially since other more clearly maladaptive paraphilias—such as gerontophilia, in which men are attracted primarily to elderly, post-menopausal women—are not presently included in the diagnostic manual.) The push to pathologize hebephilia, argues forensic psychologist Karen Franklin, appears to be motivated more by “a booming cottage industry” in forensic psychology, not coincidentally linked with a “punitive era of moral panic." Because “civil incapacitation” (basically, the government’s ability to strip a person of his or her civil rights in the interests of public safety) requires that the person be suffering from a diagnosable mental disorder or abnormality, Franklin calls Blanchard’s proposal “a textbook example of subjective values masquerading as science.” Another critic, forensic psychologist Gregory DeClue, suggests that such medical classifications are being based on arbitrary distinctions dictated by cultural standards:
Pedophilia is a mental disorder. Homosexuality is not. Should hebephilia or ephebophilia or gerontophilia be considered mental disorders? How about sexual preference for people with different (or with the same) ethnic characteristics as oneself?
And Marquette University psychologist Thomas Zander points out that since chronological age doesn’t always perfectly match physical age, including these subtle shades of erotic age preferences would be problematic from a diagnostic perspective:
Imagine how much more impractical it would be to require forensic evaluators to determine the existence of pedophilia based on the stage of adolescence of the examinee’s victim. Such determinations could literally devolve into a splitting of pubic hairs.
One unexplored question, and one inseparable from the case of Michael Jackson, is whether we tend to be more forgiving of a person’s sexual peccadilloes when that individual has some invaluable or culturally irreplaceable abilities. For example, consider the following true story:
There once was a man who fancied young boys. Being that laws were more lax in other nations, this man decided to travel to a foreign country, leaving his wife and young daughter behind, where he met up with another Westerner who shared in his predilections for pederasty, and there the two of them spent their happy vacation scouring the seedy underground of this country searching for pimps and renting out boys for sex.
Now if you’re like most people, you’re probably experiencing a shiver of disgust and a spark of rage. You likely feel these men should have their testicles drawn and quartered by wild mares, be thrown to a burly group of rapists, castrated with garden shears or, if you’re the pragmatic sort, treated as any other sick animal in the herd would be treated, with a humane bullet to the temple or perhaps a swift and sure current of potassium chloride injected into the arm.
But notice the subtle change in your perceptions when I tell you that these events are from the autobiography of AndrĂ© Gide, who in 1947—long after he’d publicized these very details—won the Nobel prize in literature. Gide is in fact bowdlerizing his time in Algiers with none other than Oscar Wilde.
Wilde took a key out of his pocket and showed me into a tiny apartment of two rooms… The youths followed him, each of them wrapped in a burnous that hid his face. Then the guide left us and Wilde sent me into the further room with little Mohammed and shut himself up in the other with the [other boy]. Every time since then that I have sought after pleasure, it is the memory of that night I have pursued.
It’s not that we think it’s perfectly fine for Gide and Wilde to have sex with minors or even that they shouldn’t have been punished for such behaviors. (In fact Wilde was sentenced in London to two years hard labor for related offenses not long after this Maghreb excursion with Gide and died in penniless ignominy.) But somehow, as with our commingled feelings for Michael Jackson, “the greatest entertainer of all time,” the fact that these men were national treasures somehow dilutes our moralistic anger, as though we’re more willing to suffer their vices given the remarkable literary gifts they bestowed.
Would you really have wanted Oscar Wilde euthanized as though he were a sick animal? Should AndrĂ© Gide, whom the New York Times hailed in their obituary as a man “judged the greatest French writer of this century by the literary cognoscenti,” have been deprived of his pen, torn to pieces by illiterate thugs? It’s complicated. And although in principle we know that all men are equal in the eyes of the law, just as we did for Michael Jackson during his child molestation trials, I have a hunch that many people tend to feel (and uncomfortably so) a little sympathy for the Devil under such circumstances.
In this column presented by Scientific American Mind magazine, research psychologist Jesse Bering of Queen's University Belfast ponders some of the more obscure aspects of everyday human behavior. Ever wonder why yawning is contagious, why we point with our index fingers instead of our thumbs or whether being breastfed as an infant influences your sexual preferences as an adult? Get a closer look at the latest data as “Bering in Mind” tackles these and other quirky questions about human nature.
Michael Jackson probably wasn’t a pedophile—at least, not in the strict, biological sense of the word. It’s a morally loaded term, pedophile, that has become synonymous with the very basest of evils. (In fact it’s hard to even say it aloud without cringing, isn’t it?) But according to sex researchers, it’s also a grossly misused term.
If Jackson did fall outside the norm in his “erotic age orientation”—and we may never know if he did—he was almost certainly what’s called a hebephile, a newly proposed diagnostic classification for people who display a sexual preference for children at the cusp of puberty, roughly 11 to 14 years of age. Pedophiles, in contrast, show a sexual preference for clearly prepubescent children. There are also ephebophiles (from ephebos, meaning “one arrived at puberty” in Greek), who are mostly attracted to 15- to 16-year-olds; teleiophiles (from teleios, meaning “full grown” in Greek), who prefer those 17 years of age or older; and even the very rare gerontophile (from gerontos, meaning “old man” in Greek), someone whose sexual preference is for the elderly. So although child sex offenders are often lumped into the single classification of pedophilia, biologically speaking it’s a rather complicated affair. Some have even proposed an additional subcategory of pedophilia, “infantophilia,” to distinguish those individuals most intensely attracted to children below six years of age.
Based on this classification scheme of erotic age orientations, even the world’s best-known fictitious “pedophile,” Humbert Humbert from Nabokov’s masterpiece, Lolita, would more properly be considered a hebephile. (Likewise the protagonist from Thomas Mann’s Death in Venice, a work that I’ve always viewed as something of the “gay Lolita”). Consider Humbert’s telltale description of a “nymphet.” After a brief introduction to those “pale pubescent girls with matted eyelashes,” Humbert explains:
Between the age limits of nine and fourteen there occur maidens who, to certain bewitched travelers, twice or many times older than they, reveal their true nature which is not human, but nymphic (that is, demoniac); and these chosen creatures I propose to designate as “nymphets.”
Although Michael Jackson might have suffered more disgrace from his hebephilic orientation than most, and his name will probably forever be entangled darkly with the sinister phrase “little boys,” he wasn’t the first celebrity or famous figure who could be seen as falling into this hebephilic category. In fact, ironically, Michael Jackson’s first wife, Lisa Marie Presley, is the product of a hebephilic attraction. After all, let’s not forget that Priscilla caught Elvis’s very grownup eye when she was just fourteen, only a year or two older than the boys Michael Jackson was accused of sexually molesting. And then of course there’s the scandalous Jerry Lee Lewis incident, in which the 23-year-old “Great Balls of Fire” singer married his 13-year-old first cousin.
In the psychiatric community, there’s recently been a hubbub over whether hebephilia should be designated as a medical disorder or, instead, seen simply as a normal variant of sexual orientation and not indicative of brain pathology. There are important policy implications of adding hebephilia to the checklist of mental illnesses, since doing so might allow people who sexually abuse pubescent children to invoke a mental illness defense.
One researcher who is arguing vociferously for the inclusion of hebephilia in the American Psychiatric Association's revised diagnostic manual (the DSM-V) is University of Toronto psychologist Ray Blanchard. In last month’s issue of Archives of Sexual Behavior, Blanchard and his colleagues provide new evidence that many people diagnosed under the traditional label of pedophilia are in fact not as interested in prepubescent children as they are early adolescents.
To tease apart these erotic age orientation differences, Blanchard and his colleagues studied 881 men (straight and gay) in his laboratory using phallometric testing (also known as penile plethysmography) while showing them visual images of differently aged nude models. Because this technique measures changes in penile blood volume, it’s seen as a fairly objective index of sexual arousal to what’s being shown on the screen—an attraction that participants drawn to children and young adolescents might verbally deny. In other words, the penis isn’t a very good liar. So, for example, in Blanchard’s study, the image of a naked 12-year-old girl (nothing prurient, but rather resembling a subject in a medical textbook) was accompanied by the following audiotaped narrative:
“You are watching a late movie on TV with your neighbors’ 12-year-old daughter. You have your arm around her shoulders, and your fingers brush against her chest. You realize that her breasts have begun to develop…”
Blanchard and his coauthors found that the men in their sample fell into somewhat discrete categories of erotic age orientation—some had the strongest penile response to the prepubescent children (the pedophiles), others to the pubescent children (the hebephiles), and the remainder to the adults shown on screen (the teleiophiles). These categories weren’t mutually exclusive. For example, some teleiophiles showed some arousal to pubescent children, some hebephiles showed some attraction to prepubescent children, and so on. But the authors did find that it’s possible to distinguish empirically between a “true pedophile” and a hebephile using this technique, in terms of the age ranges for which men exhibited their strongest arousal. They also conclude that, based on the findings from this study, hebephilia “is relatively common compared with other forms of erotic interest in children.”
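To make the logic of that classification concrete, here is a tiny, purely hypothetical Python sketch: each invented participant is assigned to whichever erotic age category drew his strongest arousal response. The names and scores below are made up for illustration and are not data from Blanchard's study.

# Invented example only: assign each (hypothetical) participant to the age
# category that produced his strongest arousal response. These numbers are
# made up and are not data from Blanchard's study.
arousal_scores = {
    "participant_01": {"prepubescent": 0.2, "pubescent": 0.7, "adult": 0.4},
    "participant_02": {"prepubescent": 0.1, "pubescent": 0.3, "adult": 0.9},
}
labels = {"prepubescent": "pedophilic", "pubescent": "hebephilic", "adult": "teleiophilic"}

for person, scores in arousal_scores.items():
    strongest = max(scores, key=scores.get)   # category with the peak response
    print(person, "->", labels[strongest])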
In the second half of their article, Blanchard and his colleagues argue that hebephilia should be added to the newly revised DSM-V as a genuine paraphilic mental disorder—differentiating it from pedophilia. But many of his colleagues working in this area are strongly opposed to doing this.
Men who find themselves primarily attracted to young or mid-adolescents are clearly disadvantaged in today’s society, but historically (and evolutionarily) this almost certainly wasn’t the case. In fact, hebephiles—or at least ephebophiles—would have had a leg up over their competition. Evolutionary psychologists have found repeatedly that markers of youth correlate highly with perceptions of beauty and attractiveness. For straight men, this makes sense, since a woman’s reproductive value declines steadily after the age of about twenty. Obviously having sex with a prepubescent child would be fruitless—literally. But, whether we like it or not, this isn’t so for a teenage girl who has just come of age, who is reproductively viable and whose brand-new state of fertility can more or less ensure paternity for the male. These evolved motives were portrayed in the film Pretty Baby, in which a young Brooke Shields plays the role of twelve-year-old Violet Neil, a prostitute’s daughter in 1917 New Orleans whose coveted virginity goes up for auction to the highest bidder.
Understanding adult gay men’s attraction to young males is more of a puzzle. Evolutionary psychologist Frank Muscarella’s “alliance formation theory” is the only one that I’m aware of that attempts to do this. This theory holds that homoerotic behavior between older, high status men and teenage boys serves as a way for the latter to move up in ranks, a sort of power-for-sex bargaining chip. The most obvious example of this type of homosexual dynamic was found in ancient Greece, but male relationships in a handful of New Guinea tribes display these homoerotic patterns as well. There are also, ahem, plenty of present-day examples of this in Congress. Oscar Wilde probably would have signed on to this theoretical perspective. After all, his famous “love that dare not speak its name” wasn’t homosexuality, per se, but rather a “great affection of an elder for a younger man”:
...as there was between David and Jonathan, such as Plato made the very basis of his philosophy, and such as you find in the sonnets of Michelangelo and Shakespeare. It is that deep, spiritual affection that is as pure as it is perfect. It dictates and pervades great works of art like those of Shakespeare and Michelangelo… It is beautiful, it is fine, it is the noblest form of affection. There is nothing unnatural about it. It is intellectual, and it repeatedly exists between an elder and a younger man, when the elder man has intellect, and the younger man has all the joy, hope and glamour of life before him. That it should be so, the world does not understand. The world mocks at it and sometimes puts one in the pillory for it.
But, generally speaking, Muscarella’s theory doesn’t seem to carry much weight. Not many teenage boys in any culture seem terribly interested in taking this particular route to success. Rather—and I may be wrong about this—I think most teenage boys would sooner scrub toilets for the rest of their lives or sell soft bagels at the mall than become the sexual plaything of an “older gentleman.”
In any event, given the biological (even adaptive) verities of being attracted to adolescents, most experts in this area find it completely illogical for Blanchard to recommend adding hebephilia to the revised DSM-V. (Especially since other more clearly maladaptive paraphilias—such as gerontophilia, in which men are attracted primarily to elderly, post-menopausal women—are not presently included in the diagnostic manual.) The push to pathologize hebephilia, argues forensic psychologist Karen Franklin, appears to be motivated more by “a booming cottage industry” in forensic psychology, not coincidentally linked with a “punitive era of moral panic.” Because “civil incapacitation” (basically, the government’s ability to strip a person of his or her civil rights in the interests of public safety) requires that the person be suffering from a diagnosable mental disorder or abnormality, Franklin calls Blanchard’s proposal “a textbook example of subjective values masquerading as science.” Another critic, forensic psychologist Gregory DeClue, suggests that such medical classifications are being based on arbitrary distinctions dictated by cultural standards:
Pedophilia is a mental disorder. Homosexuality is not. Should hebephilia or ephebophilia or gerontophilia be considered mental disorders? How about sexual preference for people with different (or with the same) ethnic characteristics as oneself?
And Marquette University psychologist Thomas Zander points out that since chronological age doesn’t always perfectly match physical age, including these subtle shades of erotic age preferences would be problematic from a diagnostic perspective:
Imagine how much more impractical it would be to require forensic evaluators to determine the existence of pedophilia based on the stage of adolescence of the examinee’s victim. Such determinations could literally devolve into a splitting of pubic hairs.
One unexplored question, and one inseparable from the case of Michael Jackson, is whether we tend to be more forgiving of a person’s sexual peccadilloes when that individual has some invaluable or culturally irreplaceable abilities. For example, consider the following true story:
There once was a man who fancied young boys. Because laws were more lax in other nations, this man decided to travel to a foreign country, leaving his wife and young daughter behind. There he met up with another Westerner who shared his predilection for pederasty, and the two of them spent their happy vacation scouring the seedy underground of this country, searching for pimps and renting boys for sex.
Now if you’re like most people, you’re probably experiencing a shiver of disgust and a spark of rage. You likely feel these men should have their testicles drawn and quartered by wild mares, be thrown to a burly group of rapists, be castrated with garden shears or, if you’re the pragmatic sort, be treated as any other sick animal in the herd would be treated, with a humane bullet to the temple or perhaps a swift and sure current of potassium chloride injected into the arm.
But notice the subtle change in your perceptions when I tell you that these events are from the autobiography of André Gide, who in 1947—long after he’d publicized these very details—won the Nobel Prize in Literature. Gide is in fact recounting his time in Algiers with none other than Oscar Wilde.
Wilde took a key out of his pocket and showed me into a tiny apartment of two rooms… The youths followed him, each of them wrapped in a burnous that hid his face. Then the guide left us and Wilde sent me into the further room with little Mohammed and shut himself up in the other with the [other boy]. Every time since then that I have sought after pleasure, it is the memory of that night I have pursued.
It’s not that we think it’s perfectly fine for Gide and Wilde to have sex with minors or even that they shouldn’t have been punished for such behaviors. (In fact Wilde was sentenced in London to two years hard labor for related offenses not long after this Maghreb excursion with Gide and died in penniless ignominy.) But somehow, as with our commingled feelings for Michael Jackson, “the greatest entertainer of all time,” the fact that these men were national treasures dilutes our moralistic anger, as though we’re more willing to suffer their vices given the remarkable literary gifts they bestowed.
Would you really have wanted Oscar Wilde euthanized as though he were a sick animal? Should André Gide, whom the New York Times hailed in its obituary as a man “judged the greatest French writer of this century by the literary cognoscenti,” have been deprived of his pen, torn to pieces by illiterate thugs? It’s complicated. And although in principle we know that all men are equal in the eyes of the law, I have a hunch that many people tend to feel (and uncomfortably so) a little sympathy for the Devil under such circumstances, just as many did for Michael Jackson during his child molestation trials.
In this column presented by Scientific American Mind magazine, research psychologist Jesse Bering of Queen's University Belfast ponders some of the more obscure aspects of everyday human behavior. Ever wonder why yawning is contagious, why we point with our index fingers instead of our thumbs or whether being breastfed as an infant influences your sexual preferences as an adult? Get a closer look at the latest data as “Bering in Mind” tackles these and other quirky questions about human nature.
Wednesday, July 1, 2009
Disorderly genius: How chaos drives the brain
Have you ever experienced that eerie feeling of a thought popping into your head as if from nowhere, with no clue as to why you had that particular idea at that particular time? You may think that such fleeting thoughts, however random they seem, must be the product of predictable and rational processes. After all, the brain cannot be random, can it? Surely it processes information using ordered, logical operations, like a powerful computer?
Actually, no. In reality, your brain operates on the edge of chaos. Though much of the time it runs in an orderly and stable way, every now and again it suddenly and unpredictably lurches into a blizzard of noise.
Neuroscientists have long suspected as much. Only recently, however, have they come up with proof that brains work this way. Now they are trying to work out why. Some believe that near-chaotic states may be crucial to memory, and could explain why some people are smarter than others.
In technical terms, systems on the edge of chaos are said to be in a state of "self-organised criticality". These systems are right on the boundary between stable, orderly behaviour - such as a swinging pendulum - and the unpredictable world of chaos, as exemplified by turbulence.
The quintessential example of self-organised criticality is a growing sand pile. As grains build up, the pile grows in a predictable way until, suddenly and without warning, it hits a critical point and collapses. These "sand avalanches" occur spontaneously and are almost impossible to predict, so the system is said to be both critical and self-organising. Earthquakes, avalanches and wildfires are also thought to behave like this, with periods of stability followed by catastrophic periods of instability that rearrange the system into a new, temporarily stable state.
Self-organised criticality has another defining feature: even though individual sand avalanches are impossible to predict, their overall distribution is regular. The avalanches are "scale invariant", which means that avalanches of all possible sizes occur. They also follow a "power law" distribution, which means bigger avalanches happen less often than smaller avalanches, according to a strict mathematical ratio. Earthquakes offer the best real-world example. Quakes of magnitude 5.0 on the Richter scale happen 10 times as often as quakes of magnitude 6.0, and 100 times as often as quakes of magnitude 7.0.
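For readers who want to see this in action, here is a minimal Python sketch of the Bak-Tang-Wiesenfeld sandpile, the textbook model of self-organised criticality described above. It is an illustration only, not code from any of the studies mentioned here, and the grid size and grain count are arbitrary choices; the point is simply that large avalanches turn out to be much rarer than small ones, in rough power-law fashion.

import numpy as np

# Minimal sketch of the Bak-Tang-Wiesenfeld sandpile model of
# self-organised criticality. Grid size and grain count are arbitrary.
rng = np.random.default_rng(0)
N = 20
pile = np.zeros((N, N), dtype=int)
avalanche_sizes = []

for _ in range(10000):
    x, y = rng.integers(0, N, size=2)
    pile[x, y] += 1                              # drop one grain at a random site
    size = 0
    while (pile >= 4).any():                     # sites at the critical height topple
        for i, j in zip(*np.where(pile >= 4)):
            pile[i, j] -= 4
            size += 1
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ni, nj = i + di, j + dj
                if 0 <= ni < N and 0 <= nj < N:
                    pile[ni, nj] += 1            # grains toppling off the edge are lost
    if size:
        avalanche_sizes.append(size)

# Big avalanches are much rarer than small ones, roughly following a power law.
sizes = np.array(avalanche_sizes)
for threshold in (1, 10, 100):
    print(f"fraction of avalanches with at least {threshold} topplings: {np.mean(sizes >= threshold):.3f}")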
These are purely physical systems, but the brain has much in common with them. Networks of brain cells alternate between periods of calm and periods of instability - "avalanches" of electrical activity that cascade through the neurons. Like real avalanches, exactly how these cascades occur and the resulting state of the brain are unpredictable.
It might seem precarious to have a brain that plunges randomly into periods of instability, but the disorder is actually essential to the brain's ability to transmit information and solve problems. "Lying at the critical point allows the brain to rapidly adapt to new circumstances," says Andreas Meyer-Lindenberg from the Central Institute of Mental Health in Mannheim, Germany.
Disorder is essential to the brain's ability to transmit information and solve problems
The idea that the brain might be fundamentally disordered in some way first emerged in the late 1980s, when physicists working on chaos theory - then a relatively new branch of science - suggested it might help explain how the brain works.
The focus at that time was something called deterministic chaos, in which a small perturbation can lead to a huge change in the system - the famous "butterfly effect". That would make the brain unpredictable but not actually random, because the butterfly effect is a phenomenon of physical laws that do not depend on chance. Researchers built elaborate computational models to test the idea, but unfortunately they did not behave like real brains. "Although the results were beautiful and elegant, models based on deterministic chaos just didn't seem applicable when looking at the human brain," says Karl Friston, a neuroscientist at University College London.
In the 1990s, it emerged that the brain generates random noise, and hence cannot be described by deterministic chaos. When neuroscientists incorporated this randomness into their models, they found that it created systems on the border between order and disorder - self-organised criticality.
More recently, experiments have confirmed that these models accurately describe what real brain tissue does. They build on the observation that when a single neuron fires, it can trigger its neighbours to fire too, causing a cascade or avalanche of activity that can propagate across small networks of brain cells. This results in alternating periods of quiescence and activity - remarkably like the build-up and collapse of a sand pile.
Neural avalanches
In 2003, John Beggs of Indiana University in Bloomington began investigating spontaneous electrical activity in thin slices of rat brain tissue. He found that these neural avalanches are scale invariant and that their size obeys a power law. Importantly, the ratio of large to small avalanches fit the predictions of the computational models that had first suggested that the brain might be in a state of self-organised criticality (The Journal of Neuroscience, vol 23, p 11167).
To investigate further, Beggs's team measured how many other neurons a single cell in a slice of rat brain activates, on average, when it fires. They followed this line of enquiry because another property of self-organised criticality is that each event, on average, triggers only one other. In forest fires, for example, each burning tree sets alight one other tree on average - that's why fires keep going, but also why whole forests don't catch fire all at once.
Sure enough, the team found that each neuron triggered on average only one other. A value much greater than one would lead to a chaotic system, because any small perturbations in the electrical activity would soon be amplified, as in the butterfly effect. "It would be the equivalent of an epileptic seizure," says Beggs. If the value was much lower than one, on the other hand, the avalanche would soon die out.
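The arithmetic behind that claim is easy to reproduce. Below is a toy branching-process sketch (my own illustration, not Beggs's analysis) in which each firing neuron triggers, on average, sigma others: cascades fizzle out when sigma is below one, run away when it is above one, and span all sizes right at one.

import numpy as np

# Toy branching process: each firing neuron triggers a Poisson-distributed
# number of others with mean sigma. An illustration of the principle only.
rng = np.random.default_rng(1)

def avalanche_size(sigma, cap=100_000):
    active, total = 1, 1
    while active and total < cap:
        active = rng.poisson(sigma * active)     # how many fire in the next generation
        total += active
    return total

for sigma in (0.8, 1.0, 1.2):
    sizes = [avalanche_size(sigma) for _ in range(2000)]
    print(f"sigma = {sigma}: mean avalanche size = {np.mean(sizes):,.0f}")

# Below 1 the cascades quickly die out; above 1 they run away (the "epileptic
# seizure" regime); at exactly 1 avalanches of all sizes occur.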
Beggs's work provides good evidence that self-organised criticality is important on the level of small networks of neurons. But what about on a larger scale? More recently, it has become clear that brain activity also shows signs of self-organised criticality on a larger scale.
As it processes information, the brain often synchronises large groups of neurons to fire at the same frequency, a process called "phase-locking". Like broadcasting different radio stations at different frequencies, this allows different "task forces" of neurons to communicate among themselves without interference from others.
The brain also constantly reorganises its task forces, so the stable periods of phase-locking are interspersed with unstable periods in which the neurons fire out of sync in a blizzard of activity. This, again, is reminiscent of a sand pile. Could it be another example of self-organised criticality in the brain?
In 2006, Meyer-Lindenberg and his team made the first stab at answering that question. They used brain scans to map the connections between regions of the human brain and discovered that they form a "small-world network" - exactly the right architecture to support self-organised criticality.
Small-world networks lie somewhere between regular networks, where each node is connected to its nearest neighbours, and random networks, which have no regular structure but many long-distance connections between nodes at opposite sides of the network. Small-world networks take the most useful aspects of both systems. In places, the nodes have many connections with their neighbours, but the network also contains random and often long links between nodes that are very far away from one another.
For the brain, it's the perfect compromise. One of the characteristics of small-world networks is that you can communicate with any other part of the network through just a few nodes - the "six degrees of separation" reputed to link any two people in the world. In the brain, the number is 13.
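The trade-off is easy to see with the standard Watts-Strogatz construction, sketched below in Python with the networkx library. The sizes and rewiring probabilities are arbitrary illustrations, not the parameters of the brain networks Meyer-Lindenberg's team mapped.

import networkx as nx

# Compare a ring lattice, a small-world network and a random network using the
# standard Watts-Strogatz construction.
n, k = 1000, 10                                   # 1000 nodes, each tied to 10 neighbours
networks = {
    "regular":     nx.connected_watts_strogatz_graph(n, k, p=0.0, seed=0),
    "small-world": nx.connected_watts_strogatz_graph(n, k, p=0.1, seed=0),
    "random":      nx.connected_watts_strogatz_graph(n, k, p=1.0, seed=0),
}

for name, G in networks.items():
    print(f"{name:12s} average path length = {nx.average_shortest_path_length(G):5.1f}, "
          f"clustering = {nx.average_clustering(G):.2f}")

# The small-world graph keeps the tight local clustering of the lattice while
# its "degrees of separation" drop to nearly those of the random network.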
Meyer-Lindenberg created a computer simulation of a small-world network with 13 degrees of separation. Each node was represented by an electrical oscillator that approximated a neuron's activity. The results confirmed that the brain has just the right architecture for its activity to sit on the tipping point between order and disorder, although the team didn't measure neural activity itself (Proceedings of the National Academy of Sciences, vol 103, p 19518).
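As a loose illustration of that kind of simulation (a generic Kuramoto-style toy, not the oscillator model the team actually used), the sketch below places coupled oscillators on a small-world graph and shows that they stay disordered when weakly coupled but phase-lock when the coupling is strong.

import numpy as np
import networkx as nx

# Toy Kuramoto-style oscillators on a small-world graph. A generic
# illustration, not the model used in the PNAS study.
rng = np.random.default_rng(0)
n = 200
G = nx.connected_watts_strogatz_graph(n, 6, 0.1, seed=0)
A = nx.to_numpy_array(G)
degree = A.sum(axis=1)

def phase_coherence(K, steps=4000, dt=0.05):
    theta = rng.uniform(0, 2 * np.pi, n)          # oscillator phases
    omega = rng.normal(0, 0.5, n)                 # natural frequencies
    for _ in range(steps):
        # each oscillator is pulled toward the phases of its network neighbours
        pull = (A * np.sin(theta[None, :] - theta[:, None])).sum(axis=1)
        theta += dt * (omega + K * pull / degree)
    return np.abs(np.exp(1j * theta).mean())      # 0 = disordered, 1 = fully locked

for K in (0.2, 2.0):
    print(f"coupling K = {K}: phase coherence = {phase_coherence(K):.2f}")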
That clinching evidence arrived earlier this year, when Ed Bullmore of the University of Cambridge and his team used brain scanners to record neural activity in 19 human volunteers. They looked at the entire range of brainwave frequencies, from 0.05 hertz all the way up to 125 hertz, across 200 different regions of the brain.
Power laws again
The team found that the durations of both the phase-locking periods and the unstable resynchronisation periods followed a power-law distribution. Crucially, this was true at all frequencies, which means the phenomenon is scale invariant - the other key criterion for self-organised criticality.
What's more, when the team tried to reproduce the activity they saw in the volunteers' brains in computer models, they found that they could only do so if the models were in a state of self-organised criticality (PLoS Computational Biology, vol 5, p e1000314). "The models only showed similar patterns of synchronisation to the brain when they were in the critical state," says Bullmore.
The work of Bullmore's team is compelling evidence that self-organised criticality is an essential property of brain activity, says neuroscientist David Liley at Swinburne University of Technology in Melbourne, Australia, who has worked on computational models of chaos in the brain.
But why should that be? Perhaps because self-organised criticality is the perfect starting point for many of the brain's functions.
The neuronal avalanches that Beggs investigated, for example, are perfect for transmitting information across the brain. If the brain was in a more stable state, these avalanches would die out before the message had been transmitted. If it was chaotic, each avalanche could swamp the brain.
At the critical point, however, you get maximum transmission with minimum risk of descending into chaos. "One of the advantages of self-organised criticality is that the avalanches can propagate over many links," says Beggs. "You can have very long chains that won't blow up on you."
Self-organised criticality also appears to allow the brain to adapt to new situations, by quickly rearranging which neurons are synchronised to a particular frequency. "The closer we get to the boundary of instability, the more quickly a particular stimulus will send the brain into a new state," says Liley.
It may also play a role in memory. Beggs's team noticed that certain chains of neurons would fire repeatedly in avalanches, sometimes over several hours (The Journal of Neuroscience, vol 24, p 5216). Because an entire chain can be triggered by the firing of one neuron, these chains could be the stuff of memory, argues Beggs: memories may come to mind unexpectedly because a neuron fires randomly or could be triggered unpredictably by a neuronal avalanche.
The balance between phase-locking and instability within the brain has also been linked to intelligence - at least, to IQ. Last year, Robert Thatcher from the University of South Florida in Tampa made EEG measurements of 17 children, aged between 5 and 17 years, who also performed an IQ test.
The balance between stability and instability in the brain has been linked with intelligence, at least as measured by scores on an IQ test
He found that the length of time the children's brains spent in both the stable phase-locked states and the unstable phase-shifting states correlated with their IQ scores. For example, phase shifts typically last 55 milliseconds, but an additional 1 millisecond seemed to add as many as 20 points to the child's IQ. A shorter time in the stable phase-locked state also corresponded with greater intelligence - with a difference of 1 millisecond adding 4.6 IQ points to a child's score (NeuroImage, vol 42, p 1639).
Thatcher says this is because a longer phase shift allows the brain to recruit many more neurons for the problem at hand. "It's like casting a net and capturing as many neurons as possible at any one time," he says. The result is a greater overall processing power that contributes to higher intelligence.
Hovering on the edge of chaos provides brains with their amazing capacity to process information and rapidly adapt to our ever-changing environment, but what happens if we stray either side of the boundary? The most obvious assumption would be that all of us are a short step away from mental illness. Meyer-Lindenberg suggests that schizophrenia may be caused by parts of the brain straying away from the critical point. However, for now that is purely speculative.
Thatcher, meanwhile, has found that certain regions in the brains of people with autism spend less time than average in the unstable, phase-shifting states. These abnormalities reduce the capacity to process information and, suggestively, are found only in the regions associated with social behaviour. "These regions have shifted from chaos to more stable activity," he says. The work might also help us understand epilepsy better: in an epileptic fit, the brain has a tendency to suddenly fire synchronously, and deviation from the critical point could explain this.
"They say it's a fine line between genius and madness," says Liley. "Maybe we're finally beginning to understand the wisdom of this statement."
David Robson is a junior editor at New Scientist