A scientific study of the human mind and of human behavior through the analysis and research of Meta Psychology.
Thursday, February 26, 2009
Why do some people kill themselves?
Finding out what makes certain people commit suicide while others do not could provide new tools for screening patients and providing help before it is too late (Image: Matt Carr/Taxi)
FOR a few months in late 2006 and early 2007, the woman who called herself kristi4 was one of the best-known members of the pro-anorexia community. As the administrator of a blog on LiveJournal.com, she dispensed advice, encouraged others and wrote candidly about her own struggles. Then, late one Friday night, after a series of entries describing what she was planning to do, kristi4 killed herself with an overdose of prescription sleeping pills, muscle relaxants and painkillers.
Her death was just one tragic data point in one of the most striking statistics in all of psychology. It has long been known that anorexia has the highest death rate of any mental illness: one out of every five people with anorexia eventually dies of causes related to the disease. What has only now been recognised, however, is that a huge number of those deaths are from suicide rather than starvation. Someone who develops anorexia is 50 to 60 times more likely to kill themselves than someone in the general population. No other group has a suicide rate anywhere near as high (Archives of General Psychiatry, vol 60, p 179).
Recently, psychologists have tried to explain why anorexia and suicide are so intimately connected, something which is helping to answer the wider question of why anyone would commit suicide. If this explanation holds up, it will give psychiatrists a new tool for screening patients and determining which of them are most likely to kill themselves, perhaps saving lives.
Suicide has always been a conundrum for psychologists and other researchers interested in human behaviour. Self-preservation is one of the strongest human instincts, so the drive to commit suicide must be even more powerful. But what causes it?
A century ago, both the sociologist Emile Durkheim and the psychoanalyst Sigmund Freud came up with sweeping explanations. Durkheim, not surprisingly, saw the roots of suicide in social factors, such as a failure to integrate into society, while Freud rooted his explanation in instinctual drives, particularly what he called the death instinct. More recent explanations have tended to focus on factors such as depression, hopelessness and emotional pain, but none of them have had much success in answering the fundamental question about suicide: why do some people kill themselves while others in seemingly identical circumstances do not?
Some progress has been made by crunching large amounts of data on suicide, says Harvard University psychologist Matthew Nock, who studies suicide and self-harm. Researchers have learned, for example, that suicide rates are rising, and that suicide now accounts for 1.5 per cent of all deaths worldwide. Suicide is the second leading cause of death among people aged 15 to 24, after vehicle accidents. Women are more likely than men to attempt suicide, while men are much more likely to succeed.
1 million: approximate number of suicides worldwide each year
Most people who commit suicide have a mental disorder - anorexia, major depressive disorder, bipolar disorder, schizophrenia and borderline personality disorder are the most common, but an elevated suicide risk is part and parcel of many of the others, too. People who kill themselves also generally feel deeply depressed and hopeless at the time.
Every 40 seconds somebody dies by suicide
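These headline figures are mutually consistent, give or take rounding; as a rough back-of-the-envelope check (our arithmetic, using only the two statistics quoted above):

```latex
% One suicide every 40 seconds, accumulated over a year:
\[
  \frac{365 \times 24 \times 3600\ \text{s/yr}}{40\ \text{s per death}}
  \approx 7.9 \times 10^{5}\ \text{deaths per year},
\]
% in the same ballpark as the "1 million suicides each year" figure
% (1 million a year works out to one death every ~32 seconds).
```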
What the statistics do not tell us - and what psychologists most want to know - is exactly which people are most at risk. The vast majority of depressed, hopeless people do not commit suicide, so why do some do it?
In 2005, psychologist Thomas Joiner, a suicide specialist at Florida State University in Tallahassee whose own father committed suicide, set out to answer that question. By studying suicide statistics and paying particular attention to the groups with above average rates, Joiner believes he has found a common thread others have missed. "It was the first grand theory of suicide in quite a while," says Nock.
In essence, Joiner proposed that people who kill themselves must meet two sets of conditions on top of feeling depressed and hopeless. First, they must have a serious desire to die. This usually comes about when people feel they are an intolerable burden on others, while also feeling isolated from people who might provide a sense of belonging.
Second, and most important, people who succeed in killing themselves must be capable of doing the deed. This may sound obvious, but until Joiner pointed it out, no one had tried to figure out why some people are able to go through with it when most are not. No matter how seriously you want to die, Joiner says, it is not an easy thing to do. The self-preservation instinct is too strong.
There are two ways people who want to die develop the ability to override the self-preservation instinct, Joiner argues. One is by working up to it. In many cases a first suicide attempt is tentative, with shallow cuts or a mild overdose. It is only after multiple attempts that the actions are fatal.
20: number of failed suicide attempts for each successful one
The other is to become accustomed to painful or scary experiences. Soldiers and police who have been shot at or seen their colleagues injured or killed are known to become inured to the idea of their own death. Both groups also have a higher-than-normal suicide rate. Similarly, doctors and surgeons who witness pain, injury and death are more likely to be able to contemplate it for themselves - the suicide rate for doctors is significantly higher than for the general population. Joiner describes this as a "steeliness" in the face of things that would intimidate most people.
Another group that displays steeliness is people with anorexia. Joiner had noted their heightened suicide rate in his original work, Why People Die by Suicide (Harvard University Press, 2005), but it wasn't until later that he grasped the importance of the connection.
That realisation began to dawn in 2006, during a seminar in which two of Joiner's graduate students, Jill Holm-Denoma and Tracy Witte, were listening to him describe the risk of suicide among people with anorexia. Witte observed that the high suicide rate had two possible explanations. Perhaps people with anorexia were no more likely to attempt suicide than people with other mental disorders, but the anorexia had so weakened their bodies that their suicide attempts were more likely to succeed. Alternatively, perhaps anorexia had so inured them to pain that they were more capable than others of doing what was necessary to kill themselves.
According to Joiner's hypothesis, the second explanation should be correct. So Holm-Denoma set out to test the prediction. She examined nine suicides chosen randomly, and what she found told a very clear story.
"These people would have died regardless of their body weight," she says. "We were just astounded by the lengths to which they went to make sure they were successful." Three jumped in front of trains. Two hung themselves. Two took large drug overdoses. One poisoned herself with sleeping pills and toilet bowl cleaner. And one locked herself in a gas station restroom and set fire to a trash can that produced enough carbon monoxide to asphyxiate her. Nine cases, of course, are not enough to prove the point, but the fact that all took such drastic measures to kill themselves says something (Journal of Affective Disorders, vol 107, p 231).
Anorexia offers a "perfect storm" of the factors laid out in Joiner's hypothesis, Holm-Denoma says. Social isolation is likely because people with anorexia avoid any interactions that might involve food - so that means not going out for a meal, no movies (the popcorn might be too tempting) and no stopping by a friend's house. The result is the "thwarted belongingness" that Joiner describes as a key factor in suicide.
Then there is the feeling that they have become an intolerable burden to their family and friends. One popular approach to treating anorexia in children, for example, involves having a parent oversee their child full-time.
Most importantly, anorexia means becoming inured to pain. Merciless starvation leads to intense and painful hunger pangs and major headaches. Osteoporosis is common, making fractures more likely, not to mention the chest pains caused by heart damage. Kristi4's blogs in the month leading up to her suicide show this perfect storm at work.
It is one of the strengths of Joiner's explanation, says Nock, that it makes testable predictions such as the one spotted by Witte. For example, it should be possible to develop psychological tests to measure how much of a burden people feel they are, or how thwarted their sense of belonging is, and then use them to predict who will commit suicide. It should also be possible to examine rates of suicide among various groups with the characteristics Joiner is talking about.
Those tests are slowly taking shape. Recent work by some of Joiner's students has shown that people who feel they are a burden and also experience thwarted belongingness are more likely to have suicidal thoughts (Journal of Consulting and Clinical Psychology, vol 76, p 72). A second study found that "painful and provocative events", such as shooting a gun or getting into a fight, tend to increase something Joiner calls "acquired capability" - a person's capacity to hurt or kill themselves, which is measured with a written test.
Meanwhile, University of Minnesota psychiatrist Scott Crow has studied suicide rates among people with bulimia and found that they, too, kill themselves at a much higher rate than the general population. Crow has found a four to six-fold increase in suicides in this group. Bulimia starves the body at some level, as indicated by various biochemical markers, so people with bulimia may well be inured to pain in much the same way as those with anorexia.
60 per cent: increase in worldwide suicide rates since 1965
Even though the evidence is all pointing in the same direction, Joiner says many more tests will be needed before his ideas can be accepted as a general explanation for suicide. "It's a start," he says of the evidence assembled to date. "But we need something much more systematic."
Ultimately, he says, a better understanding of why people commit suicide should help clinicians better assess who is most at risk, and find new ways of preventing people from killing themselves. Long-term psychotherapy, for instance, could help chip away at a person's fearlessness and lessen the likelihood that they will commit suicide.
But as long as people steel themselves to pain, and as long as they feel isolated and a burden to others, Joiner's theory predicts that suicide will remain with us.
Robert Pool is a writer based in Tallahassee, Florida
Wednesday, February 25, 2009
Your Mental Movie Soundtrack: Brain hub links music, memory, and emotion
Intuitively, you probably already know that music is a powerful memory trigger. Just shuffle through your favorite playlists and you’ll probably land on a few nostalgic tunes that take you back to high school, your first love, or that perfect summer vacation. Now a recent study shows that music, memory, and emotion are linked in the brain, and the research could have implications for the treatment of Alzheimer’s patients.
Researchers at the University of California, Davis, mapped the brain activity of a group of students listening to music, and found that the region of the brain that supports and retrieves memories—the medial pre-frontal cortex (just behind the forehead)—is also a hub linking memories, emotion, and familiar music.
Petr Janata, the author of the study and an associate professor of psychology at UC Davis’s Center for Mind and Brain, had students listen to excerpts of songs that were popular when the students were growing up (between the ages of 8 and 18). Janata recorded their brain activity using functional MRI (which measures blood flow in the brain), and then compared the fMRI data with a survey the students took after listening to the music. He found that the songs linked with the most powerful memories corresponded with more activity in the upper (dorsal) part of the medial pre-frontal cortex.
“What seems to happen is that a piece of familiar music serves as a soundtrack for a mental movie that starts playing in our head,” Janata said. “It calls back memories of a particular person or place, and you might all of a sudden see that person’s face in your mind’s eye. Now we can see the association between those two things—music and memory.”
The medial pre-frontal cortex also happens to be the area of the brain last to atrophy in Alzheimer’s patients. And memories of autobiographically important music seem to be spared in people with the disease. Janata hopes to use his research to develop music-based therapy for patients.
“Providing patients with MP3 players and customized playlists could prove to be a quality-of-life improvement strategy that would be both effective and economical,” he said. So be kind: load up a soundtrack for granny and let her mental movie roll.
Sunday, February 22, 2009
'Anti-phobia pill' breaks link between memory and fear
Phobias and post-traumatic stress could be banished for good by taking a commonly prescribed drug for blood pressure.
Previous studies had suggested that people who experienced traumatic events such as rape and car crashes showed fewer signs of stress when recalling the event if they had first been injected with the beta blocker propranolol, but it was unclear whether the effect would be permanent or not. Fearful memories often return, even after people have been treated for them.
To investigate whether propranolol could stop fear returning in the longer term, Merel Kindt and her colleagues at the University of Amsterdam, the Netherlands, conditioned 60 healthy students to associate a picture of a spider with an electric shock, so that they would eventually be startled by the picture even in the absence of a shock.
However, if the conditioned students were given oral propranolol before seeing the picture, their startle response was eliminated. What's more, it didn't return when the students were put through a second round of conditioning that should have reinstated their fear - suggesting that the association may have been permanently broken.
Shock tactics
Those given a placebo pill could eventually be trained not to be startled by the spider picture, by repeatedly showing it to them in the absence of a shock.
A similar technique is often used in phobia clinics - exposing people with arachnophobia to spiders in a safe and calm environment, for example. However, "even after successful treatment of anxiety disorders many fears and phobias come back," says Kindt.
When those in the placebo group were given a series of electric shocks, their fear of the spider also returned, while those in the propranolol group continued to react calmly to the spider picture, suggesting that the association may have been permanently erased, or at least negated to such a point that it has no effect.
When people experience traumatic events, the body releases adrenaline - also called epinephrine - which affects an area of the brain involved in emotional processing called the amygdala, and makes an emotional connection to the memory.
Recurrence fear
Reliving the memory triggers further release of adrenaline, reinforcing the memory still further. Since propranolol blocks adrenaline receptors in the amygdala, Kindt believes it may also block this reinforcement process and break fear association.
"We can't prove that the memory has gone away, but it is at least weakened so much that we can't find it anymore," says Kindt.
However, Chris Brewin, a memory expert at University College London, UK, says the findings are interesting, but cautions that Kindt's group only tested the volunteers over the course of three days.
"The fear might come back if they tested them several weeks later," he says.
Also, Kindt only looked at the degree to which the volunteers were startled - but conditions like post-traumatic stress disorder often involve other emotions, such as anger and shame, and we don't know how propranolol would affect them, he says.
Journal reference: Nature Neuroscience
Why would a chimpanzee attack a human?
By Katherine Harmon
Earlier this week, a 14-year-old, 200-pound (90-kilogram) pet chimpanzee in Stamford, Conn., attacked a woman, mutilating her face and hands. The owner, Sandra Herold, who tried to stop the attack, was also injured and briefly hospitalized. The victim remains in critical condition.
The chimp, Travis, who was shot and killed by police officers at the scene, was apparently a friendly fixture around the neighborhood. He appeared in television commercials and had a sapiens-level CV that included using a computer, bathing and sipping wine from a stemmed glass, according to The New York Times. Reports, however, are starting to surface that Travis might have bitten another woman in 1996 and that Herold had been warned by animal control that her pet could be dangerous.
Chimpanzees, with a genetic profile that's 98 percent like ours, can seem like cute, hairy iterations of people. But periodic violent attacks on humans, including one in Havilah, Calif., in 2005 in which a man was maimed by two chimps at an animal sanctuary, are reminders that the animals have at least one big difference: brute strength.
So why would an allegedly acclimated chimpanzee turn on a human—especially one whom he had known? Travis was reportedly suffering from Lyme disease, caused by a tick-borne bacterium and known to cause fatigue, joint problems and mental difficulties—including trouble focusing and poor memory in humans. Some have suggested that the attack was spurred by Xanax, a prescription drug used to treat anxiety disorders in humans, with side effects that can—but rarely—include depression, confusion and problem behavior. Travis’ owner claims to have given him a Xanax-laced tea the day of the attack.
To find out more about chimpanzee attacks, we spoke with Frans de Waal, lead biologist from the Yerkes National Primate Research Center. He is affiliated with the Living Links Center at Emory University in Atlanta where he is a professor of psychology, and is also author of The New York Times notable book of the year, Our Inner Ape.
[An edited transcript of the interview follows.]
Are captive chimpanzee attacks on humans common?
Yeah, definitely common. Most of the time they attack through cage bars. They bite off fingers. It happens more often with people they don't know very well and people who aren't familiar with chimpanzees. But it has happened to many of the best scientists and researchers, who are now missing digits. The reason we have them behind bars in zoos and research settings is because chimpanzees can be very dangerous—it's to protect ourselves. This was a sort of free-ranging chimp, which is much more dangerous.
But chimps in the wild are not used to people—they're afraid of them. That's why Jane Goodall had to habituate them. So, really wild chimps don't attack people. But in captivity, they have learned in the meantime that they are stronger than humans.
How strong are they?
The chimpanzee has strength that, for a human, is utterly incomprehensible. People watch pro wrestlers on TV and think they are strong. But a pro wrestler would not be able to hold a chimpanzee still if they wanted to. Chimpanzee males have been measured as having five times the arm strength of a human male.
Even a young chimpanzee of four or five years, you could not hold it still if you wanted to. Pound-for-pound, their muscles are much stronger. And the adult males, like Travis—unless his were filed down—have big canine teeth. So you have a very dangerous creature in front of you that is impossible to control.
Do chimps in captivity show more aggressive behavior than those in the wild?
In the wild they're pretty aggressive. They have warfare among groups, where males kill other males, and they have been known to commit infanticide. Aggression is a common part of chimpanzee behavior, whether it's between or within groups.
They can show tremendous mutilation. They go for the face; they go for the hands and feet; they go for the testicles. To outsiders, they have very nasty behaviors.
Are male chimpanzees more aggressive than females?
Yes, that's for sure.
What might cause a chimp to attack someone it knows?
They're very complex creatures. People must not assume that with someone they already know there's not some underlying tension. It's often impossible to figure out what reason they have for attacking.
Having a chimp in your home is like having a tiger in your home. It's not really very different. They are both very dangerous.
Do you think Lyme disease or the Xanax might have been a factor in the attack?
It's all possible. It's possible it was the Xanax. In general, in chimpanzees—because they are so genetically close to us—they will react very similarly to drugs. It might be that the dosages are different, but it really should be pretty much the same.
A chimp in your home is like a time bomb. It may go off for a reason that we may never understand. I don't know any chimp relationship that has been harmonious. Usually these animals end up in a cage. They cannot be controlled.
When a chimp is young, they're very cute and affectionate and funny and playful. There's a lot of appeal. But that's like a tiger cub—they're also a lot of fun to have.
What happens when people decide they can't live with a chimpanzee pet any longer?
There are chimpanzee sanctuaries. If you want to put a chimp in a sanctuary, I would think you would have to come with a lot of money—it's pretty much for lifelong maintenance. A chimp can live for about 50 years, and 10 is usually the age when people don't want them any more. So that's 40 years of care.
I don't know where people would find these animals or why you would want to have them. Even if a chimp were not dangerous, you have to wonder if the chimp is happy in a human household environment.
Thursday, February 19, 2009
Six Ways to Boost Brainpower
By Emily Anthes
Amputees sometimes experience phantom limb sensations, feeling pain, itching or other impulses coming from limbs that no longer exist. Neuroscientist Vilayanur S. Ramachandran worked with patients who had so-called phantom limbs, including Tom, a man who had lost one of his arms.
Ramachandran discovered that if he stroked Tom’s face, Tom felt like his missing fingers were also being touched. Each part of the body is represented by a different region of the somatosensory cortex, and, as it happens, the region for the hand is adjacent to the region for the face. The neuroscientist deduced that a remarkable change had taken place in Tom’s somatosensory cortex.
Ramachandran concluded that because Tom’s cortex was no longer getting input from his missing hand, the region processing sensation from his face had slowly taken over the hand’s territory. So touching Tom’s face produced sensation in his nonexistent fingers.
This kind of rewiring is an example of neuroplasticity, the adult brain’s ability to change and remold itself. Scientists are finding that the adult brain is far more malleable than they once thought. Our behavior and environment can cause substantial rewiring of the brain or a reorganization of its functions and where they are located. Some believe that even our patterns of thinking alone are enough to reshape the brain.
Researchers now know that neurogenesis (the birth of new neurons) is a normal feature of the adult brain. Studies have shown that one of the most active regions for neurogenesis is the hippocampus, a structure that is vitally important for learning and long-term memory.
Neurogenesis also takes place in the olfactory bulb, which is involved in processing smells. But not all the neurons that are born survive; in fact, most of them die. To survive, the new cells need nutrients and connections with other neurons that are already thriving. Scientists are currently identifying the factors that affect the rate of neurogenesis and the survival of new cells. Mental and physical exercise, for instance, both boost neuron survival.
METHOD 1: EXERCISE
Mice that run on wheels increase the number of neurons in their hippocampus and perform better on tests of learning and memory. Studies of humans have revealed that exercise can improve the brain’s executive functions (planning, organizing, multitasking, and more). Exercise is also well known for its mood-boosting effects, and people who exercise are less likely to get dementia as they age. Among older people, athletic senior citizens have better executive function than do those who are sedentary; even seniors who have spent their entire lives on the couch can improve these abilities just by starting to move more in their golden years.
A variety of mechanisms might be responsible for this brain boost. Exercise increases blood flow to the brain, which also increases the delivery of oxygen, fuel and nutrients to those hard-working neurons. Research has shown that exercise can increase levels of a substance called brain-derived neurotrophic factor (BDNF), which encourages growth, communication and survival of neurons.
Of course, all this research does nothing to help explain dumb jocks.
On the Frontier
New research suggests a little music can make your workout even better. Volunteers completed two workout sessions. In one, they sweated to the sweet sound of silence; in the other, they listened to Vivaldi’s Four Seasons. After each workout, participants completed assessments of their mood and verbal skills. Exercise alone was enough to boost both, but verbal scores improved twice as much when the exercisers had tunes to listen to. Maybe you can get your insurance company to pay for a new iPod.
Cocktail Party Tidbits
>> Exercise also improves sleep quality, a pile of studies suggests. And immune function. Is there anything it can’t do?
>> You don’t need to be Chuck Norris (thankfully) to get the brain benefits of exercise. Studies of senior citizens have shown that as little as 20 minutes of walking a day can do the trick.
METHOD 2: DIET
The brain needs fuel just as the body does. So what will really boost your brainpower, and what will make you lose your mind? Saturated fat, that familiar culprit, is no better for the brain than it is for the body. Rats fed diets high in saturated fat underperformed on tests of learning and memory, and humans who live on such diets seem to be at increased risk for dementia.
Not all fat is bad news, however. The brain is mostly fat—all those cell membranes and myelin coverings require fatty acids—so it is important to eat certain fats, particularly omega-3 fats, which are found in fish, nuts and seeds. Alzheimer’s disease, depression, schizophrenia and other disorders may be associated with low levels of omega-3 fatty acids.
Fruits and vegetables also appear to be brain superfoods. Produce is high in substances called antioxidants, which counteract the free radicals that can damage brain cells. Researchers have found that high-antioxidant diets keep learning and memory sharp in aging rats and even reduce the brain damage caused by strokes. That’s food for thought.
On the Frontier
It’s not just what you eat that affects the brain. It’s also how much. Research has shown that laboratory animals fed calorie-restricted diets—anywhere from 25 to 50 percent less than normal—live longer than other animals do. And it turns out they also have improved brain function, performing better on tests of memory and coordination. Rodents on calorie-restricted diets are also better able to resist the damage that accompanies Alzheimer’s, Parkinson’s and Huntington’s disease.
Cocktail Party Tidbits
>> Some of the best brain foods: walnuts, blueberries and spinach.
>> It is especially important that babies get enough fat. Babies who don’t get enough of the stuff have trouble creating the fatty myelin insulation that helps neurons transmit signals. Luckily for babies, breast milk is 50 percent fat.
>> Populations that traditionally eat diets high in omega-3 fatty acids tend to have lower rates of disorders of the central nervous system.
METHOD 3: STIMULANTS
Stimulants are substances that rev up the nervous system, increasing heart rate, blood pressure, energy, breathing and more. Caffeine is probably the most famous of the group. (It is actually the most widely used “drug” in the world.) By activating the central nervous system, caffeine boosts arousal and alertness. In high doses, though, this stimulation can go too far, causing jitters, anxiety and insomnia.
Cocaine and amphetamines are less benign. Although they work on the brain through different mechanisms, they have similar effects. Taking them increases the release of some of the brain’s feel-good neurotransmitters—including dopamine and serotonin—and produces a rush of euphoria. They also increase alertness and energy.
That all sounds pretty good, but cocaine and amphetamines are extremely addictive drugs, and in high doses they can cause psychosis. The withdrawal symptoms are nasty, too, and can lead to depression, the opposite of that euphoric feeling. And of course, an overdose can kill you.
On the Frontier
Although high doses of caffeine can undoubtedly have unpleasant effects (ranging from irritability to, in rare cases, the most unpleasant of all: death), small to moderate amounts can boost our mental functioning in ways researchers are now measuring.
One study showed that the equivalent of two cups of coffee can boost short-term memory and reaction time. Functional MRI scans taken during the study also revealed that volunteers who had been given caffeine had increased activity in the brain regions involved in attention. In addition, research suggests caffeine can protect against age-related memory decline in older women.
Cocktail Party Tidbits
>> Three quarters of the caffeine we ingest comes from coffee. Try to limit yourself to fewer than 100 cups a day. That much coffee contains about 10 grams of caffeine, enough to cause fatal complications.
>> One of fiction’s most famous stimulant users is the great caper cracker Sherlock Holmes. Many of the detective’s capers include descriptions of the relief he found from injecting cocaine. It must be tough to make sure justice is done.
METHOD 4: VIDEO GAMES
Video games could save your life. Surgeons who spend at least a few hours a week playing video games make one-third fewer errors in the operating room than nongaming doctors do. Indeed, research has shown that video games can improve mental dexterity, while boosting hand-eye coordination, depth perception and pattern recognition. Gamers also have better attention spans and information-processing skills than the average Joe has. When nongamers agree to spend a week playing video games (in the name of science, of course), their visual-perception skills improve. And scrap your notions of gamers as outcasts: one researcher found that white-collar professionals who play video games are more confident and social.
Of course, we cannot talk about the effects of video games without mentioning the popular theory that they are responsible for increasing real-world violence. A number of studies have reinforced this link. Young men who play a lot of violent video games have brains that are less responsive to graphic images, suggesting that these gamers have become desensitized to such depictions. Another study revealed that gamers had patterns of brain activity consistent with aggression while playing first-person shooter games.
This does not necessarily mean these players will actually be violent in real life. The connections are worth exploring, but so far the data do not support the idea that the rise of video games is responsible for increased youth violence.
On the Frontier
Video games activate the brain’s reward circuits but do so much more in men than in women, according to a new study. Researchers hooked men and women up to functional MRI machines while the participants played a video game designed for the study. Both groups performed well, but the men showed more activity in the limbic system, which is associated with reward processing. What is more, the men showed greater connectivity between the structures that make up the reward circuit, and the better this connection was in a particular player, the better he performed. There was no such correlation in women. Men are more than twice as likely as women are to say they feel addicted to video games.
Cocktail Party Tidbits
>> Video games are a $10-billion industry in the U.S.
>> In 2003 a 16-year-old boy shot and killed two police officers and a police dispatcher. Two years later the families of the victims filed suit against the company that made the massively popular video game Grand Theft Auto, alleging that the perpetrator was inspired by his obsession with the controversial game.
METHOD 5: MUSIC
When you turn on Queen’s Greatest Hits, the auditory cortex analyzes the many components of the music: volume, pitch, timbre, melody and rhythm. But there’s more to music’s interaction with the brain than just the raw sound. Music can also activate your brain’s reward centers and depress activity in the amygdala, reducing fear and other negative emotions.
A highly publicized study suggested that listening to Mozart could boost cognitive performance, inspiring parents everywhere to go out and buy classical CDs for their children. The idea of a “Mozart effect” remains popular, but the original study has been somewhat discredited, and any intellectual boost that comes from listening to music seems to be tiny and temporary. Nevertheless, music does seem to possess some good vibrations. It can treat anxiety and insomnia, lower blood pressure, soothe patients with dementia, and help premature babies to gain weight and leave the hospital sooner.
Music training can bolster the brain. The motor cortex, cerebellum and corpus callosum (which connects the brain’s two sides) are all bigger in musicians than in nonmusicians. And string players have more of their sensory cortices devoted to their fingers than do those who don’t play the instruments. There is no agreement yet on whether musical training makes you smarter, but some studies have indeed shown that music lessons can improve the spatial abilities of young kids.
On the Frontier
Music lessons and practice during childhood increase the sensitivity of the brain stem to the sounds of human speech. According to a recent study, the brain stem is involved in very basic encoding of sound, and lots of exposure to music can help fine-tune this system, even in kids without particular musical gifts.
So buck up, tone-deaf children of the world! Think of it like eating vegetables: chewing on that clarinet is good for you.
Cocktail Party Tidbits
>> The auditory cortex is activated by singing a song in your head. The visual cortex is activated by merely imagining a musical score.
>> Playing classical and soothing music can increase the milk yield of dairy cows.
METHOD 6: MEDITATION
Forget apples. If reams of scientific studies are to be believed (and such studies usually are), an om a day can keep the doctor away. Meditation, or the turning of the mind inward for contemplation and relaxation, seems to help all types of conditions—anxiety disorders, sure, but it can also reduce pain and treat high blood pressure, asthma, insomnia, diabetes, depression and even skin conditions.
And regular meditators say they feel more at ease and more creative than nonmeditators do.
Researchers are now illuminating the actual brain changes caused by meditation by sticking meditators into brain-imaging machines. For one, although the brain’s cells typically fire at all different times, during meditation they fire in synchrony. Expert meditators also show spikes of brain activity in the left prefrontal cortex, an area of the brain that has generally been associated with positive emotions. And those who had the most activity in this area during meditation also had big boosts in immune system functioning.
Meditation can increase the thickness of the cerebral cortex, particularly in regions associated with attention and sensation. (The growth does not seem to result from the cortex growing new neurons, though—it appears that the neurons already there make more connections, the number of support cells increases, and blood vessels in that area get bigger.)
On the Frontier
Meditation can increase focus and attention, improving performance on cognitive tasks. Researchers spent three months training volunteers in the practice of Vipassana meditation, which centers on minimizing distractions. Then the volunteers were asked to perform a task in which they had to pick a few numbers out of a stream of letters. People who had undergone meditation training were much better at identifying numbers that briefly flashed onto a computer screen. They also seemed to be able to do this without exerting as much mental energy.
Thursday, February 12, 2009
Vaccines didn't cause autism, court rules
The ruling came from a panel of "special masters" who began hearing three test cases in 2007 involving children with autism -- a disorder that their parents contend was triggered by the vaccine against measles, mumps and rubella combined with vaccines containing thimerosal, a preservative containing mercury.
Three families -- the Cedillos, the Hazlehursts and the Snyders -- sought compensation from the Vaccine Injury Compensation Program, but the panel ruled that they had not presented sufficient evidence to prove that the childhood vaccines caused autism in their children.
"I feel deep sympathy and admiration for the Cedillo family," Special Master George L. Hastings Jr. wrote in his ruling in the case involving 14-year-old Michelle Cedillo, who cannot speak, wears a diaper and requires round-the-clock monitoring in case she has a seizure.
"And I have no doubt that the families of countless other autistic children, families that cope every day with the tremendous challenges of caring for autistic children, are similarly deserving of sympathy and admiration. However, I must decide this case not on sentiment, but by analyzing the evidence," Hastings wrote. "In this case the evidence advanced by the petitioners has fallen far short of demonstrating such a link." VideoDr.
In a statement shortly after the release of the decisions, the U.S. Department of Health and Human Services said it continues to support research "to better understand the cause of autistic disorders and develop more effective methods of treatment."
However, "the medical and scientific communities ... have found no association between vaccines and autism."
"Hopefully, the determination by the Special Masters will help reassure parents that vaccines do not cause autism," the statement said.
Since 2001, thousands of parents of autistic children have filed petitions seeking compensation from the VICP at HHS.
By mid-2008, more than 5,300 cases had been filed in the program -- and 5,000 of those were still awaiting adjudication, according to the agency.
A litigation steering committee is representing thousands of families that fall into three categories: those that claim that the measles, mumps and rubella vaccine can combine with thimerosal-containing vaccines to cause autism; those who claim thimerosal-containing vaccines alone can cause autism; and those who claim that MMR vaccines, without any link to thimerosal, can cause autism.
Tuesday, February 10, 2009
Brain scan may reveal risk for Alzheimer's disease
(CNN) -- Key structural changes have been identified in the brain images of some patients with mild cognitive impairment, which could help determine who's at greatest risk for developing Alzheimer's disease.
Researchers at the University of California, San Diego, studied MRI scans of 84 patients with Alzheimer's disease, 175 patients with mild cognitive impairment, or MCI, and 139 images of healthy brains.
"Our initial goal was to locate similarities in the patients with Alzheimer's disease to those with MCI, in the hopes of finding a method to predict [MCI patients'] likelihood of developing the disease," said lead study author Linda McEvoy, assistant project scientist at UCSD's department of radiology.
Neuroimaging results for the patients with Alzheimer's disease were as expected, according to the study, which was published online in the journal Radiology. Atrophy, which is loss of brain tissue, was visible throughout the brain. The temporal and parietal lobes, which affect cognitive function, showed the most damage.
What surprised researchers were the differences in images from the MCI patients. More than 50 percent of the brains in the MCI group showed atrophy similar to that of the Alzheimer's disease patients. The other half of the MCI patients showed only small amounts of tissue damage.
"Although the symptoms for the entire MCI group were primarily memory problems, other parts of the brain were impacted in over half the group," McEvoy said. "And even though these patients [with Alzheimer's-like atrophy] don't have problems with their cognitive function now, their MCI will likely develop to that in the future."
Researchers also evaluated the brains of the MCI group one year after initial testing. They found that patients who earlier had mild cognitive impairment plus signs of atrophy were getting worse. Twenty-nine percent of the group had since been diagnosed with Alzheimer's disease and the others had begun to show signs of more serious cognitive decline.
The condition of patients in the MCI group whose scans showed minimal signs of atrophy the previous year remained about the same. "Only 8 percent of this group had developed Alzheimer's disease. The rest of the patients were stable and their symptoms had not increased," McEvoy said.
Bill Thies, chief medical and scientific officer for the Alzheimer's Association, underscored the significance of these findings. "What this study really shows is how different people with MCI can be, despite having similar symptoms. We can now use this information to create new treatments," he said.
There are several drugs on the market that treat the symptoms of Alzheimer's disease, but none that prevent its progression.
Clinical trials may be able to use this data to select a better pool of candidates when testing new drugs. "If they use an MCI patient with loss of brain tissue, someone who we now know is progressing fast towards Alzheimer's disease, we'd be able to quickly figure out if drug 'X' is slowing things down or not helping at all," Thies added.
In addition, researchers hope that within the next few years patients could regularly be tested by their physicians to determine their risk of developing Alzheimer's. "If nothing else it would be good information for their family members to have early on, to be better prepared for the future," McEvoy said.
Over 5 million Americans have Alzheimer's disease, and an estimated 3.5 million have mild cognitive impairment.
Saturday, February 7, 2009
Born believers: How your brain creates God
This anomaly was documented in the early 1970s, but only now is science beginning to tell us why. It turns out that human beings have a natural inclination for religious belief, especially during hard times. Our brains effortlessly conjure up an imaginary world of spirits, gods and monsters, and the more insecure we feel, the harder it is to resist the pull of this supernatural world. It seems that our minds are finely tuned to believe in gods.
Religious ideas are common to all cultures: like language and music, they seem to be part of what it is to be human. Until recently, science has largely shied away from asking why. "It's not that religion is not important," says Paul Bloom, a psychologist at Yale University, "it's that the taboo nature of the topic has meant there has been little progress."
The origin of religious belief is something of a mystery, but in recent years scientists have started to make suggestions. One leading idea is that religion is an evolutionary adaptation that makes people more likely to survive and pass their genes on to the next generation. In this view, shared religious belief helped our ancestors form tightly knit groups that cooperated in hunting, foraging and childcare, enabling these groups to outcompete others. In this way, the theory goes, religion was selected for by evolution, and eventually permeated every human society (New Scientist, 28 January 2006, p 30).
The religion-as-an-adaptation theory doesn't wash with everybody, however. As anthropologist Scott Atran of the University of Michigan in Ann Arbor points out, the benefits of holding such unfounded beliefs are questionable, in terms of evolutionary fitness. "I don't think the idea makes much sense, given the kinds of things you find in religion," he says. A belief in life after death, for example, is hardly compatible with surviving in the here-and-now and propagating your genes. Moreover, if there are adaptive advantages of religion, they do not explain its origin, but simply how it spread.
An alternative being put forward by Atran and others is that religion emerges as a natural by-product of the way the human mind works.
That's not to say that the human brain has a "god module" in the same way that it has a language module that evolved specifically for acquiring language. Rather, some of the unique cognitive capacities that have made us so successful as a species also work together to create a tendency for supernatural thinking. "There's now a lot of evidence that some of the foundations for our religious beliefs are hard-wired," says Bloom.
Much of that evidence comes from experiments carried out on children, who are seen as revealing a "default state" of the mind that persists, albeit in modified form, into adulthood. "Children the world over have a strong natural receptivity to believing in gods because of the way their minds work, and this early developing receptivity continues to anchor our intuitive thinking throughout life," says anthropologist Justin Barrett of the University of Oxford.
So how does the brain conjure up gods? One of the key factors, says Bloom, is the fact that our brains have separate cognitive systems for dealing with living things - things with minds, or at least volition - and inanimate objects.
This separation happens very early in life. Bloom and colleagues have shown that babies as young as five months make a distinction between inanimate objects and people. Shown a box moving in a stop-start way, babies show surprise. But a person moving in the same way elicits no surprise. To babies, objects ought to obey the laws of physics and move in a predictable way. People, on the other hand, have their own intentions and goals, and move however they choose.
Mind and matter
Bloom says the two systems are autonomous, leaving us with two viewpoints on the world: one that deals with minds, and one that handles physical aspects of the world. He calls this innate assumption that mind and matter are distinct "common-sense dualism". The body is for physical processes, like eating and moving, while the mind carries our consciousness in a separate - and separable - package. "We very naturally accept you can leave your body in a dream, or in astral projection or some sort of magic," Bloom says. "These are universal views."
There is plenty of evidence that thinking about disembodied minds comes naturally. People readily form relationships with non-existent others: roughly half of all 4-year-olds have had an imaginary friend, and adults often form and maintain relationships with dead relatives, fictional characters and fantasy partners. As Barrett points out, this is an evolutionarily useful skill. Without it we would be unable to maintain large social hierarchies and alliances or anticipate what an unseen enemy might be planning. "Requiring a body around to think about its mind would be a great liability," he says.
Useful as it is, common-sense dualism also appears to prime the brain for supernatural concepts such as life after death. In 2004, Jesse Bering of Queen's University Belfast, UK, put on a puppet show for a group of pre-school children. During the show, an alligator ate a mouse. The researchers then asked the children questions about the physical existence of the mouse, such as: "Can the mouse still be sick? Does it need to eat or drink?" The children said no. But when asked more "spiritual" questions, such as "does the mouse think and know things?", the children answered yes.
Default to god
Based on these and other experiments, Bering considers a belief in some form of life apart from that experienced in the body to be the default setting of the human brain. Education and experience teach us to override it, but it never truly leaves us, he says. From there it is only a short step to conceptualising spirits, dead ancestors and, of course, gods, says Pascal Boyer, a psychologist at Washington University in St Louis, Missouri. Boyer points out that people expect their gods' minds to work very much like human minds, suggesting they spring from the same brain system that enables us to think about absent or non-existent people.
The ability to conceive of gods, however, is not sufficient to give rise to religion. The mind has another essential attribute: an overdeveloped sense of cause and effect which primes us to see purpose and design everywhere, even where there is none. "You see bushes rustle, you assume there's somebody or something there," Bloom says.
This over-attribution of cause and effect probably evolved for survival. If there are predators around, it is no good spotting them 9 times out of 10. Running away when you don't have to is a small price to pay for avoiding danger when the threat is real.
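To see why selection should favour over-detection, consider a toy expected-cost calculation (our illustration; the article itself gives no numbers):

```latex
% Let p be the probability that a rustling bush hides a predator,
% c the small cost of fleeing needlessly, and C the much larger
% cost of ignoring a real predator (C >> c).
% Always fleeing costs c per rustle; never fleeing costs pC on average,
% so over-detection is the cheaper strategy whenever
\[
  c < pC \quad\Longleftrightarrow\quad \frac{c}{C} < p .
\]
% For example, with c = 1, C = 1000 and p = 0.01: 1 < 10,
% so fleeing at every rustle wins even though 99 per cent
% of the alarms are false.
```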
Again, experiments on young children reveal this default state of the mind. Children as young as three readily attribute design and purpose to inanimate objects. When Deborah Kelemen of the University of Arizona in Tucson asked 7- and 8-year-old children questions about inanimate objects and animals, she found that most believed they were created for a specific purpose. Pointy rocks are there for animals to scratch themselves on. Birds exist "to make nice music", while rivers exist so boats have something to float on. "It was extraordinary to hear children saying that things like mountains and clouds were 'for' a purpose and appearing highly resistant to any counter-suggestion," says Kelemen.
In similar experiments, Olivera Petrovich of the University of Oxford asked pre-school children about the origins of natural things such as plants and animals. She found they were seven times more likely to answer that they were made by god than made by people.
These cognitive biases are so strong, says Petrovich, that children tend to spontaneously invent the concept of god without adult intervention: "They rely on their everyday experience of the physical world and construct the concept of god on the basis of this experience." Because of this, when children hear the claims of religion they seem to make perfect sense.
Our predisposition to believe in a supernatural world stays with us as we get older. Kelemen has found that adults are just as inclined to see design and intention where there is none. Put under pressure to explain natural phenomena, adults often fall back on teleological arguments, such as "trees produce oxygen so that animals can breathe" or "the sun is hot because warmth nurtures life". Though she doesn't yet have evidence that this tendency is linked to belief in god, Kelemen does have results showing that most adults tacitly believe they have souls.
Boyer is keen to point out that religious adults are not childish or weak-minded. Studies reveal that religious adults have very different mindsets from children, concentrating more on the moral dimensions of their faith and less on its supernatural attributes.
Even so, religion is an inescapable artefact of the wiring in our brain, says Bloom. "All humans possess the brain circuitry and that never goes away." Petrovich adds that even adults who describe themselves as atheists and agnostics are prone to supernatural thinking. Bering has seen this too. When one of his students carried out interviews with atheists, it became clear that they often tacitly attribute purpose to significant or traumatic moments in their lives, as if some agency were intervening to make it happen. "They don't completely exorcise the ghost of god - they just muzzle it," Bering says.
The fact that trauma is so often responsible for these slips gives a clue as to why adults find it so difficult to jettison their innate belief in gods, Atran says. The problem is something he calls "the tragedy of cognition". Humans can anticipate future events, remember the past and conceive of how things could go wrong - including their own death, which is hard to deal with. "You've got to figure out a solution, otherwise you're overwhelmed," Atran says. When natural brain processes give us a get-out-of-jail card, we take it.
That view is backed up by an experiment published late last year (Science, vol 322, p 115). Jennifer Whitson of the University of Texas in Austin and Adam Galinsky of Northwestern University in Evanston, Illinois, asked people what patterns they could see in arrangements of dots or stock market information. Before asking, Whitson and Galinsky made half their participants feel a lack of control, either by giving them feedback unrelated to their performance or by having them recall experiences where they had lost control of a situation.
The results were striking. The subjects who sensed a loss of control were much more likely to see patterns where there were none. "We were surprised that the phenomenon is as widespread as it is," Whitson says. What's going on, she suggests, is that when we feel a lack of control we fall back on superstitious ways of thinking. That would explain why religions enjoy a revival during hard times.
So if religion is a natural consequence of how our brains work, where does that leave god? All the researchers involved stress that none of this says anything about the existence or otherwise of gods: as Barrett points out, whether or not a belief is true is independent of why people believe it.
It does, however, suggest that god isn't going away, and that atheism will always be a hard sell. Religious belief is the "path of least resistance", says Boyer, while disbelief requires effort.
These findings also challenge the idea that religion is an adaptation. "Yes, religion helps create large societies - and once you have large societies you can outcompete groups that don't," Atran says. "But it arises as an artefact of the ability to build fictive worlds. I don't think there's an adaptation for religion any more than there's an adaptation to make airplanes."
Supporters of the adaptation hypothesis, however, say that the two ideas are not mutually exclusive. As David Sloan Wilson of Binghamton University in New York state points out, elements of religious belief could have arisen as a by-product of brain evolution, but religion per se was selected for because it promotes group survival. "Most adaptations are built from previous structures," he says. "Boyer's basic thesis and my basic thesis could both be correct."
Robin Dunbar of the University of Oxford - the researcher most strongly identified with the religion-as-adaptation argument - also has no problem with the idea that religion co-opts brain circuits that evolved for something else. Richard Dawkins, too, sees the two camps as compatible. "Why shouldn't both be correct?" he says. "I actually think they are."
Ultimately, discovering the true origins of something as complex as religion will be difficult. There is one experiment, however, that could go a long way to proving whether Boyer, Bloom and the rest are onto something profound. Ethical issues mean it won't be done any time soon, but that hasn't stopped people speculating about the outcome.
It goes something like this: left to their own devices, children spontaneously create their own "creole" languages using hard-wired linguistic brain circuits. A similar experiment would provide our best test of humans' innate religious inclinations. Would a group of children raised in isolation spontaneously create their own religious beliefs? "I think the answer is yes," says Bloom.
God of the gullible
In The God Delusion, Richard Dawkins argues that religion is propagated through indoctrination, especially of children. Evolution predisposes children to swallow whatever their parents and tribal elders tell them, he argues, as trusting obedience is valuable for survival. This also leads to what Dawkins calls "slavish gullibility" in the face of religious claims.
If children have an innate belief in god, however, where does that leave the indoctrination hypothesis? "I am thoroughly happy with believing that children are predisposed to believe in invisible gods - I always was," says Dawkins. "But I also find the indoctrination hypothesis plausible. The two influences could, and I suspect do, reinforce one another." He suggests that evolved gullibility converts a child's general predisposition to believe in god into a specific belief in the god (or gods) their parents worship.
The Origins of Suicidal Brains
By Melinda Wenner
Suicide rates in the U.S. have increased for the first time in a decade, according to a report published in October by the Johns Hopkins Bloomberg School of Public Health. But what leads a person to commit suicide? Three new studies suggest that the brain of a suicide victim differs markedly from other brains and that these differences develop over the course of a lifetime.
The most common pathway to suicide is through depression, which afflicts two thirds of all people who kill themselves. In October researchers in Canada found that depressed people who commit suicide have an abnormal distribution of receptors for the chemical GABA, one of the most abundant neurotransmitters in the brain. GABA’s role is to inhibit neuron activity. “If you think about the gas pedal and brakes on a car, GABA is the brakes,” explains co-author Michael Poulter, a neuroscientist at the Robarts Research Institute at the University of Western Ontario.
Poulter and his colleagues found that one of the thousands of types of receptors for GABA is underrepresented in the frontopolar cortex of people with major depressive disorder who have committed suicide as compared with nondepressed people who died of other causes. The frontopolar cortex is involved in higher-order thinking, such as decision making. The scientists do not yet know how this abnormality leads to the type of major depression that makes someone suicidal, but “anything that disturbs that system would be predicted to have some sort of important outcome,” Poulter says.
Interestingly, this GABA receptor problem is not the result of abnormal or mutated genes. Rather the change is epigenetic, meaning some environmental influence affected how often the relevant genes were expressed—that is, made into proteins. [For more about epigenetics, see “The New Genetics of Mental Illness,” by Edmund S. Higgins; Scientific American Mind, June/July 2008.] In the frontopolar cortex of suicide brains, the gene for the GABA-A receptor often had a molecule called a methyl group attached to it, the team found. When a methyl group is attached to a gene, it keeps that gene hidden from cells’ protein-building machinery—in this case, preventing the cells from manufacturing GABA-A receptors.
The addition of this methyl tag, called methylation, occurs more extensively in rodents that are handled by humans than in rodents that are not. Less is known about what causes methylation in the human brain, but another recent study suggests it could be related to abuse during childhood. In May researchers at McGill University reported that the gene responsible for creating cells’ protein-building machinery is more frequently methylated in the hippocampus—the brain region responsible for short-term memory and spatial navigation—of depressed suicide victims who suffered child abuse than in the brains of nonsuicide victims who were not abused.
Again, the researchers do not yet know how problems with protein-building machinery lead to depression and suicide. But “it makes sense that if you have some limited capacity for protein synthesis, you gradually are depriving yourself of building critical synapses,” or connections between neurons, which could be important for staying happy, says co-author Moshe Szyf, a pharmacologist at McGill. “Our hypothesis is that there are social events early in life that kind of epigenetically program the brain,” he says. He and his colleagues are now comparing the brains of suicide victims who were abused with those of suicide victims who were not abused to see if their methylation patterns differ.
Even in the womb, epigenetic influences can change the developing brain in ways that increase the risk of eventual suicide. In February 2008 a study revealed that baby boys who are born either short or with low birth weight are more likely to commit violent suicide as adults than longer and heavier babies are, irrespective of their height and weight as adults. Similarly, baby boys born prematurely are four times more likely to attempt violent suicide than those born at full term.
The researchers, publishing in the Journal of Epidemiology and Community Health, suggest that the chemical serotonin, which is involved in fetal brain growth, may play a role. A stressful or deprived womb environment may interfere with the development of the fetus and its serotonin system; other studies have shown that the brains of people who exhibit suicidal behaviors have reduced serotonin activity.
Ultimately, these findings reveal that suicide brains differ from other brains in multiple ways—in other words, “we’re really dealing with some sort of biological imbalance,” Poulter says. “It’s not an attitude problem.” And because epigenetic changes typically occur early in life, it may one day be possible to identify young people at risk for suicide by studying their methylation patterns and then to treat them with drugs that regulate this mechanism, Szyf notes.
Note: This article was originally printed with the title, "The Suicidal Brain".
Monday, February 2, 2009
Exploring the Folds of the Brain--And Their Links to Autism
New studies are revealing how the brain's convolutions take shape. The findings could aid the diagnosis and treatment of autism, schizophrenia and other mental disorders
By Claus C. Hilgetag and Helen Barbas
One of the first things people notice about the human brain is its intricate landscape of hills and valleys. These convolutions derive from the cerebral cortex, a two- to four-millimeter-thick mantle of gelatinous tissue, packed with neurons, that is sometimes called gray matter and that mediates our perceptions, thoughts, emotions and actions. Other large-brained mammals such as whales, dogs and our great ape cousins have a corrugated cortex, too, each with its own characteristic pattern of convolutions. But small-brained mammals and other vertebrates have relatively smooth brains. The cortex of large-brained mammals expanded considerably over the course of evolution, much more so than the skull. Indeed, the surface area of a flattened human cortex (equivalent to that of an extra-large pizza) is three times larger than the inner surface of the braincase. Thus, the only way the cortex of humans and other brainy species can fit into the skull is by folding.
This folding is not random, as in a crumpled piece of paper. Rather it exhibits a pattern that is consistent from person to person. How does this folding occur in the first place? And what, if anything, can the resulting topography reveal about brain function? New research indicates that a network of nerve fibers physically pulls the pliable cortex into shape during development and holds it in place throughout life. Disturbances to this network during development or later, as a result of a stroke or injury, can have far-reaching consequences for brain shape and neural communication. These discoveries could therefore lead to new strategies for diagnosing and treating patients with certain mental disorders.
Scientists have pondered the brain's intricate form for centuries. In the early 1800s German physician Franz Joseph Gall proposed that the shape of a person's brain and skull spoke volumes about that individual's intelligence and personality, a theory known as phrenology. This influential, albeit scientifically unsupported, idea led to the collection and study of "criminal," "degenerate" and "genius" brains. Then, in the latter part of the 19th century, Swiss anatomist Wilhelm His posited that the brain develops as a sequence of events guided by physical forces. British polymath D'Arcy Thompson built on that foundation, showing that the shapes of many structures, biological and inanimate, result from physical self-organization.
Provocative though they were, these early suppositions eventually faded from view. Phrenology became known as a pseudoscience, and modern genetic theories eclipsed the biomechanical approach to understanding brain structure. Recently, however, evidence from novel brain-imaging techniques, aided by sophisticated computational analyses, has lent fresh support to some of those 19th-century notions.
Hints that His and Thompson were on the right track with their ideas about physical forces shaping biological structures surfaced in 1997. Neurobiologist David Van Essen of Washington University in St. Louis published a hypothesis in Nature in which he suggested that the nerve fibers that link different regions of the cortex, thereby enabling them to communicate with one another, produce small tension forces that pull at this gelatinous tissue. In a human fetus, the cortex starts out smooth and mostly remains that way for the first six months of development. During that time, the newborn neurons send out their spindly fibers, or axons, to hook up with the receptive components, or dendrites, of target neurons in other regions of the cortex. The axons then become tethered to the dendrites. As the cortex expands, the axons grow ever more taut, stretching like rubber bands. Late in the second trimester, while neurons are still emerging, migrating and connecting, the cortex begins to fold. By the time of birth the cortex has more or less completed development and attained its characteristically wrinkled form.
Van Essen argued that two regions that are strongly linked (that is, connected via numerous axons) are drawn closer together during development by way of the mechanical tension along the tethered axons, producing an outward bulge, or gyrus, between them. A pair of weakly connected regions, in contrast, drifts apart, becoming separated by a valley, or sulcus.
Modern techniques for tracing neural pathways have made it possible to test the hypothesis that the communication system of the cortex is also responsible for shaping the brain. According to a simple mechanical model, if each axon pulls with a small amount of force, the combined force of axons linking strongly connected areas should straighten the axon paths. Using a tool known as retrograde tracing, in which a dye injected in a small area of the cortex is taken up by the endings of axons and transported backward to the parent cell body, it is possible to show which regions send axons to the injection site. Furthermore, the method can reveal both how dense the connections of an area are and what shapes their axon paths take. Our retrograde tracing studies of a large number of neural connections in the rhesus macaque have demonstrated that, as predicted, most connections follow straight or slightly curved paths. Moreover, the denser the connections are, the straighter they tend to run.
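The intuition behind this prediction can be captured in a toy calculation. The sketch below (in Python) is not the authors' model; it simply treats a fiber path as an elastic chain with fixed endpoints, pulled straight by a tension proportional to the number of axons and pushed sideways by a constant "tissue" force, with all parameter values invented for illustration. At equilibrium the sideways bowing scales inversely with the number of axons, so denser bundles run straighter:

    import numpy as np

    def max_bowing(n_axons, n_nodes=50, tissue_force=1.0, tension_per_axon=0.1):
        """Equilibrium sideways bowing of a tethered fiber path.

        Toy model: combined tension k (one small spring per axon) resists a
        constant lateral force f; at equilibrium k * y'' = -f, so the bowing
        scales as f / k.
        """
        k = tension_per_axon * n_axons        # combined tension of the bundle
        n = n_nodes - 2                       # interior nodes; endpoints fixed at 0
        lap = (np.diag(-2.0 * np.ones(n))     # discrete second derivative
               + np.diag(np.ones(n - 1), 1)
               + np.diag(np.ones(n - 1), -1))
        y = np.linalg.solve(k * lap, -tissue_force * np.ones(n))
        return y.max()

    for n in (1, 10, 100):
        print(f"{n:>3} axons -> max bowing {max_bowing(n):8.1f}")

Multiplying the axon count by ten cuts the bowing tenfold, matching the observation that the denser the connections, the straighter they tend to run.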
The sculpting power of neural connections is particularly evident in the shape differences between language regions in the left and right hemispheres of the human brain [see "Specializations of the Human Brain," by Norman Geschwind; Scientific American, September 1979]. Take, for instance, the form of the Sylvian fissure, a prominent sulcus that separates the frontal and posterior language regions. The fissure on the left side of the brain is quite a bit shallower than the one on the right. The asymmetry seems to be related to the anatomy of a large fiber bundle called the arcuate fascicle, which travels around the fissure to connect the frontal and posterior language regions. Based on this observation and the fact that the left hemisphere is predominantly responsible for language in most people, we postulated in a paper published in 2006 that the arcuate fascicle on the left is denser than the one on the right. A number of imaging studies of the human brain have confirmed this asymmetrical fiber density. In theory, then, the larger fiber bundle should have a greater pulling strength and therefore be straighter than the bundle on the right side. This hypothesis has yet to be tested, however.
Mechanical forces shape more than just the large-scale features of the cerebral cortex. They also have an effect on its layered structure. The cortex is made of horizontal tiers of cells, stacked up as in a multilayered cake. Most areas have six layers, and individual layers in those areas vary in thickness and composition. For example, the regions of the cortex that govern the primary senses have a thick layer 4, and the region that controls voluntary motor functions has a thick layer 5. Meanwhile, the association areas of the cortex, which underlie thinking and memory, among other things, have a thick layer 3.
Such variations in laminar structure have been used to divide the cortex into specialized areas for more than 100 years, most famously by German anatomist Korbinian Brodmann, who created a map of the cortex that is still in use today. Folding changes the relative thickness of the layers, as would happen if one were to bend a stack of sponges. In the gyri, the top layers of the cortex are stretched and thinner, whereas in the sulci, the top layers are compressed and thicker. The relations are reversed in the deep layers of the cortex.
Based on these observations, some scientists had suggested that whereas the shapes of layers and neurons change as they are stretched or compressed, the total area of the cortex and the number of neurons it comprises are the same. If so, thick regions (such as the deep layers of gyri) should contain fewer neurons than thin regions of the cortex. This isometric model, as it is known, assumes that during development neurons first migrate to the cortex and then the cortex folds. For an analogy, imagine folding a bag of rice. The shape of the bag changes, but its capacity and the number of grains are the same before and after folding.
Our investigations into the density of neurons in areas of the prefrontal cortex in rhesus macaques reveal that the isometric model is wrong, though. Using estimates based on representative samples of frontal cortex, we determined that the deep layers of gyri are just as densely populated with neurons as the deep layers of sulci. And because the deep layers of gyri are thicker, there are actually more neurons under a unit area in gyri than in sulci.
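The contrast between the two models comes down to simple arithmetic. In the back-of-the-envelope sketch below, the thicknesses and densities are invented for illustration (the article reports no figures): the isometric model forces density down wherever a layer thickens, whereas the observed equal densities mean that thicker gyral layers simply hold more neurons per unit of surface area.

    # Invented figures for illustration only; the article gives no numbers.
    thick_gyrus, thick_sulcus = 1.2, 0.8      # deep-layer thickness, mm

    # Isometric model: folding conserves neurons under a unit of surface,
    # so density must drop wherever the layer is stretched thicker.
    neurons_per_mm2 = 40_000                  # fixed by assumption
    print("isometric densities (per mm^3):",
          round(neurons_per_mm2 / thick_gyrus),   # ~33,333 in gyri
          round(neurons_per_mm2 / thick_sulcus))  # 50,000 in sulci

    # Observed in macaque prefrontal cortex: density is the same in both,
    # so the thicker deep layers of gyri hold more neurons under a unit area.
    density = 50_000                          # neurons per mm^3, gyri and sulci alike
    print("neurons per mm^2:",
          round(density * thick_gyrus),       # 60,000 in gyri
          round(density * thick_sulcus))      # 40,000 in sulci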
Our discovery hinted that the physical forces that mold the gyri and sulci also influence neuronal migration. Developmental studies in humans have bolstered this suggestion. Rather than occurring sequentially, the migration of neurons to the cortex and the folding of the cortex partially overlap in time. Consequently, as the cortex folds, the resulting stretching and compressing of layers may well affect the passage of newly born neurons that migrate into the cortex late in development, which in turn would affect the composition of the cortex.
Moreover, the shapes of individual neurons differ depending on where in the cortex they reside. Neurons situated in the deep layers of gyri, for example, are squeezed from the sides and appear elongated. In contrast, neurons located in the deep layers of sulci are stretched and look flattened. The shapes of these cells are consistent with having been modified by mechanical forces as the cortex folded. It will be an intriguing challenge to figure out whether such systematic differences in the shapes of neurons in gyri and sulci also affect their function.
Our computer simulations suggest that they do: for example, because the cortical sheet is much thicker in the gyri than in the sulci, signals impinging on the dendrites of neurons at the bottom of a gyrus must travel a longer distance to the cell body than do signals impinging on the dendrites of neurons at the bottom of a sulcus. Researchers can test the effect of these physical differences on the function of neurons by recording the activity of individual neurons across the undulating cortical landscape, work that, to our knowledge, has yet to be undertaken.
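One way to see why this thickness difference could matter functionally: in simple passive cable models, a dendritic signal attenuates roughly exponentially with the distance it must travel to the cell body. The figures below are invented for illustration, since the authors' simulations are not described in enough detail to reproduce, but they show how a longer gyral path leaves less of the signal intact:

    import math

    # Invented values; passive dendritic signals decay roughly as
    # exp(-distance / length_constant) in simple cable models.
    length_constant_mm = 1.0       # assumed electrotonic length constant
    paths_mm = {"gyrus": 2.4,      # deeper cell body under a thick gyral sheet
                "sulcus": 1.2}     # shallower path under a thin sulcal sheet

    for site, dist in paths_mm.items():
        surviving = math.exp(-dist / length_constant_mm)
        print(f"{site}: {surviving:.2f} of the signal reaches the cell body")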
To fully understand the relation between form and function, scientists will need to look at an abundance of brains. The good news is that now that we can observe the living human brain using noninvasive imaging techniques, such as structural magnetic resonance imaging, and reconstruct it in three dimensions on computers, we are able to collect images of a great many brains, far more than are featured in any of the classical collections of brains obtained after death. Researchers are systematically studying these extensive databases using sophisticated computer programs to analyze brain shape. One of the key findings from this research is that clear differences exist between the cortical folds of healthy people and those of patients with mental diseases that start in development, when neurons, connections and convolutions form. The mechanical relation between fiber connections and convolutions may explain these deviations from the norm.
Research into this potential link is still in its early phases. But over the past couple of years, several research teams have reported that the brains of schizophrenic patients exhibit reduced cortical folding overall relative to the brains of normal people. The findings are controversial, because the location and type of the folding aberrations vary considerably from person to person. We can say with certainty, however, that brain shape generally differs between schizophrenics and healthy people. Experts have often attributed schizophrenia to a perturbed neurochemical balance. The new work suggests that there is additionally a flaw in the circuitry of the brain's communication system, although the nature of the flaw remains unknown.
People diagnosed with autism also exhibit abnormal cortical convolutions. Specifically, some of their sulci appear to be deeper and slightly out of place as compared with those of healthy subjects. In light of this finding, researchers have begun to conceive of autism as a condition that arises from the miswiring of the brain. Studies of brain function support that notion, showing that in autistic people, communication between nearby cortical areas increases, whereas communication between distant areas decreases. As a result, these patients have difficulties ignoring irrelevant things and shifting their attention when it is appropriate to do so.
Mental disorders and learning disabilities can also be associated with aberrations in the composition of the cortical layers. For instance, in the late 1970s neurologist Albert Galaburda of Harvard Medical School found that in dyslexia the pyramidal neurons that form the chief communication system of the cerebral cortex are shifted from their normal position in the layers of the language and auditory areas of the frontal cortex. Schizophrenia, too, may leave imprints on the cortical architecture: some frontal areas of the cortex in affected individuals are aberrant in their neural density. An abnormal distribution of neurons in the cortical layers disrupts their pattern of connections, which ultimately impairs communication, the fundamental function of the nervous system. Researchers are just beginning to probe structural abnormalities of the cortex in people with autism, which may further elucidate this puzzling condition.
More studies are needed to ascertain whether other neurological diseases that originate during development also bring about changes in the number and positions of neurons in the cortical layers. Thinking of schizophrenia and autism as disorders that affect neural networks, as opposed to localized parts of the brain, might lead to novel strategies for diagnosis and treatment. For instance, patients who have these conditions might profit from performing tasks that engage different parts of the brain, just as dyslexics benefit from using visual and multimodal aids in learning.
Modern neuroimaging methods have also enabled scientists to test the phrenological notion that cortical convolutions or the amount of gray matter in different brain regions can reveal a person's talents. Here, too, linking form and function is fraught with difficulty. The connection is clearest in people who routinely engage in well-defined coordinated mental and physical exercise.
Professional musicians offer one such example. These individuals, who need to practice extensively, systematically differ from nonmusicians in motor regions of the cortex that are involved in the control of their particular instruments. Still, clear folding patterns that distinguish broader mental talents remain elusive.
We still have much to unravel. For one, we do not yet understand how individual gyri attain their specific size and shape, any more than we understand the developmental underpinnings of variation in ear or nose shape among individuals. Variation is a very complex problem. Computational models that simulate the diversity of physical interactions between neurons during cortical development may throw light on this question in the future. So far, however, the models are very preliminary because of the complexity of the physical interactions and the limited amount of developmental data available.
Scholars are also keen to know more about how the cortex develops. Topping our wish list is a detailed timetable of the formation of the many different connections that make up its extensive communication system. By marking neurons in animals, we will be able to determine when different parts of the cortex develop in the womb, which, in turn, will enable scientists to experimentally modify the development of distinct layers or neurons. Information about the sequence of development will help reveal events that result in abnormal brain morphology and function. The range of neurological diseases with vastly different symptoms, such as those seen in schizophrenia, autism, Williams syndrome, childhood epilepsy, and other disorders, may be the result of pathology arising at different times in development and variously affecting regions, layers and sets of neurons that happen to be emerging, migrating or connecting when the process goes awry.
To be sure, mechanical forces are not alone in modeling the brain. Comparisons of brain shape have demonstrated that brains of closely related people are more similar to one another than are brains of unrelated people, indicating that genetic programs are at work, too. Perhaps genetic processes control the timing of development of the cortex, and simple physical forces shape the brain as nerve cells are born, migrate and interconnect in a self-organizing fashion. Such a combination may help explain the remarkable regularity of the major convolutions among individuals, as well as the diversity of small convolutions, which differ even in identical twins.
Many of the current concepts about the shape of the brain have come full circle from ideas first proposed more than a century ago, including the notion of a link between brain shape and brain function. Systematic comparisons of brain shape across populations of normal subjects and patients with brain disorders affirm that the landscape of the brain does correlate with mental function and dysfunction.
But even with the advanced imaging methods for measuring brains, experts still cannot recognize the cortex of a genius or a criminal when they see one. New models of cortex folding that combine genetics and physical principles will help us integrate what we know about morphology, development and connectivity so that we may eventually unlock these and other secrets of the brain.