Monday, October 29, 2007
Medical News Today
24 Oct 2007
A new US study suggests that people with more years of formal education experience the onset of memory decline associated with dementia later, but once it starts it declines more rapidly, compared to people with fewer years of education.
The study is published in the October 23 issue of the journal Neurology and is the work of Dr Charles B Hall, of the Albert Einstein College of Medicine in Bronx, New York, and colleagues.
Hall said:
"Higher levels of education delay the onset of dementia, but once it begins, the accelerated memory loss is more rapid in people with more education."
He and his colleagues found that a person with 16 years of formal education experienced a 50 per cent faster rate of memory decline than someone with only 4 years of education.
The researchers wanted to test something called the cognitive reserve hypothesis. This suggests that people with more years of education have a reserve of cognitive ability that masks the onset of memory decline, often without the person realising it; for instance, they may use thinking skills to work out answers to memory tests. This could explain why low education is a well-known risk factor for Alzheimer's disease (AD), a condition characterised by accelerated memory decline.
Hall and colleagues followed 488 people in the Bronx Aging Study of whom 117 developed dementia. The participants underwent detailed assessment of their cognitive skills at the start of the study, and also every year during the 6 years of follow up, when they completed a memory test called the Buschke Selective Reminding Test. Their formal education ranged from fewer than 3 years of elementary school to completion of postgraduate education.
The researchers estimated the point at which the rate of cognitive decline began to accelerate (the change point as they called it), and the rates of decline before and after this point.
They found that every additional year of formal education delayed the change point by 0.21 years (2.5 months). After the change point, the rate of memory decline increased by 0.10 points for each year of additional formal education. This translated to a 4 per cent faster rate of decline for each additional year of formal education.
The researchers gave an example. For a college graduate with 16 years of formal education who is diagnosed with dementia at age 85, the accelerated memory decline would have begun four years earlier, at age 81. A person with only 4 years of formal education who is diagnosed at the same age of 85 would have begun declining, at a slower rate, six years before diagnosis, at age 79.
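The reported coefficients reduce to simple arithmetic. A minimal sketch, assuming (as the summary figures suggest) that the per-year effects combine additively:

```python
# Per-year effects reported in the article (assumed to combine additively).
CHANGE_POINT_DELAY_PER_YEAR = 0.21  # years of delayed onset per year of education
RATE_INCREASE_PER_YEAR = 0.04       # fractional increase in post-onset decline rate

def compare_education(more_years, fewer_years):
    """Compare two levels of formal education (in years)."""
    extra = more_years - fewer_years
    onset_delay = CHANGE_POINT_DELAY_PER_YEAR * extra   # years
    faster_decline = RATE_INCREASE_PER_YEAR * extra     # fraction
    return onset_delay, faster_decline

delay, faster = compare_education(16, 4)
print(f"Onset delayed by {delay:.1f} years; post-onset decline {faster:.0%} faster")
# → Onset delayed by 2.5 years; post-onset decline 48% faster
```

The 48 per cent figure is consistent with the roughly "50 per cent faster" rate of decline quoted above for 16 versus 4 years of education.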
This seemed to reflect previous research findings that showed people with more years of education suffered memory loss more quickly once they were diagnosed with dementia, wrote the researchers.
The researchers concluded:
"As predicted by the cognitive reserve hypothesis, higher education delays the onset of accelerated cognitive decline; once it begins it is more rapid in persons with more education."
Commenting on the findings, Hall said:
"This rapid decline may be explained by how people with more education have a greater cognitive reserve, or the brain's ability to maintain function in spite of damage."
"So while they're often diagnosed with dementia at a later date, once the cognitive reserve is no longer able to compensate for the damage that's occurred, then the symptoms emerge," he added.
The researchers noted that their study was important because it was the first to test the cognitive reserve hypothesis in people with preclinical dementia, but they pointed out that the participants were born at the turn of the 20th century, and their life experiences and education may not be representative of people in education today.
Hall, C. B., Derby, C., LeValley, A., Katz, M. J., Verghese, J., Lipton, R. B. "Education delays accelerated decline on a memory test in persons who develop dementia." Neurology 2007; 69: 1657-1664.
Sunday, October 28, 2007
A team of neuroscientists announced a scientific breakthrough last week in the use of brain scans to discover what's on someone's mind.
Researchers from the Max Planck Institute for Human Cognitive and Brain Sciences, along with scientists from London and Tokyo, asked subjects to secretly decide in advance whether to add or subtract two numbers they would later be shown. Using computer algorithms and functional magnetic resonance imaging, or fMRI, the scientists were able to determine with 70 percent accuracy what the participants' intentions were, even before they were shown the numbers.
The study used "multivariate pattern recognition" to identify oxygen flow in the brain that occurs in association with specific thoughts. The researchers trained a computer to recognize these flow patterns and to extrapolate from what it had learned to accurately read intentions.
The finding raises issues about the application of such tools for screening suspected terrorists -- as well as for predicting future dangerousness more generally. Are we closer than ever to the crime-prediction technology of Minority Report?
As I've argued in this space before, the popular press tends to over-dramatize scientific advances in mind reading. FMRI results have to account for heart rate, respiration, motion and a number of other factors that might all cause variance in the signal. Also, individual brains differ, so scientists need to study a subject's patterns before they can train a computer to identify those patterns or make predictions.
While the details of this particular study are not yet published, the subjects' limited options of either adding or subtracting the numbers means the computer already had a 50/50 chance of guessing correctly even without fMRI readings. The researchers indisputably made physiological findings that are significant for future experiments, but we're still a long way from mind reading.
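As a back-of-the-envelope check on that baseline, a guesser choosing at random between the two operations, with no brain data at all, lands near 50 percent accuracy (the simulation below uses a fixed seed purely for reproducibility):

```python
import random

# Sanity check of the 50/50 baseline: with only two possible intentions
# (add or subtract), random guessing is right about half the time, so the
# reported 70 percent accuracy must be judged against that floor.
random.seed(0)  # fixed seed so the simulation is reproducible

trials = 10_000
hits = sum(
    random.choice(["add", "subtract"]) == random.choice(["add", "subtract"])
    for _ in range(trials)
)
print(f"Random-guess accuracy: {hits / trials:.3f}")  # close to 0.5
```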
Still, the more we learn about how the brain operates, the more predictable human beings seem to become. In the Dec. 19, 2006, issue of The Economist, an article questioned the scientific validity of the notion of free will: Individuals with particular congenital genetic characteristics are predisposed, if not predestined, to violence.
Studies have shown that genes and organic factors like frontal lobe impairments, low serotonin levels and dopamine receptors are highly correlated with criminal behavior. Studies of twins show that heredity is a major factor in criminal conduct. While no one gene may make you a criminal, a mixture of biological factors, exacerbated by environmental conditions, may well do so.
Looking at scientific advances like these, legal scholars are beginning to question the foundational principles of our criminal justice system.
For example, University of Florida law professor Christopher Slobogin, who is visiting at Stanford this year, has set forth a compelling case for putting prevention before retribution in criminal justice.
Two weeks ago, Slobogin gave a talk based on his book, Minding Justice. He pointed to the studies showing that our behavior is predetermined or strongly influenced by biology, and that if we can identify those biological factors, we can predict behavior. He argues that the justice system should provide treatment for potential wrongdoers based on predictions of dangerousness instead of settling for punishing them after the fact.
It's a tempting thought. If there is no such thing as free will, then a system that punishes transgressive behavior as a matter of moral condemnation does not make a lot of sense. It's compelling to contemplate a system that manages and reduces the risk of criminal behavior in the first place.
Yet, despite last week's announcement from the Max Planck Institute, neuroscience and bioscience are not at a point where we can reliably predict human behavior. To me, that's the most powerful objection to a preventative justice system -- if we aren't particularly good at predicting future behavior, we risk criminalizing the innocent.
We aren't particularly good at rehabilitation, either, so even if we were sufficiently accurate in identifying future offenders, we wouldn't really know what to do with them.
Nor is society ready to deal with the ethical and practical problems posed by a system that classifies and categorizes people based on oxygen flow, genetics and environmental factors that are correlated as much with poverty as with future criminality.
"Minority Report," a short story by Philip K. Dick that became the 2002 Steven Spielberg blockbuster, portrays a society where investigators can learn in advance that someone will commit a murder, even before the offender himself knows what he will do. Gattaca, a 1997 film, tells of a society that discriminates against genetically "inferior" individuals.
Science fiction has long grappled with the question of how a society that can predict future behavior should act. The stories suggest that people are more varied, more free, than the computer models allow. They also suggest that a system based on predictions is subject not only to innocent mistakes but also to malicious manipulation at a level far greater than our current punishment-oriented regime.
In time, neuroscience may produce reliable behavior predictions. But until then, we should take the lessons of science fiction to heart when deciding how to use new predictive techniques.
LONDON -- They are life's perennial questions: Pepsi or Coke, Mac or PC, cremation or burial?
Scientists in the United Kingdom believe they may be close to unraveling some of the brain processes that ultimately dictate the choices we make as consumers.
Using a revolutionary method of imaging the brain, researchers from the Open University (OU) and the London Business School say they have identified the brain region that becomes active as the shopper reaches to the supermarket shelf to make their final choice.
If that sounds a frivolous finding, the scientists involved insist their work has serious applications.
"How people use their brains to learn, record and store memory is a fundamental question in contemporary neuroscience," said Steven Rose, director of the brain and behavior research group and professor of biology at the Open University.
Beyond the local supermarket, Rose and his team believe the findings could go on to show the brain processes behind the conscious decisions people make when it comes to important life choices, such as selecting a partner or career. The research may also one day help manufacturers and marketers shape advertising and branding strategies for their products.
The scientists used a technique known as magnetoencephalography (MEG) -- the fastest of all scanner methods -- to measure the minute magnetic fields around the brain and identify regions that are active during the second or so it takes for a person to make their shopping choice.
Subjects were taken on an 18-minute virtual tour of a supermarket. During regular pauses in the tour, they were asked to make a choice between different brands and products on the shelves by pressing a button.
Researchers found that the brain was hugely active during the 2.5 seconds it took for the button press to be made.
"Within 80 milliseconds their visual cortex responds as they perceive the choice items," Rose said. "A little later, regions of the brain associated with memory and speech become active."
More interesting for the researchers, however, was what happened as consumers made their final choice.
"After about 800 milliseconds -- and this was the surprising thing -- if and only if they really prefer one of the choice items, a region called the right parietal cortex becomes active. This then is the region of the brain involved in making conscious decisions -- in this experiment about shopping choices, but maybe for more important life choices too," Rose said.
Other scientists gave a guarded welcome to the findings of the OU researchers -- Rose, Professor Stephen Swithenby, Dr. Sven Braeutigam and Dr. John Stins -- who will publish their findings in the next issue of the journal Neural Plasticity.
Michael Miller, professor and chair in the department of neuroscience and physiology at SUNY Upstate Medical University, said that MEG scans provided a unique insight into the real-time activity of brain regions in humans and animals. "They are very insightful. There is a growing literature about the role of the prefrontal cortex and other areas that are involved in volitional activities," he said.
Dr. Wise Young, director of the W.M. Keck Center for Collaborative Neuroscience and a professor at Rutgers University, said that the finding was "interesting" although not particularly surprising or groundbreaking.
"The parietal cortex has been conjectured as one of the sites of decision-making in the brain. Because that part of the brain is showing activity during decision-making does not necessarily mean that the decision is actually made in that part of the brain. Also, the authors examined only several sites on the brain and there may be parts of the brain that were activated but they did not record," he said.
Young said that for centuries scientists have struggled with the question of whether function is localized or distributed in the brain.
"Much data suggests that many functions are quite broadly distributed, the brain is quite plastic and quite remarkable recovery can occur after injury to large parts of the brain. On the other hand, it is also clear that some areas are more important for some functions than others," Young said.
Dr. Susan Bookheimer, director of the brain mapping center's behavioral testing and training laboratory at UCLA, was more skeptical. "When I first read this, I thought it was a joke. It is not an interesting finding," she said. "The investigators have added all this supermarket stuff to make it more appealing to a general audience, but you'd find the same thing if you just used pictures of dots, for example."
While Bookheimer agrees there are areas of the brain responsible for making choices, she believes the real story is much more complex than that presented by the OU study.
"We must integrate knowledge before we make a choice," she said. "Many, not all choices, involve an emotional component controlled by a different area of the brain. Others require integrating information from multiple choices. We have to generate a response, search available responses and then initiate a plan to demonstrate the choice. It is likely that only this last process was involved in the study here -- the process of directing one's reach toward the goal."
A person's optimism about the future seems to be controlled by a small region at the front of the brain, according to a study that used brain imaging.
That area deep behind the eyes activates when people think good thoughts about what might happen in the future.
The more optimistic a person is, the brighter the area showed up in brain scans, the scientists reported in a small study published online Thursday in the journal Nature.
That same part of the brain, called the rostral anterior cingulate cortex (rACC), seems to malfunction in people suffering depression, said the study co-authors, Elizabeth Phelps of New York University and Tali Sharot of University College London.
Researchers gave 15 people functional magnetic resonance imaging scans while they thought about future possibilities.
When the participants thought about good events, both the rACC and the amygdala, which is involved in emotional responses including fear, were activated. But the correlation with optimism was strongest in the cingulate cortex.
The same study also found that people tended to think that happier events were closer in time and more vivid than the bad ones, even if they had no reason to believe it, Phelps said.
Psychologists have long known people have an "optimism bias," but the new study offers new details.
When researchers asked the subjects to think about 80 different future events that could be good, bad or neutral, they had a hard time getting people to think negatively, or even neutrally, about the future.
For example, when people were asked to ponder a future haircut, they imagined getting the best haircut of their lives, instead of just an ordinary trim, Phelps said.
The study makes sense and pulls together new and different parts of research on optimism and the brain, said Dan Schacter, a professor of psychology at Harvard University who wasn't part of the research.
Having our brains wired to optimism is generally a good thing because "if you were pessimistic about the future you would not be motivated to take a lot of action," Phelps said.
Among a sample of 184 young people being evaluated for psychiatric disorders and allergies, 105 (57 percent) had a history of allergic disorders, including asthma, hay fever, hives and eczema.
Psychiatric evaluations revealed that 124 (67 percent) had an internalizing disorder, either alone or in combination with an externalizing disorder such as ADHD, oppositional defiant disorder or conduct disorder. The children in the sample were between 4 and 20 years old; their average age was 13.
Researchers found that youth with internalizing disorders were almost twice as likely to have a history of allergies as those with a diagnosis that wasn't classified as an internalizing or externalizing disorder. The psychiatric disorders in this group included substance abuse, tic disorders, bed-wetting and attachment disorder.
Moreover, the association was found to be specific for "pure" internalizing disorders. That is, the likelihood of having a history of allergies was significant only among youths who had an internalizing disorder and no other psychiatric conditions.
"These findings add to the growing body of evidence supporting an association between anxiety, depressive, and allergic disorders," write Dr. Mauricio Infante and colleagues from the University of Wisconsin, Madison, in the Journal of Clinical Psychiatry.
The findings also suggest that these psychiatric and medical disorders "may share risk factors and underlying pathways that contribute to the development of both types of disorders."
The Wisconsin team notes that studies are needed to identify the reasons for these associations so that effective treatment and prevention strategies that target both disorders can be developed.
SOURCE: Journal of Clinical Psychiatry, September 2007.
Monday, October 22, 2007
September 21, 2005
Pressure from insurance companies and competition from drug therapies
are prompting analysts to get patients off the couch more quickly
By Jamie Talan
Wendy spent five years in psychoanalysis, delving so deeply into her
mind that she could no longer see the connection between her adult
problems and her teenage episodes of "cutting" her wrists. After she
and her analyst had their final session, during which he welcomed her
to move on with her life, Wendy was not completely happy, but she was
happier than she ever had been. And that, psychologists say, is a
successful outcome.
Psychoanalysis probes the unconscious mind to unlock the mysteries
that drive conscious emotions and behavior. The discipline is built on
pillars set by Sigmund Freud a century ago. It is characterized by
frequent sessions that can take place over many years, wherein
patients are encouraged to freely associate whatever comes to mind as
the analyst sits quietly and listens.
Today the practice is changing. The transformation is in part the
result of a better understanding of what works during self-analysis.
But increasingly, psychotherapy is changing just to survive, held
hostage to limits on insurance coverage determined by managed care
companies and facing replacement by psychoactive drugs that in the
long run are far cheaper than a patient's weekly visit to the
therapist's office. In this incarnation, it suddenly matters less that
symptoms may disappear without patients figuring out the underlying cause.
To keep psychoanalysis alive, contemporary therapists are revamping
Freud's theories. They have discarded some traditional beliefs and
have loosened requirements so patients can succeed in fewer sessions.
Many analysts are even talking to their patients and sharing their own
thoughts and feelings, a practice that Freud said would complicate the
analysis.
Some experts chafe at the changes, however. They say that short-term
therapy can be successful for some problems such as phobias but does
not work for personality disorders, chronic depression and other
substantial mental illnesses. They claim that managed care companies
make decisions based on cost, not on any science that shows what works
best for a specific condition. Insurance companies argue that patients
can do just as well on medication as they can with talk therapy and
that for talk, "short term" is enough.
Extended analysis certainly is under siege. Today patients having
long-term psychotherapy--more than 20 sessions--account for only 15
percent of those who seek treatment, according to a study in the
American Journal of Psychiatry. Psychoanalysts contend that it takes
longer to work out issues that have been shaped by a lifetime of
emotion and experience, yet they know they must compete in a
magic-pill era in which people may be content to have their symptoms
disappear without much thought to why they emerged in the first place.
"A better understanding of the self is needed for a better recovery,"
asserts Gail Saltz, a Manhattan analyst and author of Becoming Real
(Riverhead Trade, 2005), a book about the benefits of analysis. She
says that there are still people who lie on the couch four times a
week, but many analysts have accepted a once-a-week regimen. And
although studies have shown that certain patients progress better when
therapy is frequent, Saltz believes once a week can still be
successful. Psychologists have at least agreed that even long-term
analysis should be completed within four years.
Regardless of frequency, Saltz says, the goal is to help patients
"better tolerate the ups and downs of life" or, as Freud put it, "get
beyond everyday human misery." Freud developed his ideas before
scientists knew much about the brain's workings, however, and today
some of his once popular theories about human development are seen
as outdated. High on the list is the idea that infants have
complicated sexual desires.
Peter D. Kramer, a Massachusetts psychiatrist who popularized the new
generation of antidepressants in his best-selling book Listening to
Prozac (Penguin, 1993), says that "there is no evidence that infants
have sexual desires." Kramer notes that although Freud believed that
adult complaints of childhood sexual abuse stemmed from such childhood
fantasies, the evidence today is plain that sexual abuse of children
is common, affecting up to 20 percent of girls and 10 percent of boys.
Freud also had little to offer the therapist in understanding trauma,
which experts now know can cause lifelong problems. Trauma therapy is
a relatively new field, built on work with war veterans.
Post-traumatic stress disorder is a hot topic in psychotherapy today,
one that was poorly addressed before, Kramer notes, because it was not
possible to have effective treatment when the theoretical
underpinnings were shaky.
Friend, Not Father
Readdressing the basic tenets of psychoanalysis has led to perhaps the
most radical change of all: modern psychologists are actually talking
to their patients. Freud's original "transference" theory demanded
that an analyst remain quiet and aloof so as to serve as a "screen"
onto which the patient could project her emotions. But therapists are
now sharing more of themselves. "How can I remain opaque when my
clients can go online and learn that I love Greek music?" asks
psychoanalyst Spyros D. Orfanos, clinic director in psychoanalysis at
New York University.
Orfanos says that today's analyst is not an authoritative father
figure but a partner in figuring out "the powerful emotional forces
that drive behavior." He thinks that having a dialog with a patient
is the best way to work toward change. Many analysts also now agree
that empathy is key to the relationship, and empathy requires
engagement, not just listening.
Psychoanalysis is also changing in the face of steady competition from
other forms of help, such as cognitive behavioral therapy, in which
patients try to change certain troubling behaviors, and goal-oriented
therapy, which lays out ways to attain, say, certain kinds of
relationships. These practices may or may not touch on the patient's
past. And to hold its own, psychoanalysis is shedding its image as a
privileged treatment for the wealthy; so-called training centers are
popping up everywhere that provide low-cost appointments.
Scientists are also attempting to study the biology of the analysis
process itself. At New York-Presbyterian Hospital/Weill Cornell
Medical Center, psychiatrists Otto F. Kernberg and David A.
Silbersweig are recording brain scans of patients before and after
analysis. Such studies may help end the debate over the effectiveness
of lengthy treatment, notes Kramer, who recently published Against
Depression (Viking Adult, 2005), an assessment of mood disorders. "We
don't know what works or what doesn't work."
Orfanos is dubious about scanning, maintaining that analysis is a
humanistic endeavor that does not necessarily fit into a biology-based
medical model. "It's about understanding how your mind works," he
says, "so that you can have more choices in your life."
October 05, 2007
A pair of brain regions work together to assess the threat of punishment and override our selfish tendencies
Whether you subscribe to the Ten Commandments, the Golden Rule or some instinctive moral code, society functions largely because most of its denizens adhere to a set of norms that allow them to live together in relative tranquility.
But, why is it that we put a vast amount of social resources into keeping stealing, murdering and other unfair (not to mention violent and illegal) acts to a minimum? Seems it all comes down to the fact that most of us don't cotton to being punished by our peers.
"The reason why punishment for norm violations is important is that it disciplines the potential norm violators," says Ernst Fehr, an economist at the University of Zurich and the senior author of a paper on the issue published this week in Neuron.
In the new study, Fehr and colleagues uncovered activity in two areas of the brain underlying the neural mechanism involved in conforming to society's values. They further determined that subjects with Machiavellian personalities—a strong sense of self-interest, opportunism and manipulation—have heightened activity in one of these regions, which the authors believe is related to assessing the threat of punishment.
During the study, which also involved scientists at the University of Ulm in Germany, 23 male students were instructed to play a version of the "ultimatum game" while their brains were scanned via functional magnetic resonance imaging (fMRI). Each participant was given a sum of money (100 monetary units) to split however he chose with an anonymous partner. In some cases the recipient simply had to accept any offer made. Other times, after an offer was made, the recipient had the option to penalize the giver by taking some or all of the giver's money if the giver had not shared generously.
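The incentive structure of that design can be sketched with toy numbers. The endowment matches the article's 100 monetary units, but the fairness threshold and the size of the punishment below are illustrative assumptions, not the study's actual parameters:

```python
# Toy model of the modified ultimatum game: a stingy offer can be punished
# only when the punishment option is in play. Threshold and penalty values
# are assumed for illustration.
ENDOWMENT = 100

def giver_payoff(offer, punishment_allowed, fairness_threshold=40, penalty=50):
    """What the giver keeps, given whether the recipient can punish."""
    kept = ENDOWMENT - offer
    if punishment_allowed and offer < fairness_threshold:
        kept -= penalty  # the recipient deducts from the giver's share
    return max(kept, 0)

# A purely self-interested giver offers nothing without the threat, but just
# enough to avoid punishment once the threat is present.
best_without_threat = max(range(ENDOWMENT + 1),
                          key=lambda o: giver_payoff(o, punishment_allowed=False))
best_with_threat = max(range(ENDOWMENT + 1),
                       key=lambda o: giver_payoff(o, punishment_allowed=True))
print(best_without_threat, best_with_threat)  # → 0 40
```

Under these assumptions the self-interested strategy flips from offering nothing to offering exactly the amount that averts punishment, which mirrors the behavior the study attributes to its most Machiavellian subjects.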
The subjects' brains were only scanned when they played the giver role. Before each trial, both players were told whether the recipient would be allowed to exact a punishment if he felt he got too slim a slice of the pie. Two areas of the cortex (the brain's primary processing unit) were particularly active during the trials when punishment was an option: the lateral orbitofrontal cortex, a region below the temples of the head that had, in previous research, been implicated in processing a threat stimulus, and a section just behind it called the dorsolateral prefrontal cortex.
"The lateral orbitofrontal cortex [activity] represents the punishment threat here," says Fehr, citing previous research that fingered it in threat assessment. "More specifically, how bad does the brain interpret this punishment threat?"
Alternatively, he says, "[the dorsolateral prefrontal cortex] is an area that is involved in cognitive control and overriding prepotent impulses. Here, we have a design where the prepotent impulse is not to share the money—at least to the extent that player B wants it shared."
Interestingly, the research team also had their subjects fill out a questionnaire to determine their degree of Machiavellian behavior. Those who proved to be the most ruthless of the bunch offered little to nothing when there was no threat of punishment, but within the punishment paradigm, they were generous enough to stave off retribution.
"These are socially intelligent, selfish people," Fehr says about the more calculating subjects. "They escape the punishments that are inherent in social interactions, because they seem to have a fine sense of when punishment is in the air."
Jorge Moll, principal investigator of the cognitive and behavioral neuroscience unit at the Rede Labs-D'Or Hospitals in Rio de Janeiro, says the most interesting findings were that individual scores on Machiavellianism predicted "how much a given subject will change his behavior depending on the presence of punishment," and "that the level of activity within the lateral orbitofrontal cortex is strongly related to Machiavellian personality style."
Researchers say the results could have wide-reaching implications, potentially paving the way to understand—and perhaps one day reverse—the neurobiology behind psychopathic and sociopathic personalities. They intend to repeat the study with patients suffering from antisocial anxiety and personality disorders to determine if their behavior can be explained by a lack of impulse control or a poor assessment of punishment.
Fehr argues the results could also impact the criminal justice system since the dorsolateral prefrontal cortex does not fully develop until after a person is around 20 years old.
"This area seems to be critically important in overriding self-interest," he says. Thus, "you just can't treat an immature adolescent the same way as a mature adult—that's at least my view of doing justice." It's unclear whether judges and juries see it that way, however.
By Amir S., Irvine, CA
Donald Wilson, a professor of zoology at the University of Oklahoma and co-author of the 2006 book Learning to Smell: Olfactory Perception from Neurobiology to Behavior, sniffs around for an answer.
In 2004 the Nobel Prize in Physiology or Medicine went to Linda B. Buck and Richard Axel for their research showing that there is a huge family of genes that encode proteins called olfactory receptors. Their findings, published in 1991, opened many doors toward understanding the function of the olfactory system.
One important observation was that individual olfactory sensory neurons typically express just one of those genes. Thus, signals coming from a given neuron provide information about odors that activate the specific receptor protein expressed by that cell. A single receptor protein, however, appears to bind (or recognize) many different odors. Thus, rather than having neurons that respond selectively to coffee or vanilla or Bordeaux, most individual cells (via their receptors) respond to submolecular features of the volatile chemicals coming from those objects. For example, an olfactory sensory receptor neuron may respond to a hydrocarbon chain of a particular length or a specific functional group like an alcohol or aldehyde.
This means that any given sensory neuron will respond to many different odors as long as they share a common feature. The brain (specifically, the olfactory bulb and olfactory cortex) then looks at the combination of sensory neurons activated at any given time and interprets that pattern in the context of previous patterns that have been experienced and other kinds of available information. The interpreted pattern is what you perceive as smell.
Olfactory sensory neurons, which sit in the mucus in the back of the nose and relay data into the brain via axons (fingerlike projections that transmit information out from the cell body), do not live forever. In fact, they are one of the increasingly large number of neuron types that are known to die and be replaced throughout life.
Fortunately, they do not all die at the same time. There are many thousands of olfactory sensory neurons expressing any particular olfactory receptor. When a small subset dies, the pattern of activity that the olfactory processing regions in the brain receives for a specific smell doesn't change very much. In fact, when an olfactory sensory neuron expressing a particular receptor gene dies and a new neuron expressing that same gene matures, the new neuron's axons plug in to the same group of olfactory bulb neurons that its predecessor did. This results in remarkable pattern stability over years, despite continual rewiring.
Let's imagine that on a recent holiday you tried a new wine. The odor (or bouquet, to oenophiles) of that wine is composed of dozens of different volatile chemicals, and each chemical has several submolecular features. Therefore, the wine activates a complex pattern of olfactory sensory neurons by engaging each of their olfactory receptor proteins, which recognize these different features. This pattern is processed and remembered by neural circuits in the olfactory bulb and olfactory cortex.
Several weeks later when you return home, you find the wine in your local market. Despite having replaced at least a subset of the olfactory sensory neurons that first interacted with that wine's unique odor, you are still able to recognize its aroma when you pour a glass, because the overall pattern of activity within the olfactory system remains relatively constant.
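The combinatorial scheme described above can be sketched as a toy model. All odor and feature names below are invented for illustration; in this simplification each receptor type detects one submolecular feature, an odor is a bag of features, and the "percept" is simply the set of receptor types activated, which is why the pattern survives the replacement of individual neurons of the same type:

```python
# Toy model of combinatorial odor coding (illustrative only; feature names are made up).
# An odor activates every receptor type whose preferred feature it contains;
# the brain reads the resulting set of active receptor types as the smell.

ODOR_FEATURES = {
    "coffee": {"pyrazine_ring", "sulfur_group", "short_chain"},
    "vanilla": {"aldehyde", "aromatic_ring", "hydroxyl"},
    "wine": {"alcohol", "ester", "aromatic_ring", "long_chain"},
}

def activation_pattern(odor, receptor_features):
    """Return the set of receptor types whose preferred feature the odor contains."""
    return {r for r, feat in receptor_features.items()
            if feat in ODOR_FEATURES[odor]}

# One receptor type per feature (in reality, thousands of redundant neurons per type).
receptors = {f"R_{feat}": feat
             for feats in ODOR_FEATURES.values()
             for feat in feats}

wine_pattern = activation_pattern("wine", receptors)

# Replacing an individual neuron with a new one expressing the same receptor gene
# leaves the type-level pattern unchanged, so the wine is recognized weeks later.
assert activation_pattern("wine", dict(receptors)) == wine_pattern
```

The point of the sketch is that recognition depends on the pattern over receptor *types*, not on any individual neuron, which is what makes the system robust to continual cell turnover.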
It could not have been easy being Elliott Roosevelt. If the alcohol wasn't getting him, the morphine was. If it wasn't the morphine, it was the struggle with depression. Then, of course, there were the constant comparisons with big brother Teddy.
In 1883, the year Elliott began battling melancholy, Teddy had already published his first book and been elected to the New York State assembly. By 1891—about the time Elliott, still unable to establish a career, had to be institutionalized to deal with his addictions—Teddy was U.S. Civil Service Commissioner and the author of eight books. Three years later, Elliott, 34, died of alcoholism. Seven years after that, Teddy, 42, became President.
Elliott Roosevelt was not the only younger sibling of an eventual President to cause his family heartaches—or at least headaches. There was Donald Nixon and the loans he wangled from billionaire Howard Hughes. There was Billy Carter and his advocacy on behalf of the pariah state Libya. There was Roger Clinton and his year in jail on a cocaine conviction. And there is Neil Bush, younger sib of both a President and a Governor, implicated in the savings-and-loan scandals of the 1980s and recently gossiped about after the release of a 2002 letter in which he lamented to his estranged wife, "I've lost patience for being compared to my brothers."
Welcome to a very big club, Bro. It can't be easy being a runt in a litter that includes a President. But it couldn't have been easy being Billy Ripken either, an unexceptional major league infielder craning his neck for notice while the press swarmed around Hall of Famer and elder brother Cal. It can't be easy being Eli Manning, struggling to prove himself as an NFL quarterback while big brother Peyton polishes a Super Bowl trophy and his superman stats. And you may have never heard of Tisa Farrow, an actress of no particular note beyond her work in the 1979 horror film Zombie, but odds are you've heard of her sister Mia.
Of all the things that shape who we are, few seem more arbitrary than the sequence in which we and our siblings pop out of the womb. Maybe it's your genes that make you a gifted athlete, your training that makes you an accomplished actress, an accident of brain chemistry that makes you a drunk instead of a President. But in family after family, case study after case study, the simple roll of the birth-date dice has an odd and arbitrary power all its own.
The importance of birth order has been known—or at least suspected—for years. But increasingly, there's hard evidence of its impact. In June, for example, a group of Norwegian researchers released a study showing that firstborns are generally smarter than any siblings who come along later, enjoying on average a three-point IQ advantage over the next eldest—probably a result of the intellectual boost that comes from mentoring younger siblings and helping them in day-to-day tasks. The second child, in turn, is a point ahead of the third. While three points might not seem like much, the effect can be enormous. Just 2.3 IQ points can correlate to a 15-point difference in SAT scores, which makes an even bigger difference when you're an Ivy League applicant with a 690 verbal score going head to head against someone with a 705. "In many families," says psychologist Frank Sulloway, a visiting scholar at the University of California, Berkeley, and the man who has for decades been seen as the U.S.'s leading authority on birth order, "the firstborn is going to get into Harvard and the second-born isn't."
The differences don't stop there. Studies in the Philippines show that later-born siblings tend to be shorter and weigh less than earlier-borns. (Think the slight advantage the 6-ft. 5-in. [196 cm] Peyton Manning has over the 6-ft. 4-in. [193 cm] Eli doesn't help when he's trying to throw over the outstretched arms of a leaping lineman?) Younger siblings are less likely to be vaccinated than older ones, with last-borns getting immunized sometimes at only half the rate of firstborns. Eldest siblings are also disproportionately represented in high-paying professions. Younger siblings, by contrast, are looser cannons, less educated and less strapping, perhaps, but statistically likelier to live the exhilarating life of an artist or a comedian, an adventurer, entrepreneur, GI or firefighter. And middle children? Well, they can be a puzzle—even to researchers.
For families, none of this comes as a surprise. There are few extended clans that can't point to the firstborn, with the heir-apparent bearing, who makes the best grades, keeps the other kids in line and, when Mom and Dad grow old, winds up as caretaker and executor too. There are few that can't point to the lost-in-the-thickets middle-born or the wild-child last-born.
Indeed, to hear families tell it, the birth-order effect may only be getting stronger. In the past, girls were usually knocked out of the running for the job and college perks their place in the family should have accorded them. In most other ways, however, there was little to distinguish a first-, second- or third-born sister from a first-, second- or third-born brother. Now, with college and careers more equally available, the remaining differences have largely melted away.
"There are stereotypes out there about birth order, and very often those stereotypes are spot-on," says Delroy Paulhus, a professor of psychology at the University of British Columbia in Vancouver. "I think this is one of those cases in which people just figured things out on their own."
But have they? Stack up enough anecdotal maybes, and they start to look like a scientific definitely. Things that appear definite, however, have a funny way of surprising you, and birth order may conceal all manner of hidden dimensions—within individuals, within families, within the scientific studies. "People read birth-order books the way they read horoscopes," warns Toni Falbo, professor of educational psychology at the University of Texas. "'I'm a middle-born, so that explains everything in my life'—it's just not like that." Still, such skepticism does not prevent more and more researchers from being drawn to the field, and as they are, their findings, and the debate over them, continue to grow.
Humans aren't alone
If you think it's hard to manage the birth-order issues in your family, be thankful you're not an egret or an orange blossom. Egrets are not the intellectual heavyweights of the animal kingdom—or even the bird world—but nature makes them remarkably cunning when it comes to planning their families. Like most other birds, egrets lay multiple eggs, but rather than brooding them all the same way so that the chicks emerge on more or less the same day, the mother begins incubating her first and second eggs before laying the remaining ones in her clutch. That causes the babies to appear on successive days, which gives the first-arriving chick the earliest crack at the food and a 24-hour head start on growth. The second-hatched may not have too difficult a time catching up, but the third may struggle. The fourth and beyond will have the hardest go, getting pushed aside or even pecked to death if food, water and shelter become scarce. All that makes for a nasty nursery, but that's precisely the way the mother wants it. "The parents overproduce a bit," says Douglas Mock, professor of zoology at the University of Oklahoma, "maybe making one more baby than they can normally afford to raise and then letting it take the fall if the resource budget is limited."
Orange trees are even tougher on their young. A typical orange tree carries about 100,000 pollinated blossoms, each of which is a potential orange, complete with the seeds that are potential trees. But in the course of a season, only about 500 oranges are actually produced. The tree determines which ones make the cut, shedding the blossoms that are not receiving enough light or that otherwise don't seem viable. It is, for a tree, a sort of selective termination on a vast scale. "You've got 99% of the babies being thrown out by the parent," says Mock. "The tree just drops all the losers."
Even mammals, warm-blooded in metabolism and—we like to think—temperament, can play a similarly pitiless game. Runts of litters are routinely ignored, pushed out or consigned to the worst nursing spots somewhere near Mom's aft end, where the milk flow is the poorest and the outlook for survival the bleakest. The rest of the brood is left to fight it out for the best, most milk-rich positions.
Humans, more sentimental than birds, trees or litter bearers, don't like to see themselves as coming from the same child-rearing traditions, but we face many of the same pressures. As recently as 100 years ago, children in the U.S. had only about a 50% chance of surviving into adulthood, and in less developed parts of the world, the odds remain daunting. It can be a sensible strategy to have multiple offspring to continue your line in case some are claimed by disease or injury.
While the eldest in an overpopulated brood has it relatively easy—getting 100% of the food the parents have available—things get stretched thinner when a second-born comes along. Later-borns put even more pressure on resources. Over time, everyone might be getting the same rations, but the firstborn still enjoys a caloric head start that might never be overcome.
Food is not the only resource. There's time and attention too and the emotional nourishment they provide. It's not for nothing that family scrapbooks are usually stuffed with pictures and report cards of the firstborn and successively fewer of the later-borns—and the later-borns notice it. Educational opportunities can be unevenly shared too, particularly in families that can afford the tuition bills of only one child. Catherine Salmon, an assistant professor of psychology at the University of Redlands in Redlands, Calif., laments that even today she finds it hard to collect enough subjects for birth-order studies from the student body alone, since the campus population is typically overweighted with eldest sibs. "Families invest a lot in the firstborn," she says.
All of this favoritism can become self-reinforcing. As parental pampering produces a fitter, smarter, more confident firstborn, Mom and Dad are likely to invest even more in that child, placing their bets on an offspring who—in survival terms at least—is looking increasingly like a sure thing. "From a parental perspective," says Salmon, "you want offspring who are going to survive and reproduce."
Firstborns do more than survive; they thrive. In a recent survey of corporate heads conducted by Vistage, an international organization of CEOs, poll takers reported that 43% of the people who occupy the big chair in boardrooms are firstborns, 33% are middle-borns and 23% are last-borns. Eldest siblings are disproportionately represented among surgeons and M.B.A.s too, according to Stanford University psychologist Robert Zajonc. And a recent study found a statistically significant overload of firstborns in what is—or at least ought to be—the country's most august club: the U.S. Congress. "We know that birth order determines occupational prestige to a large extent," says Zajonc. "There is some expectation that firstborns are somehow better qualified for certain occupations."
Little sibs, big role
For eldest siblings, this is a pretty sweet deal. There is not much incentive for them to change a family system that provides them so many goodies, and typically they don't try to. Younger siblings see things differently and struggle early on to shake up the existing order. They clearly don't have size on their side, as their physically larger siblings keep them in line with what researchers call a high-power strategy. "If you're bigger than your siblings, you punch 'em," Sulloway says.
But there are low-power strategies too, and one of the most effective ones is humor. It's awfully hard to resist the charms of someone who can make you laugh, and families abound with stories of last-borns who are the clowns of the brood, able to get their way simply by being funny or outrageous. Birth-order scholars often observe that some of history's great satirists—Voltaire, Jonathan Swift, Mark Twain—were among the youngest members of large families, a pattern that continues today. Faux bloviator Stephen Colbert—who yields to no one in his ability to get a laugh—often points out that he's the last of 11 children.
Such examples might be little more than anecdotal, but personality tests show that while firstborns score especially well on the dimension of temperament known as conscientiousness—a sense of general responsibility and follow-through—later-borns score higher on what's known as agreeableness, or the simple ability to get along in the world. "Kids recognize a good low-power strategy," says Sulloway. "It's the way any sensible organism sizes up the niches that are available."
Even more impressive is how early younger siblings develop what's known as the theory of mind. Very small children have a hard time distinguishing the things they know from the things they assume other people know. A toddler who watches an adult hide a toy will expect that anyone who walks into the room afterward will also know where to find it, reckoning that all knowledge is universal knowledge. It usually takes a child until age 3 to learn that that's not so. For children who have at least one elder sibling, however, the realization typically comes earlier. "When you're less powerful, it's advantageous to be able to anticipate what's going on in someone else's mind," says Sulloway.
Later-borns, however, don't try merely to please other people; they also try to provoke them. Richard Zweigenhaft, a professor of psychology at Guilford College in Greensboro, N.C., who revealed the overrepresentation of firstborns in Congress, conducted a similar study of picketers at labor demonstrations. On the occasions that the events grew unruly enough to lead to arrests, he would interview the people the police rounded up. Again and again, he found, the majority were later- or last-borns. "It was a statistically significant pattern," says Zweigenhaft. "A disproportionate number of them were choosing to be arrested."
Later-borns are similarly willing to take risks with their physical safety. All sibs are equally likely to be involved in sports, but younger ones are likelier to choose the kinds that could cause injury. "They don't go out for tennis," Sulloway says. "They go out for rugby, ice hockey." Even when siblings play the same sport, they play it differently. Sulloway is currently collaborating on a study of 300 brothers who were major league ballplayers. Though the work is not complete, he is so far finding that the elder brothers excel at skills that involve less physical danger. Younger siblings are the ones who put themselves in harm's way—crouching down in catcher's gear to block an incoming runner, say. "It doesn't just hold up in this study but a dozen studies," Sulloway says.
It's not clear whether such behavior extends to career choice, but Sandra Black, an associate professor of economics at UCLA, is intrigued by findings that firstborns tend to earn more than later-borns, with income dropping about 1% for every step down the birth-order ladder. Most researchers assume this is due to the educational advantages eldest siblings get, but Black thinks there may be more to it. "I'd be interested in whether it's because the second child is taking the riskier jobs," she says.
Black's forthcoming studies will be designed to answer that question, but research by Ben Dattner, a business consultant and professor of industrial and organizational psychology at New York University, is showing that even when later-borns take conservative jobs in the corporate world, they approach their work in a high-wire way. Firstborn CEOs, for example, do best when they're making incremental improvements in their companies: shedding underperforming products, maximizing profits from existing lines and generally making sure the trains run on time. Later-born CEOs are more inclined to blow up the trains and lay new track. "Later-borns are better at transformational change," says Dattner. "They pursue riskier, more innovative, more creative approaches."
If eldest sibs are the dogged achievers and youngest sibs are the gamblers and visionaries, where does this leave those in between? That it's so hard to define what middle-borns become is largely due to the fact that it's so hard to define who they are growing up. The youngest in the family, but only until someone else comes along, they are both teacher and student, babysitter and babysat, too young for the privileges of the firstborn but too old for the latitude given the last. Middle children are expected to step up to the plate when the eldest child goes off to school or in some other way drops out of the picture—and generally serve when called. The Norwegian intelligence study showed that when firstborns die, the IQ of second-borns actually rises a bit, a sign that they're performing the hard mentoring work that goes along with the new job.
Stuck for life in a center seat, middle children get shortchanged even on family resources. Unlike the firstborn, who spends at least some time as the only-child eldest, and the last-born, who hangs around long enough to become the only-child youngest, middlings are never alone and thus never get 100% of the parents' investment of time and money. "There is a U-shaped distribution in which the oldest and youngest get the most," says Sulloway. That may take an emotional toll. Sulloway cites other studies in which the self-esteem of first-, middle- and last-borns is plotted on a graph and follows the same curvilinear trajectory.
The phenomenon known as de-identification may also work against a middle-born. Siblings who hope to stand out in a family often do so by observing what the elder child does and then doing the opposite. If the firstborn gets good grades and takes a job after school, the second-born may go the slacker route. The third-born may then de-de-identify, opting for industriousness, even if in the more unconventional ways of the last-born. A Chinese study in the 1990s showed just this kind of zigzag pattern, with the first child generally scoring high as a "good son or daughter," the second scoring low, the third scoring high again and so on. In a three-child family, the very act of trying to be unique may instead leave the middling lost, a pattern that may continue into adulthood.
The holes in the theories
The birth-order effect, for all its seeming robustness, is not indestructible. There's a lot that can throw it out of balance—particularly family dysfunction. In a 2005 study, investigators at the University of Birmingham in Britain examined the case histories of 400 abused children and the 795 siblings of those so-called index kids. In general, they found that when only one child in the family was abused, the scapegoat was usually the eldest. When a younger child was abused, some or all of the other kids usually were as well. Mistreatment of any of the children usually breaks the bond the parents have with the firstborn, turning that child from parental ally to protector of the brood. At the same time, the eldest may pick up some of the younger kids' agreeableness skills—the better to deal with irrational parents—while the youngest learn some of the firstborn's self-sufficiency. Abusiveness is going to "totally disrupt the birth-order effects we would expect," says Sulloway.
The sheer number of siblings in a family can also trump birth order. The 1% income difference that Black detected from child to child tends to flatten out as you move down the age line, with a smaller earnings gap between a third and fourth child than between a second and third. The IQ-boosting power of tutoring, meanwhile, may actually have less influence in small families, with parents of just two or three kids doing most of the teaching, than in the six- or eight-child family, in which the eldest sibs have to pitch in more. Since the Norwegian IQ study rests on the tutoring effect, those findings may be open to question. "The good birth-order studies will control for family size," says Bo Cleveland, associate professor of human development and family studies at Penn State University. "Sometimes that makes the birth-order effect go away; sometimes it doesn't."
The most vocal detractors of birth-order research question less the findings of the science than the methods. To achieve any kind of statistical significance, investigators must assemble large samples of families and look for patterns among them. But families are very different things—distinguished by size, income, hometown, education, religion, ethnicity and more. Throw enough random factors like those into the mix, and the results you get may be nothing more than interesting junk.
The alternative is what investigators call the in-family studies, a much more pointillist process, requiring an exhaustive look at a single family, comparing every child with every other child and then repeating the process again and again with hundreds of other families. Eventually, you may find threads that link them all. "I would throw out all the between-family studies," says Cleveland. "The proof is in the in-family design."
Ultimately, of course, the birth-order debate will never be entirely settled. Family studies and the statistics they yield are cold and precise things, parsing human behavior down to decimal points and margins of error. But families are a good deal sloppier than that, a mishmash of competing needs and moods and clashing emotions, better understood by the people in the thick of them than by anyone standing outside. Yet millenniums of families would swear by the power of birth order to shape the adults we eventually become. Science may yet overturn the whole theory, but for now, the smart money says otherwise.
— Reported by Dan Cray/Los Angeles
Sunday, October 21, 2007
Half with severe disorders in developed countries get no care, study says
Most people in the world with mental illness get no treatment at all, and scarce mental health resources are not reaching the people who need them most, U.S. researchers said on Thursday.
“The treatment data we have are pretty troubling,” said Dr. Philip Wang of the U.S. National Institute of Mental Health, whose study appears in the Lancet medical journal.
Wang and colleagues studied mental health treatment data on 84,850 adults in 17 developed and developing countries taken from the World Health Organization’s mental health surveys.
The lack of mental health treatment was most severe in less-developed countries, where only a few people with serious disorders received any treatment in the past year.
But even in developed nations, roughly half of those with severe disorders got no care at all.
“Even in the United States, which is by far the most resourced country, it is by no means adequate. In our country, folks who meet the criteria for the most serious illness, only about half get anything,” Wang said in a telephone interview.
“Many aren’t receiving healthcare at all. The situation is concerning,” he said.
Patients who were male, married, less-educated and at the extremes of age or income got the least amount of care, the researchers found.
Not surprisingly, the number of people using any mental health services was generally lower in developing countries compared with developed countries.
The researchers also found a correlation between use of mental health services and the percentage of a nation’s gross domestic product spent on health services.
But Wang and colleagues also found that resources are poorly allocated when they are used.
And while efforts to control mental health spending, such as utilization review and prior authorization, might reduce use, they do little to direct care to the neediest patients.
“We’re good at reducing utilization. We’re not so good at channeling it among the people with the greatest need,” Wang said.
Saturday, October 20, 2007
More Americans than ever with mental disorders are trying to get care, but only a third receive effective treatments, says a landmark government survey out Monday.
Rates of mental illness have flattened in the past 15 years after steadily rising from the 1950s. "That's reassuring and a little surprising, given the economic slumps and 9/11," says survey director Ronald Kessler of Harvard Medical School.
About one in four adults have the symptoms of at least one mental illness every year, and nearly half suffer disorders during their lifetimes, according to the study of 9,282 people published in the Archives of General Psychiatry. The study is a detailed update of large federal surveys done in the '80s and '90s.
On the positive side, 41% with a disorder went for treatment in the past year, up from 25% a decade ago and 19% two decades ago. The more severe the disorder, the more likely a person was to get good care.
Most people with disorders — about four out of five — have mild to moderate mental illness. Overall, 6% of Americans have disorders that seriously impair their daily lives. Younger adults are most likely to seek prompt care, so the stigma of mental illness may be waning, Kessler says.
But positive signs in the survey may be overshadowed by two realities: Only about a third get effective care, and the most serious disorders begin at a young age, often going undetected and worsening for a decade or longer. Half of all mental illness begins by age 14, and three-fourths of adults have their symptoms by age 24, the survey shows.
But research has focused on adults. Much more research on the adolescent brain is needed, and large treatment studies not financed by drug companies must be done, says Thomas Insel, director of the National Institute of Mental Health, which paid for the survey. But there's a shortage of researchers focusing on treating children, "and most are working full time on drug-company-funded studies," Insel says.
That only a third of adults get effective care "is pretty disturbing," Insel says. "We've got to figure out how to do this better. If I told you only a third of breast-cancer patients were getting adequate care, you'd wonder, how could that be?"
Patients got the most effective care from mental health experts, such as psychologists and psychiatrists, the survey shows. Yet even specialists gave adequate care to just under half of their patients. And 52% saw medical doctors for treatment of mental disorders, with 13% receiving adequate care.
Friday, October 19, 2007
Medical News Today
19 Oct 2007
A new US study on mice has suggested that differences in the chemistry of reward circuits in the brain may explain why some people are more susceptible than others to post-traumatic stress, depression and other mental and emotional disorders when faced with adversity. It is hoped the findings will lead to new types of drugs and treatments for people who are victims of, or working in, high-stress situations such as combat zones and disasters.
The study is published in the 19th October issue of the journal Cell and is the work of researchers at The University of Texas Southwestern Medical Center (UTSWMC) in Dallas and colleagues from Harvard University, Cambridge, Massachusetts, and Weill Medical College of Cornell University, New York.
The researchers found that mice that coped less well with "social defeat" had higher levels of BDNF (brain-derived neurotrophic factor) in a part of the brain that is important for controlling behaviours related to reward and emotions. The mice that coped well showed lower levels of BDNF when exposed to the same stressor.
BDNF promotes brain plasticity by helping to make new connections between neurons, an important function for memory and learning, said the researchers.
Eric Nestler (UTSWMC), corresponding author of the study suggested that:
"The increase in BDNF may have an adaptive role normally, allowing an animal to learn that a situation is bad and avoid it in the future."
"But under conditions of extreme social stress, susceptible animals may be 'over-learning' this principle and generalizing it to other situations. They avoid their aggressors, but they also avoid all mice and even other fun things like sugar or sex," explained Nestler.
All the mice in the experiment were virtually genetically identical and were raised in the same carefully controlled environment, said Nestler. The researchers said it was possible that environmental and social factors (e.g. the dominance hierarchy in a litter), or even random events during development, could explain the differences in the reactions of the mice.
A person's response to stress is thought to be due to a complex interplay of genetic and environmental factors, said the researchers, pointing to a large body of literature on the effects of acute and chronic stress on physiology and behaviour. But much less is known about the biological basis of these differing stress responses, they said.
The experiment involved exposing smaller mice to aggression from larger mice. An earlier experiment had shown that after experiencing 10 defeats in 10 days, the small mice tended to avoid social interaction for a long time afterwards.
But this latest experiment demonstrated a range of responses within that reaction. While all the small mice showed signs of anxiety, only some of them showed symptoms similar to post-traumatic stress and depression. The more susceptible mice lost weight and became less interested in sugar, symptoms consistent with depressive states. These mice also had greater sensitivity to low doses of cocaine.
When the researchers examined the brains of the more susceptible mice they found they had 90 per cent higher levels of BDNF in the "mesolimbic dopamine" reward circuit compared to the other, more resilient, mice. The BDNF levels in the brains of the resilient mice had not changed.
When they did genetic tests on the brains of the two groups of mice the researchers found that the more resilient mice had more activity in the genes that stop neurons becoming over-excited. They also found that the more vulnerable mice had dopamine neurons that fired at a faster rate than those of the resilient mice. It appeared that resilience to social stress was linked to having a less active type of BDNF.
The researchers also said that post mortem tests on brain tissue from human patients who had been depressed found BDNF levels 40 per cent higher than in patients who had not been depressed.
"Molecular Adaptations Underlying Susceptibility and Resistance to Social Defeat in Brain Reward Regions."
Vaishnav Krishnan, Ming-Hu Han, Danielle L. Graham, Olivier Berton, William Renthal, Scott J. Russo, Quincey LaPlant, Ami Graham, Michael Lutter, Diane C. Lagace, Subroto Ghose, Robin Reister, Paul Tannous, Thomas A. Green, Rachael L. Neve, Sumana Chakravarty, Arvind Kumar, Amelia J. Eisch, David W. Self, Francis S. Lee, Carol A. Tamminga, Donald C. Cooper, Howard K. Gershenfeld, and Eric J. Nestler.
Cell, Vol 131, 391-404, 19 October 2007.
WASHINGTON, Oct. 18 (Reuters) — Many veterans of combat in Iraq and Afghanistan are clearly suffering from post-traumatic stress disorder, but it is not at all clear which treatments work to help them, an expert panel from the Institute of Medicine reported Thursday.
The only treatment that has been shown to work, the panel said, is exposure therapy, a gradual, step-by-step process in which patients are asked to confront memories of a trauma by recounting it in detail. Veterans Affairs hospitals now use that treatment.
“At this time, we can make no judgment about the effectiveness of most psychotherapies or about any medications in helping patients with P.T.S.D.,” said the panel’s chairman, Dr. Alfred Berg of the University of Washington, Seattle.
“These therapies may or may not be effective — we just don’t know in the absence of good data,” Dr. Berg said. “Our findings underscore the urgent need for high-quality studies that can assist clinicians in providing the best possible care to veterans and others who suffer from this serious disorder.”
Post-traumatic stress disorder is the most commonly diagnosed service-related mental disorder among military personnel returning from Iraq and Afghanistan. The panel quoted surveys showing that 12.6 percent of troops who fought in Iraq and 6.2 percent who were in Afghanistan have experienced it.
The experts, who were appointed by the institute, the medical arm of the National Academy of Sciences, reviewed 53 studies of drugs and 37 studies of psychotherapy approaches used to treat the stress disorder. They found most of the studies lacking.
“The majority of drug studies have been funded by the pharmaceutical manufacturers, and the majority of psychotherapy studies have been conducted by the individuals who developed the techniques or their close collaborators,” the panel said in a statement.
Drugs studied included anticonvulsants, antipsychotics, tranquilizers and antidepressants. Several studies were flawed because participants discontinued treatment, the panel said.
It concluded that the government and psychiatric researchers needed to take steps “to ensure that the right studies are undertaken to yield clearer, more reliable data that would help clinicians treat P.T.S.D. sufferers.”
Doctors at the Department of Veterans Affairs also use drugs in treating the disorder. In a statement, the department said it was important to note that the new report says only that “more research is needed, not that medications have been found to be ineffective.”
Sunday, October 14, 2007
Sandra Blakeslee, New York Times
Wednesday, May 12, 1999
The ability to do mathematics -- everything from simple arithmetic to thinking up equations that explain an expanding universe -- may stem from the interaction of two brain circuits that handle numbers differently, two scientists have found.
In a series of experiments involving bilingual college students, the researchers discovered that one circuit gives names to numbers and carries out exact calculations. A second circuit operates intuitively and is used for estimating quantities and other numerical relationships.
During human evolution, they suggest, these two brain areas combined forces and gave rise to the remarkable human capacity for manipulating numbers arithmetically. Their interaction may also underlie many kinds of advanced mathematics.
Dr. Stanislas Dehaene, a neuroscientist at the National Institute for Health and Medical Research (known as Inserm) in Orsay, France, and Dr. Elizabeth Spelke, a cognitive psychologist at the Massachusetts Institute of Technology in Cambridge, Mass., described the results of their study in last week's issue of Science magazine.
Dr. Brian Butterworth, a professor of cognitive neuroscience at University College London, hailed the new work as "virtuoso stuff."
"Their experiments reveal the brain's numerical processes in unprecedented detail," said Butterworth, who is the author of "What Counts: How Every Brain Is Hardwired for Math," to be published in August by the Free Press.
Until recently, little has been known about how the human brain actually does math. In the past, Dehaene said, "the only source of information about the mental representations used in mathematics was the introspection of mathematicians."
Some mathematicians and scientists say language is crucial. But Albert Einstein, for example, once said, "words or language, as they are written or spoken, do not seem to play any role in my mechanism of thought."
More recent experiments have provided hints about where and how mathematical knowledge is embedded in the brain. Studies show that rhesus monkeys have a number sense and that chimps can use symbols for numbers. Birds and rats can count. Human infants can detect changes in the number of objects in an array, suggesting they have a number sense.
At the opposite end of the scale, stroke patients can lose aspects of their ability to do arithmetic, Dehaene said. For example, some cannot decide what number falls between 2 and 4 or whether 9 is closer to 10 or 5, yet they can easily rattle off multiplication tables.
Others cannot decide if 2 + 2 equals 3 or 4 but if asked which number they prefer as an answer -- 3 or 9 -- they choose 3. From these clues, Dehaene postulated that within elementary arithmetic, there are at least two circuits for representing a number. One is language based. It stores tables of exact arithmetic knowledge, like the multiplication tables. The second is independent of language. It represents number magnitudes and has been called a mental ``number line'' used to approximate and manipulate quantities.
To test this idea, Dr. Spelke asked volunteers who were fluent in Russian and English to solve a series of arithmetic problems. One group was schooled in Russian, the other in English. One set of the math problems involved exact calculations: does 53 plus 68 equal 121 or 127? Another set of problems involved approximating answers: is 53 plus 68 closer to 120 or 150?
When approximating answers, both groups performed the same whether they were tested in English or Russian, Dehaene said. But a different pattern emerged for exact calculations. When those taught in Russian were tested in English or vice versa, he said, the volunteers needed up to a full second or more to solve the problem.
The researchers concluded that knowledge about exact problems is stored in a language area because subjects had to translate internally to solve the problem. But in approximating answers, no translation time is required, suggesting that this ability is stored independently from language.
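As an illustration only, the two task types the volunteers faced can be sketched in a few lines of Python. The function names are invented for this sketch, and the numbers are just the examples quoted above; this is not the researchers' test material.

```python
# Toy sketch of the two arithmetic task types described above:
# exact verification ("does 53 + 68 equal 121 or 127?") versus
# approximation ("is 53 + 68 closer to 120 or 150?").

def exact_choice(a, b, options):
    """Pick the option that exactly equals a + b."""
    total = a + b
    return next(opt for opt in options if opt == total)

def approximate_choice(a, b, options):
    """Pick the option nearest to a + b (no exact match is offered)."""
    total = a + b
    return min(options, key=lambda opt: abs(opt - total))

print(exact_choice(53, 68, [121, 127]))        # 121
print(approximate_choice(53, 68, [120, 150]))  # 120
```

The point of the contrast is that the first task demands a retrieved, exact fact (and so, on the researchers' account, a language-coded representation), while the second can be answered from an approximate sense of magnitude alone.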
"I was amazed that the dissociation could be so sharp," Dehaene said. "We presented our subjects with tasks that are superficially extremely similar. Our brains really solve these two tasks in quite different ways."
Later, the researchers gave similar tests in exact and approximate arithmetic to French college students and used imaging techniques to see which areas of their brains were active. When approximating answers, their parietal lobes lit up.
These are regions on both sides of the brain that carry out visual and spatial tasks. It is where the mind makes analogies, guides hand and eye movements, rotates objects mentally and orients attention. When doing exact calculations, on the other hand, the left frontal lobe lit up the most. This is where the brain associates verbs and nouns and carries out other language tasks.
To rule out the possibility that people first use one area and switch to the other, the researchers took another set of brain images using a technique that captures the timing of events in hundredths of a second. They could see that the parietal and frontal circuits operate independently in doing the different kinds of arithmetic problems.
Dehaene said the new findings cannot determine which children are naturally better at math. Nor do they herald big changes in the way math is taught. The sole factor known to predict exceptional expertise in any field, Butterworth said, is hard work.
It looks like humans are "born with a start-up kit for numbers," Butterworth said, and must practice, just as musicians do.
Friday, October 12, 2007
Glaxo Smith Kline R&D
Major depressive disorder (MDD), the most common of the psychiatric disorders, is a serious illness that is associated with high rates of chronicity, relapse, recurrence, and mortality. MDD currently ranks as the fourth leading cause of disability among all medical illnesses, and is expected to rank second by the year 2030 (Murray and Lopez, 1996). It also has potentially life-threatening consequences – suicide is an all-too-frequent outcome – heightening the importance of understanding the pathophysiology of MDD and developing successful treatments.
There is a substantial body of evidence linking the neurotransmitter serotonin (5-HT) to the pathophysiology and treatment of MDD. 5-HT is produced by serotonergic neurons; it has been estimated that several hundred thousand serotonergic neurons are present in the human brain, primarily within discrete nuclei in the brainstem. The highest concentrations of serotonergic neurons reside in the dorsal raphe nucleus in the midbrain. This nucleus innervates the cerebral cortex, thalamus, basal ganglia, and the limbic system (hippocampus, amygdala). There, 5-HT is thought to modulate or “balance” the circuitry between the frontal cortex and limbic system for normal affective function (Stahl, 1996).
Evidence gathered over several decades suggests that alterations in 5-HT function play a role in the etiology of MDD. Asberg et al (1976) examined the concentration of 5-hydroxyindoleacetic acid (5-HIAA), the major metabolite of 5-HT, in the cerebrospinal fluid of depressed patients. The results showed that up to 40% of these patients had significantly lower levels of 5-HIAA compared to normal controls. Importantly, patients showing decreased levels of 5-HIAA were significantly more likely to have attempted suicide. Other studies have demonstrated a significant reduction in 5-HT concentrations of whole brain, hypothalamus, and amygdala in postmortem tissues from depressed patients or suicide victims (Owens and Nemeroff, 1994). These findings suggest that alterations in brain 5-HT availability may significantly alter mood and may play an important role in the etiology of MDD.
Serotonergic neurotransmission appears to be terminated primarily by reuptake of 5-HT via the serotonin reuptake transporter (SERT). This effectively lowers concentrations of 5-HT to levels incapable of maintaining postsynaptic activation. Utilizing [3H]-imipramine and [3H]-paroxetine binding, decreased 5-HT transporter density and uptake capacity has been measured in platelets from depressed patients as well as in postmortem brain tissue from suicide victims (Owens and Nemeroff, 1994). The observation of decreased platelet and brain SERT binding in depression has been a consistent finding, and suggests that altered neuronal 5-HT disposition may play an important role in the pathophysiology of MDD.
Serotonin binds to specific receptors that are widely distributed throughout the brain. The serotonergic receptor family is extraordinarily large. To date, a total of 14 distinct serotonin receptors have been cloned. Each receptor is characterized by a unique structure, pharmacology, expression pattern, and second-messenger effector. Thirteen of these receptors belong to the superfamily of G protein–coupled receptors (linked either with excitatory or inhibitory G proteins). The large number of 5-HT receptor subtypes has posed a challenge to the design of subtype-specific agonists and antagonists (Glennon et al, 2000; Aghajanian and Sanders-Bush, 2002; Frazer and Hensler, 1999; Nestler et al, 2001).
THE 5-HT1A RECEPTOR
The 5-HT1A receptor subtype has been demonstrated to play an important role in both the etiology and treatment of MDD (Stahl, 1996; Blier and Ward, 2003) and will be the focus of the remainder of this article. Both pre- and postsynaptic 5-HT1A receptors have been identified. Presynaptic receptors are located primarily on cell bodies (soma) and input-receiving dendrites (i.e. somatodendritic) of the dorsal raphe and act as inhibitory autoreceptors that exert a negative feedback influence on 5-HT neuronal firing. When these receptors are activated by excess amounts of 5-HT or by an exogenous agonist, they hyperpolarize the neuron, causing it to slow down its firing activity (Nestler et al, 2001; Blier and Ward, 2003; Stahl, 1996). Because 5-HT release is proportional to the firing rate of 5-HT neurons, activation of these autoreceptors reduces postsynaptic 5-HT release; sustained excessive activation, however, leads to desensitization of the 5-HT1A autoreceptors, diminishing the negative feedback influence and ultimately resulting in a gradual normalization of postsynaptic 5-HT release (Blier and Ward, 2003).
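The feedback loop just described — autoreceptor activation brakes firing, and sustained activation desensitizes the autoreceptor so that firing (and hence release) normalizes — can be caricatured in a few lines of Python. All names and constants here are invented for illustration; this is a toy sketch, not a published model.

```python
# Toy simulation of presynaptic 5-HT1A autoreceptor feedback (illustrative
# constants only, not a published model):
# - elevated synaptic 5-HT activates the autoreceptor, which brakes firing;
# - sustained activation slowly desensitizes the autoreceptor (gain decays),
#   so firing, and the 5-HT release proportional to it, gradually normalizes.

def simulate(steps=200, serotonin=2.0, gain=1.0, desens_rate=0.02):
    """Return the neuron's relative firing rate at each time step."""
    rates = []
    for _ in range(steps):
        firing = 1.0 / (1.0 + gain * serotonin)  # autoreceptor brake on firing
        rates.append(firing)
        gain *= 1.0 - desens_rate                # slow desensitization of the brake
    return rates

rates = simulate()
# Firing starts suppressed (about a third of baseline) and climbs back
# toward baseline as the autoreceptor desensitizes.
print(round(rates[0], 2), round(rates[-1], 2))
```

The qualitative shape — an initial drop in output followed by slow recovery as the feedback element desensitizes — is the pattern the cited pharmacology attributes to chronic SSRI or 5-HT1A agonist treatment.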
Animal studies have established that presynaptic 5-HT1A somatodendritic receptors of the dorsal raphe influence the postsynaptic release of 5-HT into the prefrontal cortex (Stahl, 1996). Changes in the activity or density of these autoreceptors may therefore alter serotonergic release in the prefrontal cortex and thereby play a significant clinical role in MDD. Stockmeier et al (1998) examined the density of 5-HT1A autoreceptors in the dorsal raphe of suicide victims with MDD. [3H]-8-OH-DPAT (a selective 5-HT1A ligand) binding was shown to be significantly increased in the dorsal raphe of depressed suicide victims compared to psychiatrically normal control subjects. This increase in 5-HT1A autoreceptors might act to abnormally dampen 5-HT neuronal cell firing and subsequent postsynaptic release of 5-HT. An increase in 5-HT1A autoreceptors results in the short term in a decrease of postsynaptic 5-HT release to the areas where these neurons project (e.g., cortex, limbic system) (Stahl, 1996; Nestler et al, 2001). However, longer term activation of presynaptic 5-HT1A-mediated negative feedback in MDD is also consistent with the hypothesis that the therapeutic effect of antidepressants may involve long term physiological desensitization (or disinhibition) of this negative feedback influence on postsynaptic 5-HT output.
Positron emission tomography (PET) can now be employed to study 5-HT1A receptor distribution in the human brain. In addition to their localization in the raphe nucleus, 5-HT1A receptors have a distribution in cortical and limbic regions that roughly approximates that of the SERT (Figure 1). PET studies in humans with MDD have also shown decreases in postsynaptic 5-HT1A binding in several cortical regions (mediotemporal, orbitofrontal, anterior cingulate, insula, and dorsolateral prefrontal cortex) (Shively et al, 2006).
Several investigators have also reported an increased density of postsynaptic 5-HT2 receptor binding sites in the frontal cortices of depressed suicide victims and unmedicated depressed patients. Up-regulation of cortical 5-HT2 receptors in MDD has been suggested to signal an adaptive response to reduced postsynaptic 5-HT output to these regions (Owens and Nemeroff, 1994).
Currently, selective serotonin reuptake inhibitors (SSRIs) are the most commonly prescribed medications for the treatment of MDD. Significant progress has been made in understanding the mechanisms by which these agents exert their antidepressant effects. Evidence gathered to date supports the following sequence of events in response to administration of an SSRI (Stahl, 1996):
Approximately 40 to 50% of MDD patients report sexual dysfunction (Kennedy, 1999). Antidepressant treatments, particularly SSRIs, further compound this problem. Sexual dysfunction, such as loss of libido, impotence, ejaculatory problems, and anorgasmia, has been reported for most classes of antidepressants, with 30-70% of patients reporting treatment-related sexual dysfunction (Kennedy, 2000). In some cases, permanent deleterious effects on sexual functioning have been reported (Meston, 2004). The sexual side effects of SSRIs appear to be mediated by overactivation of 5-HT2A and 5-HT2C receptors.
Changes in appetite have also been shown to occur with the use of SSRIs, which may be manifest as significant changes in body weight. Preclinical studies have linked the effect of SSRIs on appetite to 5-HT2C receptors. Nausea has also been shown to be a major consequence of SSRI use and is mediated by the binding of 5-HT to 5-HT3 receptors (Stahl, 1996). As with all antidepressants, the adverse effects associated with SSRI use may have a significant negative impact on patient compliance, thus reducing their effectiveness in the treatment of MDD.
Selective 5-HT1A agonists, such as the azapyrones, appear to represent another serotonergic treatment strategy for MDD. Neurochemical studies of this drug class confirm their binding as agonists to both presynaptic and postsynaptic 5-HT1A receptor binding sites in animals and humans. Preclinical and clinical data also indicate antidepressant efficacy. Evidence gathered to date supports the following sequence of events in response to administration of 5-HT1A agonists (Blier and Ward, 2003):
Preclinical data suggest that activation of postsynaptic 5-HT1A receptors is central to antidepressant efficacy. Animal behavioral models of stress and depression have consistently shown that selective activation of postsynaptic 5-HT1A receptors produces effects similar to those elicited by SSRIs and other antidepressants. The phenomenon of hippocampal neurogenesis (sprouting of new neurons), which has been shown to occur in response to various types of antidepressant treatments, appears to be mediated by activation of postsynaptic 5-HT1A receptors (Blier and Ward, 2003; Haddjeri et al, 1998). Another potentially important consequence of the activation of postsynaptic 5-HT1A receptors is a reduction in the density of cortical 5-HT2 receptor binding sites. This effect has been linked with an improvement in sleep architecture and with anxiolysis (Blier and Ward, 2003).
Although 5-HT1A agonists and SSRIs have different primary mechanisms of action, they appear to induce similar adaptive changes in the brain, probably as a consequence of their common downstream effects on postsynaptic 5-HT release. Both presynaptic 5-HT1A autoreceptors in the dorsal raphe and postsynaptic 5-HT1A receptors in cortical and limbic areas appear to play a key role in the therapeutic effects of these drugs. The similarity of adaptive changes seen with both mechanisms, coupled with the greater selectivity of 5-HT1A agonists, suggests that the latter could represent a more targeted approach to the treatment of patients with MDD.
- Aghajanian GK, Sanders-Bush E. Serotonin. In: Davis KL, Charney D, Coyle JT, Nemeroff C, eds. Neuropsychopharmacology: The Fifth Generation of Progress. Philadelphia, PA: Lippincott Williams & Wilkins; 2002:15-34.
- Asberg M, Traskman L, Thoren P. 5-HIAA in the cerebrospinal fluid: a biochemical suicide predictor? Arch Gen Psychiatry. 1976;33(10):1193-1197.
- Blier P, Ward NM. Is there a role for 5-HT1A agonists in the treatment of depression? Biol Psychiatry. 2003;53(3):193-203.
- Frazer A, Hensler JG. Serotonin receptors. In: Siegel GJ, Agranoff BW, Albers RW, Fisher SK, Uhler MD, eds. Basic Neurochemistry: Molecular, Cellular and Medical Aspects. 6th ed. Philadelphia, PA: Lippincott Williams & Wilkins; 1999:263-292.
- Glennon RA, Dukat M, Westkaemper RB. Serotonin receptor subtypes and ligands. In: Bloom FE, Kupfer DJ, eds. Psychopharmacology: The Fourth Generation of Progress. New York, NY: Raven Press; 1995:415-429.
- Haddjeri N, Blier P, de Montigny C. Long-term antidepressant treatments result in a tonic activation of forebrain 5-HT1A receptors. J Neurosci. 1998;18(23):10150-10156.
- Kennedy SH, Dickens SE, Eisfeld BS, Bagby RM. Sexual dysfunction before antidepressant therapy in major depression. J Affect Disord 1999;56:201-8.
- Kennedy SH, Eisfeld BS, Dickens SE, Bacchiochi JR, Bagby RM. Antidepressant-induced sexual dysfunction during treatment with moclobemide, paroxetine, sertraline, and venlafaxine. J Clin Psychiatry. 2000;61:276-281.
- Meston CM. A randomized, placebo-controlled, crossover study of ephedrine for SSRI-induced female sexual dysfunction. J Sex Marital Ther. 2004;30(2):57-68.
- Murray CJ, Lopez AD. The Global Burden of Disease: A Comprehensive Assessment of Mortality and Disability from Diseases, Injuries, and Risk Factors in 1990 and Projected to 2020. Cambridge: Harvard University Press; 1996.
- Nestler E, Hyman S, Malenka R. Molecular Neuropharmacology: A Foundation for Clinical Neuroscience. New York, NY: McGraw-Hill; 2001.
- Owens MJ, Nemeroff CB. Role of serotonin in the pathophysiology of depression: focus on the serotonin transporter. Clin Chem. 1994;40(2):288-295.
- Shively CA, Friedman DP, Gage D, et al. Behavioral depression and positron emission tomography–determined serotonin 1A receptor binding potential in cynomolgus monkeys. Arch Gen Psychiatry. 2006;63(4):396-403.
- Stahl SM. Essential Psychopharmacology: Neuroscientific Basis and Clinical Applications. Cambridge, UK: Cambridge University Press; 1996.
- Stockmeier CA, Shapiro LA, Dilley GE, Kolli TN, Friedman L, Rajkowska G. Increase in serotonin-1A autoreceptors in the midbrain of suicide victims with major depression—postmortem evidence for decreased serotonin activity. J Neurosci. 1998;18(18):7394-7401.