Monday, October 29, 2007

Memory Decline Begins Later But Progresses More Quickly In The More Educated

Written by: Catharine Paddock
Medical News Today

24 Oct 2007

A new US study suggests that people with more years of formal education experience the onset of dementia-related memory decline later, but that once the decline starts, it progresses more rapidly than in people with fewer years of education.

The study is published in the 23 October issue of the journal Neurology and is the work of Dr Charles B Hall of the Albert Einstein College of Medicine in the Bronx, New York, and colleagues.

Hall said:

"Higher levels of education delay the onset of dementia, but once it begins, the accelerated memory loss is more rapid in people with more education."

He and his colleagues found that a person with 16 years of formal education experienced a 50 per cent faster rate of memory decline than someone with only 4 years of education.

The researchers wanted to test the cognitive reserve hypothesis. This suggests that people with more years of education have a reserve of cognitive ability that masks the onset of memory decline, often without the person realising it; for instance, they may use thinking skills to work out answers to memory tests. This could explain why low education is a well-known risk factor for Alzheimer's Disease (AD), a condition characterised by accelerated memory decline.

Hall and colleagues followed 488 people in the Bronx Aging Study, of whom 117 developed dementia. The participants underwent detailed assessment of their cognitive skills at the start of the study and again every year during the 6 years of follow-up, when they completed a memory test called the Buschke Selective Reminding Test. Their formal education ranged from fewer than 3 years of elementary school to completion of postgraduate education.

The researchers estimated the point at which the rate of cognitive decline began to accelerate (the "change point", as they called it), and the rates of decline before and after this point.

They found that every additional year of formal education delayed the change point by 0.21 years (2.5 months). After the change point, the rate of memory decline increased by 0.10 points for each year of additional formal education. This translated to a 4 per cent faster rate of decline for each additional year of formal education.

The researchers gave an example. For a college graduate with 16 years of formal education who is diagnosed with dementia at age 85, the accelerated memory decline would have begun about four years before diagnosis, at around age 81. For a person with only 4 years of formal education diagnosed at the same age, the accelerated decline, though progressing more slowly, would have begun six years before diagnosis, at age 79.
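The arithmetic behind this example can be made concrete. Below is a minimal sketch in Python, assuming the reported 0.21 years of delay per additional year of education and anchoring the 4-years-of-education case to the six-year interval from the article's example; the function and the anchor value are illustrative, not the researchers' actual statistical model.

    # Sketch of the change-point arithmetic reported above. The 0.21-year
    # delay per year of education and the six-year interval for a person
    # with 4 years of education come from the article; everything else is
    # illustrative.
    DELAY_PER_EDUCATION_YEAR = 0.21

    def change_point_age(diagnosis_age, education_years):
        """Estimate the age at which accelerated memory decline begins."""
        # Interval between change point and diagnosis, anchored at 6 years
        # for someone with 4 years of education.
        interval = 6.0 - DELAY_PER_EDUCATION_YEAR * (education_years - 4)
        return diagnosis_age - interval

    print(change_point_age(85, 4))   # 79.0: six years before diagnosis
    print(change_point_age(85, 16))  # 81.52: about four years before; the article rounds to age 81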

This echoed previous research findings showing that people with more years of education suffer memory loss more quickly once they are diagnosed with dementia, the researchers wrote.

The researchers concluded that:

"As predicted by the cognitive reserve hypothesis, higher education delays the onset of accelerated cognitive decline; once it begins it is more rapid in persons with more education."

Commenting on the findings, Hall said:

"This rapid decline may be explained by how people with more education have a greater cognitive reserve, or the brain's ability to maintain function in spite of damage."

"So while they're often diagnosed with dementia at a later date, once the cognitive reserve is no longer able to compensate for the damage that's occurred, then the symptoms emerge," he added.

The researchers noted that while their study was important as the first to test the cognitive reserve hypothesis in people with preclinical dementia, the participants were born around the turn of the 20th century, and their life experiences and education may not be representative of people in education today.

"Education delays accelerated decline on a memory test in persons who develop dementia."
Hall, C. B., Derby, C., LeValley, A., Katz, M. J., Verghese, J., Lipton, R. B.
Neurology 2007 69: 1657-1664

Sunday, October 28, 2007

Tapping Brains for Future Crimes

By: Jennifer Granick
Wired

A team of neuroscientists announced a scientific breakthrough last week in the use of brain scans to discover what's on someone's mind.

Researchers from the Max Planck Institute for Human Cognitive and Brain Sciences, along with scientists from London and Tokyo, asked subjects to secretly decide in advance whether to add or subtract two numbers they would later be shown. Using computer algorithms and functional magnetic resonance imaging, or fMRI, the scientists were able to determine with 70 percent accuracy what the participants' intentions were, even before they were shown the numbers.

The study used "multivariate pattern recognition" to identify oxygen flow in the brain that occurs in association with specific thoughts. The researchers trained a computer to recognize these flow patterns and to extrapolate from what it had learned to accurately read intentions.

The finding raises issues about the application of such tools for screening suspected terrorists -- as well as for predicting future dangerousness more generally. Are we closer than ever to the crime-prediction technology of Minority Report?

As I've argued in this space before, the popular press tends to over-dramatize scientific advances in mind reading. FMRI results have to account for heart rate, respiration, motion and a number of other factors that might all cause variance in the signal. Also, individual brains differ, so scientists need to study a subject's patterns before they can train a computer to identify those patterns or make predictions.

While the details of this particular study are not yet published, the subjects' limited options of either adding or subtracting the numbers meant the computer already had a 50/50 chance of guessing correctly even without fMRI readings. The researchers indisputably made physiological findings that are significant for future experiments, but we're still a long way from mind reading.
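How impressive 70 percent is against that 50/50 baseline depends on the number of trials, which the article does not report. The quick binomial check below uses a hypothetical n = 100 purely for illustration: with that many trials, 70 percent is very unlikely to arise from guessing alone.

    # Probability of scoring at least 70% by pure guessing on a two-choice
    # task. The trial count n = 100 is hypothetical; the article gives none.
    from math import comb

    n, correct = 100, 70
    p_chance = sum(comb(n, k) for k in range(correct, n + 1)) / 2**n
    print(f"P(>= {correct}/{n} correct by guessing) = {p_chance:.1e}")  # ~3.9e-05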

Still, the more we learn about how the brain operates, the more predictable human beings seem to become. In the Dec. 19, 2006, issue of The Economist, an article questioned the scientific validity of the notion of free will: Individuals with particular congenital genetic characteristics are predisposed, if not predestined, to violence.

Studies have shown that genes and organic factors like frontal lobe impairments, low serotonin levels and dopamine receptors are highly correlated with criminal behavior. Studies of twins show that heredity is a major factor in criminal conduct. While no one gene may make you a criminal, a mixture of biological factors, exacerbated by environmental conditions, may well do so.

Looking at scientific advances like these, legal scholars are beginning to question the foundational principles of our criminal justice system.

For example, University of Florida law professor Christopher Slobogin, who is visiting at Stanford this year, has set forth a compelling case for putting prevention before retribution in criminal justice.

Two weeks ago, Slobogin gave a talk based on his book, Minding Justice. He pointed to the studies showing that our behavior is predetermined or strongly influenced by biology, and that if we can identify those biological factors, we can predict behavior. He argues that the justice system should provide treatment for potential wrongdoers based on predictions of dangerousness instead of settling for punishing them after the fact.

It's a tempting thought. If there is no such thing as free will, then a system that punishes transgressive behavior as a matter of moral condemnation does not make a lot of sense. It's compelling to contemplate a system that manages and reduces the risk of criminal behavior in the first place.

Yet, despite last week's announcement from the Max Planck Institute, neuroscience and bioscience are not at a point where we can reliably predict human behavior. To me, that's the most powerful objection to a preventative justice system -- if we aren't particularly good at predicting future behavior, we risk criminalizing the innocent.

We aren't particularly good at rehabilitation, either, so even if we were sufficiently accurate in identifying future offenders, we wouldn't really know what to do with them.

Nor is society ready to deal with the ethical and practical problems posed by a system that classifies and categorizes people based on oxygen flow, genetics and environmental factors that are correlated as much with poverty as with future criminality.

"Minority Report," a short story by Philip K. Dick that became the 2002 Steven Spielberg blockbuster, portrays a society where investigators can learn in advance that someone will commit a murder, even before the offender himself knows what he will do. Gattaca, a
1997 film, tells of a society that discriminates against genetically "inferior" individuals.

Science fiction has long grappled with the question of how a society that can predict future behavior should act. The stories suggest that people are more varied, more free, than the computer models allow. They also suggest that a system based on predictions is subject not only to innocent mistakes but also to malicious manipulation at a level far greater than our current punishment-oriented regime.

In time, neuroscience may produce reliable behavior predictions. But until then, we should take the lessons of science fiction to heart when deciding how to use new predictive techniques.

Where the Brain Makes Decisions

By Dermot McGrath

LONDON -- They are life's perennial questions: Pepsi or Coke, Mac or PC, cremation or burial?

Scientists in the United Kingdom believe they may be close to unraveling some of the brain processes that ultimately dictate the choices we make as consumers.

Using a revolutionary method of imaging the brain, researchers from the Open University (OU) and the London Business School say they have identified the brain region that becomes active as the shopper reaches toward the supermarket shelf to make their final choice.

If that sounds a frivolous finding, the scientists involved insist their work has serious applications.

"How people use their brains to learn, record and store memory is a fundamental question in contemporary neuroscience," said Steven Rose, director of the brain and behavior research group and professor of biology at the Open University.

Beyond the local supermarket, Rose and his team believe the findings could go on to show the brain processes behind the conscious decisions people make when it comes to important life choices, such as selecting a partner or career. The research may also one day help manufacturers and marketers shape advertising and branding strategies for their products.

The scientists used a technique known as magnetoencephalography (MEG) -- the fastest of all scanner methods -- to measure the minute magnetic fields around the brain and identify regions that are active during the second or so it takes for a person to make their shopping choice.

Subjects were taken on an 18-minute virtual tour of a supermarket. During regular pauses in the tour, they were asked to make a choice between different brands and products on the shelves by pressing a button.

Researchers found that the brain was hugely active during the 2.5 seconds it took for the button press to be made.

"Within 80 milliseconds their visual cortex responds as they perceive the choice items," Rose said. "A little later, regions of the brain associated with memory and speech become active."

More interesting for the researchers, however, was what happened as consumers made their final choice.

"After about 800 milliseconds -- and this was the surprising thing -- if and only if they really prefer one of the choice items, a region called the right parietal cortex becomes active. This then is the region of the brain involved in making conscious decisions -- in this experiment about shopping choices, but maybe for more important life choices too," Rose said.

Other scientists gave a guarded welcome to the findings of the OU researchers -- Rose, Professor Stephen Swithenby, Dr. Sven Braeutigam and Dr. John Stins -- who will publish their findings in the next issue of the journal Neural Plasticity.

Michael Miller, professor and chair in the department of neuroscience and physiology at SUNY Upstate Medical University, said that MEG scans provided a unique insight into the real-time activity of brain regions in humans and animals. "They are very insightful. There is a growing literature about the role of the prefrontal cortex and other areas that are involved in volitional activities," he said.

Dr. Wise Young, director of the W.M. Keck Center for Collaborative Neuroscience and a professor at Rutgers University, said that the finding was "interesting" although not particularly surprising or groundbreaking.

"The parietal cortex has been conjectured as one of the sites of decision-making in the brain. Because that part of the brain is showing activity during decision-making does not necessarily mean that the decision is actually made in that part of the brain. Also, the authors examined only several sites on the brain and there may be parts of the brain that were activated but they did not record," he said.

Young said that for centuries scientists have struggled with the question of whether function is localized or distributed in the brain.

"Much data suggests that many functions are quite broadly distributed, the brain is quite plastic and quite remarkable recovery can occur after injury to large parts of the brain. On the other hand, it is also clear that some areas are more important for some functions than others," Young said.

Dr. Susan Bookheimer, director of the brain mapping center's behavioral testing and training laboratory at UCLA, was more skeptical. "When I first read this, I thought it was a joke. It is not an interesting finding," she said. "The investigators have added all this supermarket stuff to make it more appealing to a general audience, but you'd find the same thing if you just used pictures of dots, for example."

While Bookheimer agrees there are areas of the brain responsible for making choices, she believes the real story is much more complex than that presented by the OU study.

"We must integrate knowledge before we make a choice," she said. "Many, not all choices, involve an emotional component controlled by a different area of the brain. Others require integrating information from multiple choices. We have to generate a response, search available responses and then initiate a plan to demonstrate the choice. It is likely that only this last process was involved in the study here -– the process of directing one's reach toward the goal."

Scientists Pinpoint Optimism Center of Brain

Thursday, October 25, 2007
AP - WASHINGTON

A person's optimism about the future seems to be controlled by a small front part of the mid-brain, according to a study that used brain imaging.

That area deep behind the eyes activates when people think good thoughts about what might happen in the future.

The more optimistic a person is, the brighter the area showed up in brain scans, the scientists reported in a small study published online Thursday in the journal Nature.

That same part of the brain, called the rostral anterior cingulate cortex (rACC), seems to malfunction in people suffering depression, said the study co-authors, Elizabeth Phelps of New York University and Tali Sharot of University College London.

Researchers gave 15 people functional magnetic resonance imaging scans while they thought about future possibilities.

When the participants thought about good events, both the rACC and the amygdala, which is involved in emotional responses including fear, were activated. But the correlation with optimism was strongest in the rACC.

The same study also found that people tended to think that happier events were closer in time and more vivid than the bad ones, even if they had no reason to believe it, Phelps said.

Psychologists have long known people have an "optimism bias," but the new study offers new details.

When researchers asked the subjects to think about 80 different future events that could be good, bad or neutral, they had a hard time getting people to think negatively, or even neutrally, about the future.

For example, when people were asked to ponder a future haircut, they imagined getting the best haircut of their lives, instead of just an ordinary trim, Phelps said.

The study makes sense and pulls together new and different parts of research on optimism and the brain, said Dan Schacter, a professor of psychology at Harvard University who wasn't part of the research.

Having our brains wired to optimism is generally a good thing because "if you were pessimistic about the future you would not be motivated to take a lot of action," Phelps said.

Depression, anxiety tied to allergies in kids

NEW YORK (Reuters Health) - Research in psychiatrically ill children and adolescents suggests that those with depression, anxiety and other so-called "internalizing" disorders are more likely to have allergies.

Among a sample of 184 young people being evaluated for psychiatric disorders and allergies, 105 (57 percent) had a history of allergic disorders, including asthma, hay fever, hives and eczema.

Psychiatric evaluations revealed that 124 (67 percent) had an internalizing disorder, either alone or in combination with an externalizing disorder, such as ADHD, oppositional defiant disorder and conduct disorder. The children in the sample were between 4 and 20 years old; their average age was 13.

Researchers found that youths with internalizing disorders were almost twice as likely to have a history of allergies as those with a diagnosis that wasn't classified as an internalizing or externalizing disorder. The psychiatric disorders in this comparison group included substance abuse, tic disorders, bed-wetting and attachment disorder.

Moreover, the association was found to be specific for "pure" internalizing disorders. That is, the likelihood of having a history of allergies was significant only among youths who had an internalizing disorder and no other psychiatric conditions.
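The report gives relative likelihoods rather than raw counts, but the arithmetic behind a claim like "almost twice as likely" is the standard odds ratio from a two-by-two table. The counts below are invented purely to illustrate the calculation; the study itself reports only percentages.

    # Odds-ratio sketch behind "almost twice as likely". The counts are
    # hypothetical.
    def odds_ratio(cases_a, noncases_a, cases_b, noncases_b):
        """Odds of allergy in group A relative to group B."""
        return (cases_a / noncases_a) / (cases_b / noncases_b)

    # Hypothetical: 70 of 110 youths with pure internalizing disorders had
    # allergies, versus 25 of 60 in the comparison group.
    print(odds_ratio(70, 40, 25, 35))  # ~2.45: roughly double the odds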

"These findings add to the growing body of evidence supporting an association between anxiety, depressive, and allergic disorders," write Dr. Mauricio Infante and colleagues from University of
Wisconsin, Madison in the Journal of Clinical Psychiatry.

The findings also suggest that these psychiatric and medical disorders "may share risk factors and underlying pathways that contribute to the development of both types of disorders."

The Wisconsin team notes that studies are needed to identify the reasons for these associations so that effective treatment and prevention strategies that target both disorders can be developed.

SOURCE: Journal of Clinical Psychiatry, September 2007.

Monday, October 22, 2007

Upsetting Psychotherapy

Scientific American
September 21, 2005

Pressure from insurance companies and competition from drug therapies are prompting analysts to get patients off the couch more quickly

By Jamie Talan

Wendy spent five years in psychoanalysis, delving so deeply into her
mind that she could no longer see the connection between her adult
problems and her teenage episodes of "cutting" her wrists. After she
and her analyst had their final session, during which he welcomed her
to move on with her life, Wendy was not completely happy, but she was
happier than she ever had been. And that, psychologists say, is
successful therapy.

Psychoanalysis probes the unconscious mind to unlock the mysteries
that drive conscious emotions and behavior. The discipline is built on
pillars set by Sigmund Freud a century ago. It is characterized by
frequent sessions that can take place over many years, wherein
patients are encouraged to freely associate whatever comes to mind as
the analyst sits quietly and listens.

Today the practice is changing. The transformation is in part the
result of a better understanding of what works during self-analysis.
But increasingly, psychotherapy is changing just to survive, held
hostage to limits on insurance coverage determined by managed care
companies and facing replacement by psychoactive drugs that in the
long run are far cheaper than a patient's weekly visit to the
therapist's office. In this incarnation, it suddenly matters less that
symptoms may disappear without patients figuring out the underlying cause.

Harsh Reality
To keep psychoanalysis alive, contemporary therapists are revamping
Freud's theories. They have discarded some traditional beliefs and
have loosened requirements so patients can succeed in fewer sessions.
Many analysts are even talking to their patients and sharing their own
thoughts and feelings, a practice that Freud said would complicate the
treatment process.

Some experts chafe at the changes, however. They say that short-term
therapy can be successful for some problems such as phobias but does
not work for personality disorders, chronic depression and other
substantial mental illnesses. They claim that managed care companies
make decisions based on cost, not on any science that shows what works
best for a specific condition. Insurance companies argue that patients
can do just as well on medication as they can with talk therapy and
that for talk, "short term" is enough.

Extended analysis certainly is under siege. Today patients having
long-term psychotherapy--more than 20 sessions--account for only 15
percent of those who seek treatment, according to a study in the
American Journal of Psychiatry. Psychoanalysts contend that it takes
longer to work out issues that have been shaped by a lifetime of
emotion and experience, yet they know they must compete in a
magic-pill era in which people may be content to have their symptoms
disappear without much thought to why they emerged in the first place.

"A better understanding of the self is needed for a better recovery,"
asserts Gail Saltz, a Manhattan analyst and author of Becoming Real
(Riverhead Trade, 2005), a book about the benefits of analysis. She
says that there are still people who lie on the couch four times a
week, but many analysts have accepted a once-a-week regimen. And
although studies have shown that certain patients progress better when
therapy is frequent, Saltz believes once a week can still be
successful. Psychologists have at least agreed that even long-term
analysis should be completed within four years.

Regardless of frequency, Saltz says, the goal is to help patients
"better tolerate the ups and downs of life" or, as Freud put it, "get
beyond everyday human misery." Freud developed his ideas before
scientists knew much about the brain's workings, however, and today
some of his once popular theories about human development are seen as
simply wrong.

High on the list is that infants have complicated sexual desires.
Peter D. Kramer, a Massachusetts psychiatrist who popularized the new
generation of antidepressants in his best-selling book Listening to
Prozac (Penguin, 1993), says that "there is no evidence that infants
have sexual desires." Kramer notes that although Freud believed that
adult complaints of childhood sexual abuse stemmed from such childhood
fantasies, the evidence today is plain that sexual abuse of children
is common, affecting up to 20 percent of girls and 10 percent of boys.

Freud also had little to offer the therapist in understanding trauma,
which experts now know can cause lifelong problems. Trauma therapy is
a relatively new field, built on work with war veterans.
Post-traumatic stress disorder is a hot topic in psychotherapy today,
one that was poorly addressed before, Kramer notes, because it was not
possible to have effective treatment when the theoretical
underpinnings were shaky.

Friend, Not Father
Readdressing the basic tenets of psychoanalysis has led to perhaps the
most radical change of all: modern psychologists are actually talking
to their patients. Freud's original "transference" theory demanded
that an analyst remain quiet and aloof so as to serve as a "screen"
onto which the patient could project her emotions. But therapists are
now sharing more of themselves. "How can I remain opaque when my
clients can go online and learn that I love Greek music?" asks
psychoanalyst Spyros D. Orfanos, clinic director in psychoanalysis at
New York University.

Orfanos says that today's analyst is not an authoritative father
figure but a partner in figuring out "the powerful emotional forces
that drive behavior." He thinks that having a dialog with a patient
is the best way to work toward change. Many analysts also now agree
that empathy is key to the relationship, and empathy requires
engagement, not just listening.

Psychoanalysis is also changing in the face of steady competition from
other forms of help, such as cognitive behavioral therapy, in which
patients try to change certain troubling behaviors, and goal-oriented
therapy, which lays out ways to attain, say, certain kinds of
relationships. These practices may or may not touch on the patient's
past. And to hold its own, psychoanalysis is shedding its image as a
privileged treatment for the wealthy; so-called training centers that
provide low-cost appointments are popping up everywhere.

Scientists are also attempting to study the biology of the analysis
process itself. At New York-Presbyterian Hospital/Weill Cornell
Medical Center, psychiatrists Otto F. Kernberg and David A.
Silbersweig are recording brain scans of patients before and after
analysis. Such studies may help end the debate over the effectiveness
of lengthy treatment, notes Kramer, who recently published Against
Depression (Viking Adult, 2005), an assessment of mood disorders. "We
don't know what works or what doesn't work."

Orfanos is dubious about scanning, maintaining that analysis is a
humanistic endeavor that does not necessarily fit into a biology-based
medical model. "It's about understanding how your mind works," he
says, "so that you can have more choices in your life."

Crime and Punishment: Why Do We Conform to Society?

Scientific American

October 05, 2007

A pair of brain regions work together to assess the threat of punishment and override our selfish tendencies

Whether you subscribe to the Ten Commandments, the Golden Rule or some instinctive moral code, society functions largely because most of its denizens adhere to a set of norms that allow them to live together in relative tranquility.

But why is it that we put a vast amount of social resources into keeping stealing, murdering and other unfair (not to mention violent and illegal) acts to a minimum? Seems it all comes down to the fact that most of us don't cotton to being punished by our peers.

"The reason why punishment for norm violations is important is that it disciplines the potential norm violators," says Ernst Fehr, an economist at the University of Zurich and the senior author of a paper on the issue published this week in Neuron.

In the new study, Fehr and colleagues uncovered activity in two areas of the brain underlying the neural mechanism involved in conforming to society's values. They further determined that subjects with Machiavellian personalities—a strong sense of self-interest, opportunism and manipulation—have heightened activity in one of these regions, which the authors believe is related to assessing the threat of punishment.

During the study, which also involved scientists at the University of Ulm in Germany, 23 male students were instructed to play a version of the "ultimatum game" while their brains were scanned via functional magnetic resonance imaging (fMRI). Each participant was given a sum of money (100 monetary units) to split however he chose with an anonymous partner. In some cases the recipient simply had to accept any offer made. Other times, after an offer was made, the recipient had the option to penalize the giver by taking some or all of the giver's money if the latter had not shared generously.

The subjects' brains were only scanned when they played the giver role. Before each trial, both players were told whether the recipient would be allowed to exact a punishment if he felt he got too slim a slice of the pie. Two areas of the cortex (the brain's primary processing unit) were particularly active during the trials when punishment was an option: the lateral orbitofrontal cortex, a region below the temples of the head that had, in previous research, been implicated in processing a threat stimulus, and a section just behind it called the dorsolateral prefrontal cortex.
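The incentive structure of the task is simple enough to sketch in a few lines of Python. The 100-unit endowment and the optional punishment stage follow the article's description; the specific punishment rule and the recipient's 40-unit resentment threshold are simplifying assumptions, since the article does not spell them out.

    # Toy model of the modified ultimatum game described above. The 100-unit
    # endowment matches the article; the punishment rule (a slighted
    # recipient wipes out the giver's share) and the threshold are assumed.
    ENDOWMENT = 100

    def giver_payoff(offer, punishment_possible, threshold=40):
        """Giver's payoff after one round of the game."""
        kept = ENDOWMENT - offer
        if punishment_possible and offer < threshold:
            kept = 0  # the recipient retaliates against a stingy offer
        return kept

    # A "Machiavellian" strategy: offer nothing when safe, just enough otherwise.
    print(giver_payoff(0, punishment_possible=False))  # 100: keep everything
    print(giver_payoff(40, punishment_possible=True))  # 60: share just enough
    print(giver_payoff(0, punishment_possible=True))   # 0: stinginess punished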

"The lateral orbitofrontal cortex [activity] represents the punishment threat here," says Fehr, citing previous research that fingered it in threat assessment. "More specifically, how bad does the brain interpret this punishment threat?"

Alternatively, he says, "[the dorsolateral prefrontal cortex] is an area that is involved in cognitive control and overriding prepotent impulses. Here, we have a design where the prepotent impulse is not to share the money—at least to the extent that player B wants it shared."

Interestingly, the research team also had their subjects fill out a questionnaire to determine their degree of Machiavellian behavior. Those who proved to be the most ruthless of the bunch offered little to nothing when there was no threat of punishment, but within the punishment paradigm, they were generous enough to stave off retribution.

"These are socially intelligent, selfish people," Fehr says about the more calculating subjects. "They escape the punishments that are inherent in social interactions, because they seem to have a fine sense of when punishment is in the air."

Jorge Moll, principal investigator of the cognitive and behavioral neuroscience unit at the Rede Labs-D'Or Hospitals in Rio de Janeiro, says the most interesting findings were that individual scores on Machiavellianism predicted "how much a given subject will change his behavior depending on the presence of punishment," and "that the level of activity within the lateral orbitofrontal cortex is strongly related to Machiavellian personality style."

Researchers say the results could have wide-reaching implications, potentially paving the way to understand—and perhaps one day reverse—the neurobiology behind psychopathic and sociopathic personalities. They intend to repeat the study with patients suffering from antisocial anxiety and personality disorders to determine if their behavior can be explained by a lack of impulse control or a poor assessment of punishment.

Fehr argues the results could also impact the criminal justice system since the dorsolateral prefrontal cortex does not fully develop until after a person is around 20 years old.

"This area seems to be critically important in overriding self-interest," he says. Thus, "you just can't treat an immature adolescent the same way as a mature adult—that's at least my view of doing justice." It's unclear whether judges and juries see it that way, however.