By William Todd Schultz
May 29, 2009
A recent post on "real psychology"--as opposed to all the fake or unreal psychology out there--got me thinking. The day we all decide on what real psychology is, is the day psychology dies. Real psychology equals dead-ended, myopic oversimplification--of subject matter and of methodology. Real psychology is a means-centered approach. That is to say: only psychologists making use of prescribed, narrowly defined "scientistic" methods are allowed into the fold. All others are touchy-feely, hopelessly subjective trespassers. Such a stance is 1) naïve, 2) unhistorical, and 3) regressive.
The study of the mind goes way back, of course, but let's just look at the 20th century. We had Wundt and his "experimental introspection," research into things like reaction time. We had the wonderfully overreaching brilliance of William James, who was into the same things as Wundt--attention, memory, sensation--but also psychic phenomena, religious experience, and philosophy and art. We had Freud and psychoanalysis. We had Jung and his association experiments. Then there were the biologically reductionistic doings of psychiatry that led, by the 1950s, to seizure therapies and lobotomy. Skinner's radical behaviorism had its day, followed by the cognitive revolution and, in time, by neuroscience. Lots always going on, in other words, from lots of different angles. Methodologically speaking, there was case study, experimentation, introspection, animal behavior, surveys, projective techniques, dream analysis, phenomenology, lesion studies--the list goes on and on. Methodological pluralism was/is the norm. But still today, let's face it, psychology is more or less in the Stone Age. No doubt much has been accomplished. Powerful mid-level theories do exist that are promisingly predictive. But as for the great big questions, those enduring mysteries, we've taken only very small steps. We still don't know why we dream. We still don't know what causes schizophrenia. We still can't make solid sense of the function of consciousness. So let's not start proclaiming what real psychology is. Better to keep that question helpfully unanswered.
Psychology's disorder now is multiple personality, and in a way that's fine. What we've got is something like 60 sub-disciplines leaving in their wake a farrago of sub sub-disciplines. Each sub-discipline is pretty insular, there is little harmony overall (far more cacophony), and what's especially funny is this: every sub-discipline tends to believe--according to an in-group, out-group dynamic--that it is THE ONLY ONE DOING REAL PSYCHOLOGY. In fact, each is focusing on its own little hiccup of mind, its own pet variables, while mainly neglecting the questions other sub-disciplines find so essential. So each sub-discipline inflates the importance of its methods/questions while devaluing the methods/questions of other sub-disciplines. That attitude was on hair-raising display in the post cited at the top of this one.
Take my situation. I have a PhD in Personality from UC Davis. Presently, with some important exceptions, Social Psychologists sometimes devalue Personality Science while Personality Psychologists sometimes devalue Social Psychology. I like to think of this as the narcissism of minor differences, but that's another subject altogether.
I also do qualitative case study research that in my case goes by the name of Psychobiography. According to some, that's not real psychology because it is not experimental. Well, someone should have clued in Piaget, Erikson, Maslow, Freud, Jung, James, Skinner (who also used single-subject design), RD Laing, Henry Murray, Silvan Tomkins, etc., etc., ALL OF WHOM DID CASE STUDY AND ALL OF WHOM ARE REGARDED AS SEMINAL FIGURES IN THE FIELD. I don't know, it's a strangely territorial, neurotic mind-set that 1) believes itself in possession of true knowledge and 2) feels a need to tell lowly others that what they are up to is BS.
I say this: we psychologists know a lot less than we think we do, and at this very early stage of the game in the study of mind, all promising approaches and questions are welcome. The more the merrier. Does anything go? No. But is there one real psychology? Double no.
Published on Psychology Today (http://www.psychologytoday.com)
Friday, August 14, 2009
Artistic tendencies linked to 'schizophrenia gene'
Salvador Dali's mental disorders were also the key to his creativity.
We're all familiar with the stereotype of the tortured artist. Salvador Dali's various disorders and Sylvia Plath's depression spring to mind. Now new research seems to show why: a genetic mutation linked to psychosis and schizophrenia also influences creativity.
The finding could help to explain why mutations that increase a person's risk of developing mental illnesses such as schizophrenia and bipolar disorder have been preserved, even preferred, during human evolution, says Szabolcs Kéri, a researcher at Semmelweis University in Budapest, Hungary, who carried out the study.
Kéri examined a gene involved in brain development called neuregulin 1, which previous studies have linked to a slightly increased risk of schizophrenia. Moreover, a single DNA letter mutation that affects how much of the neuregulin 1 protein is made in the brain has been linked to psychosis, poor memory and sensitivity to criticism.
About 50 per cent of healthy Europeans have one copy of this mutation, while 15 per cent possess two copies.
Creative thinking
To determine how these variations affect creativity, Kéri genotyped 200 adults who responded to adverts seeking creative and accomplished volunteers. He also gave the volunteers two tests of creative thinking, and devised an objective score of their creative achievements, such as filing a patent or writing a book.
People with two copies of the neuregulin 1 mutation – about 12 per cent of the study participants – tended to score notably higher on these measures of creativity, compared with other volunteers with one or no copy of the mutation. Those with one copy were also judged to be more creative, on average, than volunteers without the mutation. All told, the mutation explained between 3 and 8 per cent of the differences in creativity, Kéri says.
Exactly how neuregulin 1 affects creativity isn't clear. Volunteers with two copies of the mutation were no more likely than others to possess so-called schizotypal traits, such as paranoia, odd speech patterns and inappropriate emotions. This would suggest that the mutation's connection to mental illness does not entirely explain its link to creativity, Kéri says.
Dampening the brain
Rather, Kéri speculates that the mutation dampens a brain region that reins in mood and behaviour, called the prefrontal cortex. This change could unleash creative potential in some people and psychotic delusions in others.
Intelligence could be one factor that determines whether the neuregulin 1 mutation boosts creativity or contributes to psychosis. Kéri's volunteers tended to be smarter than average. In contrast, another study of families with a history of schizophrenia found that the same mutation was associated with lower intelligence and psychotic symptoms.
"My clinical experience is that high-IQ people with psychosis have more intellectual capacity to deal with psychotic experiences," Kéri says. "It's not enough to experience those feelings, you have to communicate them."
Intelligence's influence
Jeremy Hall, a geneticist at the University of Edinburgh in the UK who uncovered the link between the neuregulin 1 mutation and psychosis, agrees that the gene's effects are probably influenced by cognitive factors such as intelligence.
This doesn't mean that psychosis and creativity are the same, though. "There's always been this slightly romantic idea that madness and genius are the flipside to the same coin. How much is that true? Madness is often madness and doesn't have as much genetic association with intelligence," Hall says.
Bernard Crespi, a behavioural geneticist at Simon Fraser University in Burnaby, British Columbia, Canada, is holding his applause for now. "This is a very interesting study with remarkably strong results, though it must be replicated in an independent population before the results can be accepted with confidence," he says.
http://www.newscientist.com/article/dn17474-artistic-tendencies-linked-to-schizophrenia-gene.html
Sunday, August 9, 2009
Providing Psychotherapy for the Poor
Innovative counseling programs in developing countries are repairing the psyches of civil war survivors and depressed mothers alike
By Mason Inman
It had been four years since 13-year-old Mohamed Abdul escaped civil war in Somalia, but he still had nightmares and flashbacks. When he was nine years old, a crowd fleeing a street shooting trampled him, putting him in the hospital for two weeks. A month later he saw the aftermath of an apparent massacre: about 20 corpses floating in the ocean. Soon after, militiamen shot him in the leg, knocked him unconscious, then raped his best friend, a girl named Halimo.
Recovering in the hospital, Abdul (not his real name) was overwhelmed by fear—and guilt, for not having helped Halimo. He felt unprovoked fury: he mistook people he knew well for the rapist and threatened to kill them. A few months later Abdul fled his homeland and landed in the Nakivale refugee settlement in Uganda. “I felt as if there were two personalities living inside me,” he said at the time. “One was smart and kind and normal; the other one was crazy and violent.”
Abdul had post-traumatic stress disorder (PTSD), an ailment characterized by fear, hyperarousal and vivid replays of the traumatic event. Fortunately, this refugee camp had an extraordinary resource. Psychologist Frank Neuner of Bielefeld University in Germany was offering “narrative exposure therapy” to its 14,400 Africans, mostly Rwandans. The approach coaxes trauma survivors to assimilate their troubling memories into their life stories and thereby regain some emotional balance.
After four 60- to 90-minute therapy sessions, Abdul’s flashbacks and nightmares disappeared; he was still easily startled but no longer felt out of control. His doctors deemed him “cured.”
Researchers and aid workers have historically overlooked mental health in developing countries, focusing instead on issues such as malnutrition, disease and high infant mortality, but that is changing. “What’s changed in the past 10 years is the realization that mental health is not separate from general health,” explains child psychiatrist Atif Rahman of the University of Liverpool in England.
Recent psychotherapy trials have achieved remarkable success in improving the lives of war survivors such as Abdul, poor mothers with postpartum depression and others victimized by the stresses of extreme poverty. The keys to a workable program for the impoverished include training ordinary citizens to be counselors and, in some cases, disguising the remedy as something other than a fix for emotional troubles.
Treating Trauma
Although many people think of mental illness as a plague of fast-paced modern life, some psychiatric ailments are actually more prevalent in the developing world, according to the World Health Organization. Of the several dozen wars and armed conflicts around the globe, nearly all are in developing countries, and this violence is leading to PTSD, which hinders recovery after the conflicts subside. Across South Asia, new mothers suffer from depression more frequently than they do in richer countries, according to a 2003 report by Rahman and his colleagues.
People in underprivileged nations also experience more severe economic stresses. “This pileup of adversities is associated with low mental health,” says sociologist Ronald Kessler of Harvard Medical School. For individuals living on the edge of survival, the economic ramifications of a mental illness can be especially devastating. When someone has a major mental illness, “you’ve lost their labor and their input,” notes mental health researcher Paul Bolton of Johns Hopkins University.
To make up for the deficit of mental health care professionals in the developing world, Neuner and his team recruited refugees from the camp. Anybody who could read, write and be empathetic was a candidate. Because nearly one third of the Rwandan refugees and half of the Somalis suffered from PTSD, many of the would-be counselors needed to be treated first.
For a PTSD sufferer, distressing experiences are divorced from time or place and out of sync with the person’s life story. “Once these memories are activated, usually the interpretation of the brain of what’s happening is that there’s a danger right now, because the brain is not really aware that it’s just a memory,” Neuner points out. “We want to nail down this vivid emotional representation. We want to bring it where it belongs and connect it with your life history.”
Accordingly, refugee therapists spent six weeks learning to help patients shape their lives into a coherent story, incorporating major traumas into the narrative. The strategy worked: 70 percent of those who received the therapy no longer displayed significant PTSD symptoms at a nine-month follow-up assessment, compared with a 37 percent recovery rate among a group of untreated refugees.
Empowering Mothers
In Rawalpindi, a largely rural district of Pakistan, nearly 30 percent of new mothers become depressed—about twice the rate in the developed world. In addition to its toll on mothers, postpartum depression can harm babies’ emotional and, in South Asia, physical development. Most of these women consider their symptoms the fate of poor folk or believe that they are caused by tawiz, or black magic. Many are anxious about discussing their problems and being labeled as ill. What is more, Rawalpindi has only three psychiatrists for its more than 3.5 million residents.
To get around such stigmas and barriers, Rahman and his colleagues recruited government employees known as lady health workers to integrate mental health therapy into their home visits to mothers. Ordinarily, these workers visit homes 16 times a year to give advice on infant nutrition and child rearing.
A two-day course enabled these health workers to add mental health to their curriculum. Rahman’s approach is based on cognitive-behavior therapy, in which a counselor tries to correct distorted and negative ways of thinking either by discussing them openly or by suggesting more adaptive behaviors. If a mother said she could not afford to feed her baby healthful food, for example, the lady health worker would question that assumption and suggest incremental improvements to the baby’s diet. A year after giving birth, mothers given this psychologically sensitive advice showed half the rate of major depression of those who received traditional health visits. The strategy worked by empowering the women to solve problems, Rahman believes.
More efforts to bring psychiatry to the poor are under way, such as a trial in Pakistan in which community health workers help to ensure that schizophrenics take their medications. But the biggest hurdle is scaling up these treatments to meet the great need.
Note: This article was originally printed with the title, "Psychotherapy for the Poor".
A Wiring Diagram in the Brain for Depression
Researchers pinpoint a crucial crossroads for brain communication and a target for a radical depression treatment
By David Dobbs
April 6, 2009
Depression’s Wiring Diagram
When Helen Mayberg started curing depression by stimulating a previously unknown neural junction box in a brain area called Brodmann’s area 25—discovered through 20 years of dogged research—people asked her where she was going to look next. Her reaction was, “What do you mean, Where am I going to look next? I’m going to look more closely here!”
Her closer look is now paying off. In a series of papers last year, Mayberg and several of her colleagues used diffusion tensor imaging (DTI) to reveal the neural circuitry of depression at new levels of precision. This MRI technique illuminates the connective tracts in the brain. For depression, the resulting map may allow a better understanding of what drives the disorder—and much better targeting and patient selection for treatments such as deep-brain stimulation (DBS) that seek to tweak this circuitry.
In the early 2000s Mayberg and Wayne C. Drevets, then at Washington University Medical School, separately established that area 25, which appeared to connect several brain regions involved in mood, thought and emotion, is hyperactive in depressed patients. The area’s significance was confirmed when Mayberg and her colleagues at the University of Toronto—neurosurgeon Andres Lozano and psychiatrist Sidney Kennedy—used DBS devices to bring relief to 12 out of 20 intractably depressed patients [see “Turning Off Depression,” by David Dobbs; Scientific American Mind, August/September 2006]. “That confirmed my hypothesis that area 25 is an important crossroads,” Mayberg says. “But exactly what circuits were we affecting?”
The recent papers take her much closer to answering this question. Working with fellow imaging experts Heidi Johansen-Berg and Tim Behrens of the University of Oxford and others, Mayberg used DTI to produce detailed images of area 25’s “tractography,” the layout of the white matter tracts that connect disparate brain regions. They identified five connective tracts that run through this pea-size region, carrying neural traffic among five vital areas: the amygdala, a deep-brain area that moderates fear and other emotions; the orbitofrontal and medial frontal cortices, two poorly understood areas that appear to be significant in expectation, reward processing, error assessment, learning and decision making; the hippocampus, vital to memory; and the hypothalamus, which helps to regulate stress and arousal.
The refined imaging of these tracts does more than just confirm Mayberg’s previous work identifying area 25 as a junction box. It also gives her a map that provides diagnostic and targeting information for DBS treatments of the area. As she expected, the locations of those tracts vary among individuals. “And this variation,” Mayberg says, “along with variations in the nature of different patients’ depression, probably explains why some patients respond better than others. Because the location varies, we’re not hitting all five tracts the same way in every patient.”
In a new study of 20 more patients she began at Emory University, Mayberg plans to analyze the tractography and electrode placement to see which of the tracts seems to be most essential to the treatment’s success. That investigation may reveal yet more about the nature of depression—and it might help Mayberg identify which patients will benefit from surgery so she can spare those it will not help.
Meanwhile a kind of DBS gold rush has developed as other scientists slide neuromodulators into different brain areas to try to treat depression, obsessive-compulsive disorder, eating disorders, Tourette’s syndrome, headaches and chronic pain [see “Sparking Recovery with Brain ‘Pacemakers,’ ” by Morton L. Kringelbach and Tipu Z. Aziz; Scientific American Mind, December 2008/January 2009].
Although DBS treatment for depression might receive FDA approval in as soon as four or five years, Mayberg does not think it will become common. She is following closely the work of researchers who are seeking ways to modulate tightly defined brain areas such as area 25 with tools less intrusive than electrodes. Stanford University bioengineer Karl Deisseroth, for instance, is having luck stimulating targeted brain areas in mice with proteins called opsins (cousins of the light-sensitive proteins in retinal cells that provide night vision) that can be placed noninvasively and then stimulated with light via a very thin fiber-optic cable rather than electricity from a bulky electrode. He and others hope to develop these or similar tools to create less invasive “switches” that modulate brain areas more cleanly than electrodes do. “There may come a time,” Mayberg says, “when we can work these circuits some other way.”
Note: This article was originally printed with the title, "Insights into the Brain's Circuitry".
"A Wiring Diagram for Depression," Scientific American Mind, April/May 2009, by David Dobbs.
By David Dobbs
April 6, 2009
Depression’s Wiring Diagram
When Helen Mayberg started curing depression by stimulating a previously unknown neural junction box in a brain area called Brodmann’s area 25—discovered through 20 years of dogged research—people asked her where she was going to look next. Her reaction was, “What do you mean, Where am I going to look next? I’m going to look more closely here!”
Her closer look is now paying off. In a series of papers last year, Mayberg and several of her colleagues used diffusion tensor imaging (DTI) to reveal the neural circuitry of depression at new levels of precision. This MRI technique illuminates the connective tracts in the brain. For depression, the resulting map may allow a better understanding of what drives the disorder—and much better targeting and patient selection for treatments such as deep-brain stimulation (DBS) that seek to tweak this circuitry.
In the early 2000s Mayberg and Wayne C. Drevets, then at Washington University Medical School, separately established that area 25, which appeared to connect several brain regions involved in mood, thought and emotion, is hyperactive in depressed patients. The area’s significance was confirmed when Mayberg and her colleagues at the University of Toronto—neurosurgeon Andres Lozano and psychiatrist Sidney Kennedy—used DBS devices to bring relief to 12 out of 20 intractably depressed patients [see “Turning Off Depression,” by David Dobbs; Scientific American Mind, August/September 2006]. “That confirmed my hypothesis that area 25 is an important crossroads,” Mayberg says. “But exactly what circuits were we affecting?”
The recent papers take her much closer to answering this question. Working with fellow imaging experts Heidi Johansen-Berg and Tim Behrens of the University of Oxford and others, Mayberg used DTI to produce detailed images of area 25’s “tractography,” the layout of the white matter tracts that connect disparate brain regions. They identified five connective tracts that run through this pea-size region, carrying neural traffic among five vital areas: the amygdala, a deep-brain area that moderates fear and other emotions; the orbitofrontal and medial frontal cortices, two poorly understood areas that appear to be significant in expectation, reward processing, error assessment, learning and decision making; the hippocampus, vital to memory; and the hypothalamus, which helps to regulate stress and arousal.
The refined imaging of these tracts does more than just confirm Mayberg’s previous work identifying area 25 as a junction box. It also gives her a map that provides diagnostic and targeting information for DBS treatments of the area. As she expected, the locations of those tracts vary among individuals. “And this variation,” Mayberg says, “along with variations in the nature of different patients’ depression, probably explains why some patients respond better than others. Because the location varies, we’re not hitting all five tracts the same way in every patient.”
In a new study of 20 more patients, begun at Emory University, Mayberg plans to analyze the tractography and electrode placement to see which of the tracts seems to be most essential to the treatment’s success. That investigation may reveal yet more about the nature of depression—and it might help Mayberg identify which patients will benefit from surgery so she can spare those it will not help.
Meanwhile a kind of DBS gold rush has developed as other scientists slide neuromodulators into different brain areas to try to treat depression, obsessive-compulsive disorder, eating disorders, Tourette’s syndrome, headaches and chronic pain [see “Sparking Recovery with Brain ‘Pacemakers,’ ” by Morton L. Kringelbach and Tipu Z. Aziz; Scientific American Mind, December 2008/January 2009].
Although DBS treatment for depression might receive FDA approval in as little as four or five years, Mayberg does not think it will become common. She is following closely the work of researchers who are seeking ways to modulate tightly defined brain areas such as area 25 with tools less intrusive than electrodes. Stanford University bioengineer Karl Deisseroth, for instance, is having luck stimulating targeted brain areas in mice with proteins called opsins (cousins of the light-sensing proteins in retinal cells) that can be placed noninvasively and then stimulated with light via a very thin fiber-optic cable rather than electricity from a bulky electrode. He and others hope to develop these or similar tools to create less invasive “switches” that modulate brain areas more cleanly than electrodes do. “There may come a time,” Mayberg says, “when we can work these circuits some other way.”
Note: This article was originally printed with the title, "Insights into the Brain's Circuitry".
"A Wiring Diagram for Depression," Scientific American Mind, April/May 2009, by David Dobbs.
Friday, August 7, 2009
Do ADHD Drugs Take a Toll on the Brain?
Research hints that hidden risks might accompany long-term use of the medicines that treat attention-deficit hyperactivity disorder
By Edmund S. Higgins
A few years ago a single mother who had recently moved to town came to my office asking me to prescribe the stimulant drug Adderall for her sixth-grade son. The boy had been taking the medication for several years, and his mother had liked its effects: it made homework time easier and improved her son’s grades.
At the time of this visit, the boy was off the medication, and I conducted a series of cognitive and behavioral tests on him. He performed wonderfully. I also noticed that off the medication he was friendly and playful.
On a previous casual encounter, when the boy had been on Adderall, he had seemed reserved and quiet. His mother acknowledged this was a side effect of the Adderall. I told her that I did not think her son had attention-deficit hyperactivity disorder (ADHD) and that he did not need medication. That was the last time I saw her.
Attention-deficit hyperactivity disorder afflicts about 5 percent of U.S. children—twice as many boys as girls—age six to 17, according to a recent survey conducted by the Centers for Disease Control and Prevention. As its name implies, people with the condition have trouble focusing and often are hyperactive or impulsive. An estimated 9 percent of boys and 4 percent of girls in the U.S. are taking stimulant medications as part of their therapy for ADHD, the CDC reported in 2005. The majority of patients take methylphenidate (Ritalin, Concerta), whereas most of the rest are prescribed an amphetamine such as Adderall.
Although it sounds counterintuitive to give stimulants to a person who is hyperactive, these drugs are thought to boost activity in the parts of the brain responsible for attention and self-control. Indeed, the pills can improve attention, concentration and productivity and also suppress impulsive behavior, producing significant improvements in some people’s lives. Severe inattention and impulsivity put individuals at risk for substance abuse, unemployment, crime and car accidents. Thus, appropriate medication might keep a person out of prison, away from addictive drugs or in a job.
Over the past 15 years, however, doctors have been pinning the ADHD label on—and prescribing stimulants for—a rapidly rising number of patients, including those with moderate to mild inattention, some of whom, like the sixth grader I saw, have a normal ability to focus. This trend may be fueled in part by a relaxation of official diagnostic criteria for the disorder, combined with a lower tolerance in society for mild behavioral or cognitive problems.
In addition, patients are no longer just taking the medicines for a few years during grade school but are encouraged to stay on them into adulthood. In 2008 two new stimulants—Vyvanse (amphetamine) and Concerta—received U.S. Food and Drug Administration indications for treating adults, and pharmaceutical firms are pushing awareness of the adult forms of the disorder. What is more, many people who have no cognitive deficits are opting to take these drugs to boost their academic performance. A number of my patients—doctors, lawyers and other professionals—have asked me for stimulants in hopes of boosting their productivity. As a result of these developments, prescriptions for methylphenidate and amphetamine rose by almost 12 percent a year between 2000 and 2005, according to a 2007 study.
With the expanded and extended use of stimulants comes mounting concern that the drugs might take a toll on the brain over the long run. Indeed, a smattering of recent studies, most of them involving animals, hint that stimulants could alter the structure and function of the brain in ways that may depress mood, boost anxiety and, contrary to their short-term effects, lead to cognitive deficits. Human studies already indicate the medications can adversely affect areas of the brain that govern growth in children, and some researchers worry that additional harms have yet to be unearthed.
Medicine for the Mind
To appreciate why stimulants could have negative effects over time, it helps to first understand what they do in the brain. One hallmark of ADHD is an underactive frontal cortex, a brain region that lies just behind the forehead and controls such “executive” functions as decision making, predicting future events, and suppressing emotions and urges. This area may, in some cases, be smaller than average in ADHD patients, compromising their executive abilities. Frontal cortex function depends greatly on a signaling chemical, or neurotransmitter, called dopamine, which is released in this structure by neurons that originate in deeper brain structures. Less dopamine in the prefrontal cortex is linked, for example, with cognitive difficulty in old age. Another set of dopamine-releasing neurons extends to the nucleus accumbens, a critical mediator of motivation, pleasure and reward whose function may also be impaired in ADHD.
Stimulants enhance communication in these dopamine-controlled brain circuits by binding to so-called dopamine transporters—the proteins on nerve endings that suck up excess dopamine—thereby deactivating them. As a result, dopamine accumulates outside the neurons, and the additional neurotransmitter is thought to improve the operation of neuronal circuits critical for motivation and impulse control.
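That buildup can be sketched with a toy kinetic model: dopamine enters the synapse at a steady rate and is cleared in proportion to the number of active transporters, so blocking a fraction of transporters raises the steady-state dopamine level. This is an illustrative sketch with arbitrary parameter values, not a physiological simulation.

```python
# Toy model of synaptic dopamine: steady release, transporter-mediated clearance.
# dC/dt = release - k * (1 - blocked) * C. Parameter values are arbitrary.

def steady_state_dopamine(release_rate, clearance_rate, blocked_fraction):
    """Closed-form steady state: C* = R / (k * (1 - b))."""
    return release_rate / (clearance_rate * (1.0 - blocked_fraction))

def simulate(release_rate, clearance_rate, blocked_fraction, dt=0.01, steps=5000):
    """Euler integration of the same equation, starting from zero dopamine."""
    c = 0.0
    k = clearance_rate * (1.0 - blocked_fraction)
    for _ in range(steps):
        c += (release_rate - k * c) * dt
    return c

baseline = steady_state_dopamine(1.0, 0.5, 0.0)   # no transporters blocked
blocked = steady_state_dopamine(1.0, 0.5, 0.6)    # 60% of transporters blocked
print(f"baseline steady state: {baseline:.2f}")   # prints 2.00
print(f"with 60% blockade:     {blocked:.2f}")    # prints 5.00
```

In this sketch, blocking 60 percent of transporters multiplies steady-state dopamine by 2.5, which captures the qualitative point: less reuptake, more dopamine lingering outside the neurons.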
Not only can methylphenidate and amphetamine ameliorate a mental deficit, they also can enhance cognitive performance. In studies dating back to the 1970s, researchers have shown that children who do not have ADHD also become more attentive—and often calmer—after taking stimulants. In fact, the drugs can lead to higher test scores in students of average and above-average intellectual ability.
Since the 1950s, when doctors first started prescribing stimulants to treat behavior problems, millions of people have taken them without obvious incident. A number of studies have even exonerated them from causing possible adverse effects. For example, researchers have failed to find differences between stimulant-treated children and those not on meds in the larger-scale growth of the brain. In January 2009 child psychiatrist Philip Shaw of the National Institute of Mental Health and his colleagues used MRI scans to measure the change in the thickness of the cerebral cortex (the outer covering of the brain) of 43 youths between the ages of 12 and 16 who had ADHD.
The researchers found no evidence that stimulants slowed cortical growth. In fact, only the unmedicated adolescents showed more thinning of the cerebrum than was typical for their age, hinting that the drugs might facilitate normal cortical development in kids with ADHD.
Altering Mood
Despite such positive reports, traces of a sinister side to stimulants have also surfaced. In February 2007 the FDA issued warnings about side effects such as growth stunting and psychosis, among other mental disorders. Indeed, the vast majority of adults with ADHD experience at least one additional psychiatric illness—often an anxiety disorder or drug addiction—in their lifetime. Having ADHD is itself a risk factor for other mental health problems, but the possibility also exists that stimulant treatment during childhood might contribute to these high rates of accompanying diagnoses.
After all, stimulants activate the brain’s reward pathways, which are part of the neural circuitry that controls mood under normal conditions. And at least three studies using animals hint that exposure to methylphenidate during childhood may alter mood in the long run, perhaps raising the risk of depression and anxiety in adulthood.
In an experiment published in 2003 psychiatrist Eric Nestler of the University of Texas Southwestern Medical Center and his colleagues injected juvenile rats twice a day with a low dose of methylphenidate similar to that prescribed for children with ADHD. When the rats became adults, the scientists observed the rodents’ responses to various emotional stimuli. The rodents that had received methylphenidate were significantly less responsive to natural rewards such as sugar, sex and engaging, novel environments than were untreated rats, suggesting that the drug-exposed animals find such stimuli less pleasurable. In addition, the stimulants apparently made the rats more sensitive to stressful situations such as being forced to swim inside a large tube. Similarly, in the same year psychiatrist William Carlezon of Harvard Medical School and his colleagues reported that methylphenidate-treated preadolescent rats displayed a muted response to a cocaine reward as adults as well as unusual apathy in a forced-swim test, a sign of depression.
In 2008 psychopharmacologist Leandro F. Vendruscolo and his co-workers at Federal University of Santa Catarina in Brazil echoed these results using spontaneously hypertensive rats, which—like children with ADHD—sometimes show attention deficits, hyperactivity and motor impulsiveness. The researchers injected these young rats with methylphenidate for 16 days at doses approximating those used to treat ADHD in young people. Four weeks later, when the rats were young adults, those that had been exposed to methylphenidate were unusually anxious: they avoided traversing the central area of an open, novel space more so than did rats not exposed to methylphenidate. Adverse effects of this stimulant, the authors speculate, could contribute to the high rates of anxiety disorders among ADHD patients.
Copying Cocaine?
The long-term use of any drug that affects the brain’s reward circuitry also raises the specter of addiction. Methylphenidate has a chemical structure similar to that of cocaine and acts on the brain in a very similar way. Both cocaine and methamphetamine (also called “speed” or “meth”)—another highly addictive stimulant—block dopamine transporters just as ADHD drugs do [see “New Weapons against Cocaine Addiction,” by Peter Sergo; Scientific American Mind, April/May 2008]. In the case of the illicit drugs, the dopamine surge is so sudden that in addition to making a person unusually energetic and alert, it produces a “high.”
Recent experiments in animals have sounded the alarm that methylphenidate may alter the brain in ways similar to those of more powerfully addictive stimulants such as cocaine.
In February 2009 neuroscientists Yong Kim and Paul Greengard, along with their colleagues at the Rockefeller University, reported cocainelike structural and chemical alterations in the brains of mice given methylphenidate. The researchers injected the mice with either methylphenidate or cocaine daily for two weeks. Both treatments increased the density of tiny extensions called spines at the ends of neurons bearing dopamine receptors in the rodent nucleus accumbens. Compared with cocaine, methylphenidate had a somewhat more localized influence; it also had more power over longer spines and less effect on shorter ones. Otherwise, the drugs’ effects were strikingly similar.
Furthermore, the scientists found that methylphenidate boosted the amount of a protein called ΔFosB, which turns genes on and off, even more than cocaine did. That result could be a chemical warning of future problems: excess ΔFosB heightens an animal’s sensitivity to the rewarding effects of cocaine and makes the animal more likely to ingest the drug. Many former cocaine addicts struggle with depression, anxiety and cognitive problems. Researchers have found that cocaine has remodeled the brains of such ex-users. Similar problems—principally, perhaps, difficulty experiencing joy and excitement in life—could occur after many years of Ritalin or Adderall use.
Amphetamine and methylphenidate can also be addictive if abused by, say, crushing or snorting the pills. In a classic study published in 1995 research psychiatrist Nora Volkow, then at Stony Brook University, and her colleagues showed that injections of methylphenidate produced a cocainelike high in volunteers. More than seven million people in the U.S. have abused methylphenidate, and as many as 750,000 teenagers and young adults show signs of addiction, according to a 2006 report.
Typical oral doses of ADHD meds rarely produce such euphoria and are not usually addicting. Furthermore, the evidence to date, including two 2008 studies from the National Institute on Drug Abuse, indicates that children treated with stimulants early in life are not more likely than other children to become addicted to drugs as adults. In fact, the risk for severe cases of ADHD may run in the opposite direction. (A low addiction risk also jibes with Carlezon’s earlier findings, which indicated that methylphenidate use in early life mutes adult rats’ response to cocaine.)
Corrupting Cognition
Amphetamines such as Adderall could alter the mind in other ways. A team led by psychologist Stacy A. Castner of the Yale University School of Medicine has documented long-lasting behavioral oddities, such as hallucinations, and cognitive impairment in rhesus monkeys that received escalating injected doses of amphetamine over either six or 12 weeks.
Compared with monkeys given inactive saline, the drug-treated monkeys displayed deficits in working memory—the short-term buffer that allows us to hold several items in mind—which persisted for at least three years after exposure to the drug. The researchers connected these cognitive problems to a significantly lower level of dopamine activity in the frontal cortex of the drug-treated monkeys as compared with that of the monkeys not given amphetamine.
Underlying such cognitive and behavioral effects may be subtle structural changes too small to show up on brain scans. In a 1997 study psychologists Terry E. Robinson and Bryan Kolb of the University of Michigan at Ann Arbor found that high injected doses of amphetamine in rats cause the major output neurons of the nucleus accumbens to sprout longer branches, or dendrites, as well as additional spines on those dendrites. A decade later Castner’s team linked lower doses of amphetamine to subtle atrophy of neurons in the prefrontal cortex of monkeys.
A report published in 2005 by neurologist George A. Ricaurte and his team at the Johns Hopkins University School of Medicine is even more damning to ADHD meds because the researchers used realistic doses and drug delivery by mouth instead of by injection. Ricaurte’s group trained baboons and squirrel monkeys to self-administer an oral formulation of amphetamine similar to Adderall: the animals drank an amphetamine-laced orange cocktail twice a day for four weeks, mimicking the dosing schedule in humans. Two to four weeks later the researchers detected evidence of amphetamine-induced brain damage, encountering lower levels of dopamine and fewer dopamine transporters on nerve endings in the striatum—a trio of brain regions that includes the nucleus accumbens—in amphetamine-treated primates than in untreated animals. The authors believe these observations reflect a drug-related loss of dopamine-releasing nerve fibers that reach the striatum from the brain stem.
One possible consequence of a loss of dopamine and its associated molecules is Parkinson’s disease, a movement disorder that can also lead to cognitive deficits. A study in humans published in 2006 hints at a link between Parkinson’s and a prolonged exposure to amphetamine in any form (not just that prescribed for ADHD). Before Parkinson’s symptoms such as tremors and muscle rigidity appear, however, dopamine’s function in the brain must decline by 80 to 90 percent, or by about twice as much as what Ricaurte and his colleagues saw in baboons that were drinking a more moderate dose of the drug. And some studies have found no connection between stimulant use and Parkinson’s.
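The comparison in that paragraph can be made explicit with a line of arithmetic. The 80 to 90 percent threshold is from the text; halving it is simply a reading of the article's "about twice as much" phrasing, not a figure reported by the study itself.

```python
# Back-of-envelope check of the Parkinson's comparison. The threshold range
# comes from the text above; the "implied" range is just half of it, per the
# article's "about twice as much" wording -- not a measured result.
parkinsons_threshold = (0.80, 0.90)   # dopamine decline needed before symptoms appear
implied_observed = tuple(t / 2 for t in parkinsons_threshold)
print(f"implied decline in the treated baboons: "
      f"{implied_observed[0]:.0%}-{implied_observed[1]:.0%}")
# prints: implied decline in the treated baboons: 40%-45%
```

In other words, the drug-treated animals appear to have been left with roughly half the dopamine loss needed to produce Parkinson's symptoms, which is why the finding is worrying but not, by itself, a prediction of disease.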
Stimulants do seem to stunt growth in children. Otherwise, however, studies in humans have largely failed to demonstrate any clear indications of harm from taking ADHD medications as prescribed. Whether the drugs alter the human brain in the same way they alter that of certain animals is unknown, because so far little clinical data exist on their long-term neurological effects. Even when the dosing is similar or the animals have something resembling ADHD, different species’ brains may have varying sensitivities to stimulant medications.
Nevertheless, in light of the emerging evidence, many doctors and researchers are recommending a more cautious approach to the medical use of stimulants. Some are urging the adoption of strict diagnostic criteria for ADHD and a policy restricting prescriptions for individuals who fit those criteria. Others are advocating behavior modification—which can be as effective as stimulants over the long run—as a first-line approach to combating the disorder. Certain types of mental exercises may also ease ADHD symptoms [see “Train Your Brain,” by Ulrich Kraft; Scientific American Mind, February/March 2006]. For patients who require stimulants, some neurologists and psychiatrists have also suggested using the lowest dose needed or monitoring the blood levels of these drugs as a way of keeping concentrations below those shown to be problematic in other mammals. Without these or similar measures, large numbers of people who regularly take stimulants may ultimately struggle with a new set of problems spawned by the treatments themselves.
Growing Problems
So far the best-documented problem associated with the stimulants used to treat attention-deficit hyperactivity disorder (ADHD) concerns growth. Human growth is controlled at least in part through the hypothalamus and pituitary at the base of the brain. Studies in mice hint that stimulants may increase levels of the neurotransmitter dopamine in the hypothalamus as well as in the striatum (a three-part brain structure that includes part of its reward circuitry) and that the excess dopamine may reach the pituitary by way of the bloodstream and act to retard growth.
Recent work strongly indicates that the drugs can stunt growth in children. In a 2007 analysis of a National Institute of Mental Health study of ADHD treatments involving 579 children, research psychiatrist Nora Volkow, who directs the National Institute on Drug Abuse, and her colleagues compared growth rates of unmedicated seven- to 10-year-olds over three years with those of kids who took stimulants throughout that period. Relative to the unmedicated youths, the drug-treated youths showed a decrease in growth rate, gaining, on average, two fewer centimeters in height and 2.7 kilograms less in weight. Although this growth-stunting effect came to a halt by the third year, the kids on the meds never caught up to their counterparts.
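Spread over the three years of the study, those totals work out as follows. The totals are the figures reported above; the per-year division is a back-of-envelope step, not a number from the study.

```python
# Per-year averages implied by the three-year growth deficits quoted above.
# The totals are from the study as reported; the division is illustrative.
height_deficit_cm = 2.0    # less height gained over three years, on average
weight_deficit_kg = 2.7    # less weight gained over three years, on average
years = 3
height_per_year = height_deficit_cm / years
weight_per_year = weight_deficit_kg / years
print(f"~{height_per_year:.2f} cm and ~{weight_per_year:.1f} kg less gained per year")
# prints: ~0.67 cm and ~0.9 kg less gained per year
```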
A few years ago a single mother who had recently moved to town came to my office asking me to prescribe the stimulant drug Adderall for her sixth-grade son. The boy had been taking the medication for several years, and his mother had liked its effects: it made homework time easier and improved her son’s grades.
At the time of this visit, the boy was off the medication, and I conducted a series of cognitive and behavioral tests on him. He performed wonderfully. I also noticed that off the medication he was friendly and playful.
On a previous casual encounter, when the boy had been on Adderall, he had seemed reserved and quiet. His mother acknowledged this was a side effect of the Adderall. I told her that I did not think her son had attention-deficit hyperactivity disorder (ADHD) and that he did not need medication. That was the last time I saw her.
Attention-deficit hyperactivity disorder afflicts about 5 percent of U.S. children—twice as many boys as girls—age six to 17, according to a recent survey conducted by the Centers for Disease Control and Prevention. As its name implies, people with the condition have trouble focusing and often are hyperactive or impulsive. An estimated 9 percent of boys and 4 percent of girls in the U.S. are taking stimulant medications as part of their therapy for ADHD, the CDC reported in 2005. The majority of patients take methylphenidate (Ritalin, Concerta), whereas most of the rest are prescribed an amphetamine such as Adderall.
Although it sounds counterintuitive to give stimulants to a person who is hyperactive, these drugs are thought to boost activity in the parts of the brain responsible for attention and self-control. Indeed, the pills can improve attention, concentration and productivity and also suppress impulsive behavior, producing significant improvements in some people’s lives. Severe inattention and impulsivity put individuals at risk for substance abuse, unemployment, crime and car accidents. Thus, appropriate medication might keep a person out of prison, away from addictive drugs or in a job.
Over the past 15 years, however, doctors have been pinning the ADHD label on—and prescribing stimulants for—a rapidly rising number of patients, including those with moderate to mild inattention, some of whom, like the sixth grader I saw, have a normal ability to focus. This trend may be fueled in part by a relaxation of official diagnostic criteria for the disorder, combined with a lower tolerance in society for mild behavioral or cognitive problems.
In addition, patients are no longer just taking the medicines for a few years during grade school but are encouraged to stay on them into adulthood. In 2008 two new stimulants—Vyvanse (amphetamine) and Concerta—received U.S. Food and Drug Administration indications for treating adults, and pharmaceutical firms are pushing awareness of the adult forms of the disorder. What is more, many people who have no cognitive deficits are opting to take these drugs to boost their academic performance. A number of my patients—doctors, lawyers and other professionals—have asked me for stimulants in hopes of boosting their productivity. As a result of these developments, prescriptions for methylphenidate and amphetamine rose by almost 12 percent a year between 2000 and 2005, according to a 2007 study.
With the expanded and extended use of stimulants comes mounting concern that the drugs might take a toll on the brain over the long run. Indeed, a smattering of recent studies, most of them involving animals, hint that stimulants could alter the structure and function of the brain in ways that may depress mood, boost anxiety and, contrary to their short-term effects, lead to cognitive deficits. Human studies already indicate the medications can adversely affect areas of the brain that govern growth in children, and some researchers worry that additional harms have yet to be unearthed.
Medicine for the MindTo appreciate why stimulants could have negative effects over time, it helps to first understand what they do in the brain. One hallmark of ADHD is an underactive frontal cortex, a brain region that lies just behind the forehead and controls such “executive” functions as decision making, predicting future events, and suppressing emotions and urges. This area may, in some cases, be smaller than average in ADHD patients, compromising their executive abilities. Frontal cortex function depends greatly on a signaling chemical, or neurotransmitter, called dopamine, which is released in this structure by neurons that originate in deeper brain structures. Less dopamine in the prefrontal cortex is linked, for example, with cognitive difficulty in old age. Another set of dopamine-releasing neurons extends to the nucleus accumbens, a critical mediator of motivation, pleasure and reward whose function may also be impaired in ADHD.
Stimulants enhance communication in these dopamine-controlled brain circuits by binding to so-called dopamine transporters—the proteins on nerve endings that suck up excess dopamine—thereby deactivating them. As a result, dopamine accumulates outside the neurons, and the additional neurotransmitter is thought to improve the operation of neuronal circuits critical for motivation and impulse control.
Not only can methylphenidate and amphetamine ameliorate a mental deficit, they also can enhance cognitive performance. In studies dating back to the 1970s, researchers have shown that normal children who do not have ADHD also become more attentive—and often calmer—after taking stimulants. In fact, the drugs can lead to higher test scores in students of average and above-average intellectual ability
Since the 1950s, when doctors first started prescribing stimulants to treat behavior problems, millions of people have taken them without obvious incident. A number of studies have even exonerated them from causing possible adverse effects. For example, researchers have failed to find differences between stimulant-treated children and those not on meds in the larger-scale growth of the brain. In January 2009 child psychiatrist Philip Shaw of the National Institute of Mental Health and his colleagues used MRI scans to measure the change in the thickness of the cerebral cortex (the outer covering of the brain) of 43 youths between the ages of 12 and 16 who had ADHD.
The researchers found no evidence that stimulants slowed cortical growth. In fact, only the unmedicated adolescents showed more thinning of the cerebrum than was typical for their age, hinting that the drugs might facilitate normal cortical development in kids with ADHD.
Altering MoodDespite such positive reports, traces of a sinister side to stimulants have also surfaced. In February 2007 the FDA issued warnings about side effects such as growth stunting and psychosis, among other mental disorders. Indeed, the vast majority of adults with ADHD experience at least one additional psychiatric illness—often an anxiety disorder or drug addiction—in their lifetime. Having ADHD is itself a risk factor for other mental health problems, but the possibility also exists that stimulant treatment during childhood might contribute to these high rates of accompanying diagnoses.
After all, stimulants activate the brain’s reward pathways, which are part of the neural circuitry that controls mood under normal conditions. And at least three studies using animals hint that exposure to methylphenidate during childhood may alter mood in the long run, perhaps raising the risk of depression and anxiety in adulthood.
In an experiment published in 2003 psychiatrist Eric Nestler of the University of Texas Southwestern Medical Center and his colleagues injected juvenile rats twice a day with a low dose of methylphenidate similar to that prescribed for children with ADHD. When the rats became adults, the scientists observed the rodents’ responses to various emotional stimuli. The rodents that had received methylphenidate were significantly less responsive to natural rewards such as sugar, sex, and fun, novel environments than were untreated rats, suggesting that the drug-exposed animals find such stimuli less pleasurable. In addition, the stimulants apparently made the rats more sensitive to stressful situations such as being forced to swim inside a large tube. Similarly, in the same year psychiatrist William Carlezon of Harvard Medical School and his colleagues reported that methylphenidate-treated preadolescent rats displayed a muted response to a cocaine reward as adults as well as unusual apathy in a forced-swim test, a sign of depression.
In 2008 psychopharmacologist Leandro F. Vendruscolo and his co-workers at Federal University of Santa Catarina in Brazil echoed these results using spontaneously hypertensive rats, which—like children with ADHD—sometimes show attention deficits, hyperactivity and motor impulsiveness. The researchers injected these young rats with methylphenidate for 16 days at doses approximating those used to treat ADHD in young people. Four weeks later, when the rats were young adults, those that had been exposed to methylphenidate were unusually anxious: they avoided traversing the central area of an open, novel space more so than did rats not exposed to methylphenidate. Adverse effects of this stimulant, the authors speculate, could contribute to the high rates of anxiety disorders among ADHD patients.
Copying Cocaine?
The long-term use of any drug that affects the brain’s reward circuitry also raises the specter of addiction. Methylphenidate has a chemical structure similar to that of cocaine and acts on the brain in a very similar way. Both cocaine and methamphetamine (also called “speed” or “meth”)—another highly addictive stimulant—block dopamine transporters just as ADHD drugs do [see “New Weapons against Cocaine Addiction,” by Peter Sergo; Scientific American Mind, April/May 2008]. In the case of the illicit drugs, the dopamine surge is so sudden that in addition to making a person unusually energetic and alert, it produces a “high.”
Recent experiments in animals have sounded the alarm that methylphenidate may alter the brain in ways similar to that of more powerfully addictive stimulants such as cocaine.
In February 2009 neuroscientists Yong Kim and Paul Greengard, along with their colleagues at the Rockefeller University, reported cocainelike structural and chemical alterations in the brains of mice given methylphenidate. The researchers injected the mice with either methylphenidate or cocaine daily for two weeks. Both treatments increased the density of tiny extensions called spines at the ends of neurons bearing dopamine receptors in the rodent nucleus accumbens. Compared with cocaine, methylphenidate had a somewhat more localized influence; it also had a greater effect on longer spines and a weaker effect on shorter ones. Otherwise, the drugs’ effects were strikingly similar.
Furthermore, the scientists found that methylphenidate boosted the amount of a protein called ΔFosB, which turns genes on and off, even more than cocaine did. That result could be a chemical warning of future problems: excess ΔFosB heightens an animal’s sensitivity to the rewarding effects of cocaine and makes the animal more likely to ingest the drug. Many former cocaine addicts struggle with depression, anxiety and cognitive problems. Researchers have found that cocaine has remodeled the brains of such ex-users. Similar problems—principally, perhaps, difficulty experiencing joy and excitement in life—could occur after many years of Ritalin or Adderall use.
Amphetamine and methylphenidate can also be addictive if abused by, say, crushing or snorting the pills. In a classic study published in 1995 research psychiatrist Nora Volkow, then at Stony Brook University, and her colleagues showed that injections of methylphenidate produced a cocainelike high in volunteers. More than seven million people in the U.S. have abused methylphenidate, and as many as 750,000 teenagers and young adults show signs of addiction, according to a 2006 report.
Typical oral doses of ADHD meds rarely produce such euphoria and are not usually addicting. Furthermore, the evidence to date, including two 2008 studies from the National Institute on Drug Abuse, indicates that children treated with stimulants early in life are not more likely than other children to become addicted to drugs as adults. In fact, for severe cases of ADHD the risk may run in the opposite direction: treatment may lower the odds of later addiction. (A low addiction risk also jibes with Carlezon’s earlier findings, which indicated that methylphenidate use in early life mutes adult rats’ response to cocaine.)
Corrupting Cognition
Amphetamines such as Adderall could alter the mind in other ways. A team led by psychologist Stacy A. Castner of the Yale University School of Medicine has documented long-lasting behavioral oddities, such as hallucinations, and cognitive impairment in rhesus monkeys that received escalating injected doses of amphetamine over either six or 12 weeks.
Compared with monkeys given inactive saline, the drug-treated monkeys displayed deficits in working memory—the short-term buffer that allows us to hold several items in mind—which persisted for at least three years after exposure to the drug. The researchers connected these cognitive problems to a significantly lower level of dopamine activity in the frontal cortex of the drug-treated monkeys as compared with that of the monkeys not given amphetamine.
Underlying such cognitive and behavioral effects may be subtle structural changes too small to show up on brain scans. In a 1997 study psychologists Terry E. Robinson and Bryan Kolb of the University of Michigan at Ann Arbor found that high injected doses of amphetamine in rats cause the major output neurons of the nucleus accumbens to sprout longer branches, or dendrites, as well as additional spines on those dendrites. A decade later Castner’s team linked lower doses of amphetamine to subtle atrophy of neurons in the prefrontal cortex of monkeys.
A report published in 2005 by neurologist George A. Ricaurte and his team at the Johns Hopkins University School of Medicine is even more damning to ADHD meds because the researchers used realistic doses and drug delivery by mouth instead of by injection. Ricaurte’s group trained baboons and squirrel monkeys to self-administer an oral formulation of amphetamine similar to Adderall: the animals drank an amphetamine-laced orange cocktail twice a day for four weeks, mimicking the dosing schedule in humans. Two to four weeks later the researchers detected evidence of amphetamine-induced brain damage, encountering lower levels of dopamine and fewer dopamine transporters on nerve endings in the striatum—a trio of brain regions that includes the nucleus accumbens—in amphetamine-treated primates than in untreated animals. The authors believe these observations reflect a drug-related loss of dopamine-releasing nerve fibers that reach the striatum from the brain stem.
One possible consequence of a loss of dopamine and its associated molecules is Parkinson’s disease, a movement disorder that can also lead to cognitive deficits. A study in humans published in 2006 hints at a link between Parkinson’s and a prolonged exposure to amphetamine in any form (not just that prescribed for ADHD). Before Parkinson’s symptoms such as tremors and muscle rigidity appear, however, dopamine’s function in the brain must decline by 80 to 90 percent, or by about twice as much as what Ricaurte and his colleagues saw in baboons that were drinking a more moderate dose of the drug. And some studies have found no connection between stimulant use and Parkinson’s.
Stimulants do seem to stunt growth in children. Otherwise, however, studies in humans have largely failed to demonstrate any clear indications of harm from taking ADHD medications as prescribed. Whether the drugs alter the human brain in the same way they alter that of certain animals is unknown, because so far little clinical data exist on their long-term neurological effects. Even when the dosing is similar or the animals have something resembling ADHD, different species’ brains may have varying sensitivities to stimulant medications.
Nevertheless, in light of the emerging evidence, many doctors and researchers are recommending a more cautious approach to the medical use of stimulants. Some are urging the adoption of strict diagnostic criteria for ADHD and a policy restricting prescriptions for individuals who fit those criteria. Others are advocating behavior modification—which can be as effective as stimulants over the long run—as a first-line approach to combating the disorder. Certain types of mental exercises may also ease ADHD symptoms [see “Train Your Brain,” by Ulrich Kraft; Scientific American Mind, February/March 2006]. For patients who require stimulants, some neurologists and psychiatrists have also suggested using the lowest dose needed or monitoring the blood levels of these drugs as a way of keeping concentrations below those shown to be problematic in other mammals. Without these or similar measures, large numbers of people who regularly take stimulants may ultimately struggle with a new set of problems spawned by the treatments themselves.
Growing Problems
So far the best-documented problem associated with the stimulants used to treat attention-deficit hyperactivity disorder (ADHD) concerns growth. Human growth is controlled at least in part through the hypothalamus and pituitary at the base of the brain. Studies in mice hint that stimulants may increase levels of the neurotransmitter dopamine in the hypothalamus as well as in the striatum (a three-part brain structure that includes part of the brain’s reward circuitry) and that the excess dopamine may reach the pituitary by way of the bloodstream and act to retard growth.
Recent work strongly indicates that the drugs can stunt growth in children. In a 2007 analysis of a National Institute of Mental Health study of ADHD treatments involving 579 children, research psychiatrist Nora Volkow, who directs the National Institute on Drug Abuse, and her colleagues compared growth rates of unmedicated seven- to 10-year-olds over three years with those of kids who took stimulants throughout that period. Relative to the unmedicated youths, the drug-treated youths showed a decrease in growth rate, gaining, on average, two fewer centimeters in height and 2.7 kilograms less in weight. Although this growth-stunting effect came to a halt by the third year, the kids on the meds never caught up to their counterparts.
Brain with ADHD develops differently
• Some brain regions of kids with ADHD are delayed in maturing, says study
• Their brains are delayed an average of three years compared to those without ADHD
• Delay is most evident in brain regions that control thinking, attention and planning
Bottom Line: This may explain why some kids seem to grow out of the disorder
A National Institutes of Health study from November 2007 found that in youth with attention deficit hyperactivity disorder, the brain matures in a normal pattern. However, it is delayed three years in some regions, on average, compared with youth without the disorder.
The researchers used a new image analysis technique that allowed them to pinpoint the thinning and thickening of sites in the cortex of the brains of hundreds of children and teens with and without the disorder. The findings bolster the idea that ADHD results from a delay in the maturation of the cortex.
Questions and answers
How does the brain development of kids with ADHD compare with that of other kids?
Dr. Sanjay Gupta, CNN chief medical correspondent: For years, the big question with ADHD has been: Are these kids' brains developing more slowly or are they developing in a completely different way? This NIH study tells us the kids' brains develop more slowly, especially those areas important for control, action and attention. For example, a child who has a healthy brain might achieve maturity in those regions at age 7, while a child with ADHD might not until age 10.
What is important to note about both the "healthy" brain and the "ADHD" brain is that they mature or develop in pretty much the same way, starting from the back to the front. However, the ADHD brain is maturing much more slowly than the brain that does not have ADHD.
What can parents of children with ADHD learn from this?
Gupta: The good news for parents: Your child's brain is developing the same way as the brain of a healthy child, but it may take a few years longer. They will probably outgrow the behaviors that come with ADHD. Will your kids ever catch up with kids who don't have ADHD? They may, but it might be well after adolescence or into adulthood. These brain studies are ongoing.
Does this mean a brain scan might one day help diagnose ADHD?
Gupta: We are not at the point of using this as a diagnostic tool, but this information is very important for understanding ADHD. Knowing that this slower development is an issue, we may one day see treatments that try to accelerate this process. We could also see a dulling of impulsivity such as those inappropriate actions that we see in kids with ADHD.