Friday, September 25, 2009
By Alan Boyle, Science editor, updated 3:00 p.m. PT, Mon., May 2, 2005
Scientists are fond of running the evolutionary clock backward, using DNA analysis and the fossil record to figure out when our ancestors stood erect and split off from the rest of the primate evolutionary tree.
But the clock is running forward as well. So where are humans headed?
Evolutionary biologist Richard Dawkins says it's the question he's most often asked, and "a question that any prudent evolutionist will evade." But the question is being raised even more frequently as researchers study our past and contemplate our future.
Paleontologists say that anatomically modern humans may have at one time shared the Earth with as many as three other closely related types — Neanderthals, Homo erectus and the dwarf hominids whose remains were discovered last year in Indonesia.
Does evolutionary theory allow for circumstances in which "spin-off" human species could develop again?
Some think the rapid rise of genetic modification could be just such a circumstance. Others believe we could blend ourselves with machines in unprecedented ways — turning natural-born humans into an endangered species.
Present-day fact, not science fiction
Such ideas may sound like little more than science-fiction plot lines. But trend-watchers point out that we're already wrestling with real-world aspects of future human development, ranging from stem-cell research to the implantation of biocompatible computer chips. The debates are likely to become increasingly divisive once all the scientific implications sink in.
"These issues touch upon religion, upon politics, upon values," said Gregory Stock, director of the Program on Medicine, Technology and Society at the University of California at Los Angeles. "This is about our vision of the future, essentially, and we'll never completely agree about those things."
The problem is, scientists can't predict with precision how our species will adapt to changes over the next millennium, let alone the next million years. That's why Dawkins believes it's imprudent to make a prediction in the first place.
Others see it differently: In the book "Future Evolution," University of Washington paleontologist Peter Ward argues that we are making ourselves virtually extinction-proof by bending Earth's flora and fauna to our will. And assuming the human species hangs around for at least another 500 million years, Ward and others believe a few scenarios are most likely for the future, based on a reading of past evolutionary episodes and current trends.
Where are humans headed? Here's an imprudent assessment of five possible paths, ranging from homogenized humans to alien-looking hybrids bred for interstellar travel.
Unihumans: Will we all be assimilated?
Biologists say that different populations of a species have to be isolated from each other in order for those populations to diverge into separate species. That's the process that gave rise to 13 different species of "Darwin's Finches" in the Galapagos Islands. But what if the human species is so widespread there's no longer any opening for divergence?
Evolution is still at work. But instead of diverging, our gene pool has been converging for tens of thousands of years — and Stuart Pimm, an expert on biodiversity at Duke University, says that trend may well be accelerating.
"The big thing that people overlook when speculating about human evolution is that the raw matter for evolution is variation," he said. "We are going to lose that variability very quickly, and the reason is not quite a genetic argument, but it's close. At the moment we humans speak something on the order of 6,500 languages. If we look at the number of languages we will likely pass on to our children, that number is 600."
Cultural diversity, as measured by linguistic diversity, is fading as human society becomes more interconnected globally, Pimm argued. "I do think that we are going to become much more homogeneous," he said.
Ken Miller, an evolutionary biologist at Brown University, agreed: "We have become a kind of animal monoculture."
Is that such a bad thing? A global culture of Unihumans could seem heavenly if we figure out how to achieve long-term political and economic stability and curb population growth. That may require the development of a more "domesticated" society — one in which our rough genetic edges are smoothed out.
But like other monocultures, our species could be more susceptible to quick-spreading diseases, as last year's bird flu epidemic illustrated.
"The genetic variability that we have protects us against suffering from massive harm when some bug comes along," Pimm said. "This idea of breeding the super-race, like breeding the super-race of corn or rice or whatever — the long-term consequences of that could be quite scary."
Environmental pressures wouldn't stop
Even a Unihuman culture would have to cope with evolutionary pressures from the environment, the University of Washington's Peter Ward said.
Some environmentalists say toxins that work like estrogens are already having an effect: Such agents, found in pesticides and industrial PCBs, have been linked to earlier puberty for women, increased incidence of breast cancer and lower sperm counts for men.
"One of the great frontiers is going to be trying to keep humans alive in a much more toxic world," he observed from his Seattle office. "The whales of Puget Sound are the most toxic whales on Earth. Puget Sound is just a huge cesspool. Well, imagine if that goes global."
Global epidemics or dramatic environmental changes represent just two of the scenarios that could cause a Unihuman society to crack, putting natural selection — or perhaps not-so-natural selection — back into the evolutionary game. Then what?
Survivalistians: Coping with doomsday
Surviving doomsday is a story as old as Noah’s Ark, and as new as the post-bioapocalypse movie “28 Days Later.”
Catastrophes ranging from super-floods to plagues to nuclear war to asteroid strikes erase civilization as we know it, leaving remnants of humanity who go their own evolutionary ways.
The classic Darwinian version of the story may well be H.G. Wells’ “The Time Machine,” in which humanity splits into two species: the ruthless, underground Morlocks and the effete, surface-dwelling Eloi.
At least for modern-day humans, the forces that lead to species spin-offs have been largely held in abeyance: Populations are increasingly in contact with each other, leading to greater gene-mixing. Humans are no longer threatened by predators their own size, and medicine cancels out inherited infirmities ranging from hemophilia to nearsightedness.
“We are helping genes that would have dropped out of the gene pool,” paleontologist Peter Ward observed.
But in Wells’ tale and other science-fiction stories, a civilization-shattering catastrophe serves to divide humanity into separate populations, vulnerable once again to selection pressures. For example, people who had more genetic resistance to viral disease would be more likely to pass on that advantage to their descendants.
If different populations develop in isolation over many thousands of generations, it’s conceivable that separate species would emerge. For example, that virus-resistant strain of post-humans might eventually thrive in the wake of a global bioterror crisis, while less hardy humans would find themselves quarantined in the world’s safe havens.
Patterns in the spread of the virus that causes AIDS may hint at earlier, less catastrophic episodes of natural selection, Duke University’s Stuart Pimm said:
“There are pockets of people who don’t seem to become HIV-positive, even though they have a lot of exposure to the virus — and that may be because their ancestors survived the plague 500 years ago.”
Evolution, or devolution?
If the catastrophe ever came, could humanity recover? In science fiction, that’s an intriguingly open question. For example, Stephen Baxter’s novel “Evolution” foresees an environmental-military meltdown so severe that, over the course of 30 million years, humans devolve into separate species of eyeless mole-men, neo-apes and elephant-people herded by their super-rodent masters.
Even Ward gives himself a little speculative leeway in his book “Future Evolution,” where a time-traveling human meets his doom 10 million years from now at the hands — or in this case, the talons — of a flock of intelligent killer crows. But Ward finds it hard to believe that even a global catastrophe would keep human populations isolated long enough for our species to split apart.
“Unless we totally forget how to build a boat, we can quickly come back,” Ward said.
Even in the event of a post-human split-off, evolutionary theory dictates that one species would eventually subjugate, assimilate or eliminate its competitors for the top job in the global ecosystem. Just ask the Neanderthals.
“If you have two species competing over the same ecological niche, it ends badly for one of them, historically,” said Joel Garreau, the author of the forthcoming book “Radical Evolution.”
The only reason chimpanzees still exist today is that they “had the brains to stay up in the trees and not come down into the open grasslands,” he noted.
“You have this optimistic view that you’re not going to see speciation (among humans), and I desperately hope that’s right,” Garreau said. “But that’s not the only scenario.”
Numans: Rise of the superhumansWe’ve already seen the future of enhanced humans, and his name is Barry Bonds.
The controversy surrounding the San Francisco Giants slugger, and whether steroids played a role in the bulked-up look that he and other baseball players have taken on, is only a foretaste of what’s coming as scientists find new genetic and pharmacological ways to improve performance.
Developments in the field are coming so quickly that social commentator Joel Garreau argues that they represent a new form of evolution. This radical kind of evolution moves much more quickly than biological evolution, which can take millions of years, or even cultural evolution, which works on a scale of hundreds or thousands of years.
How long before this new wave of evolution spawns a new kind of human? “Try 20 years,” Garreau told MSNBC.com. In his latest book, “Radical Evolution,” Garreau reels off a litany of high-tech enhancements, ranging from steroid Supermen, to camera-equipped flying drones, to pills that keep soldiers going without sleep or food for days.
“If you look at the superheroes of the ’30s and the ’40s, just about all of the technologies they had exist today,” he said.
Three kinds of humans
Such enhancements are appearing first on the athletic field and the battlefield, Garreau said, but eventually they’ll make their way to the collegiate scene, the office scene and even the dating scene.
“You’re talking about three different kinds of humans: the enhanced, the naturals and the rest,” Garreau said. “The enhanced are defined as those who have the money and enthusiasm to make themselves live longer, be smarter, look sexier. That’s what you’re competing against.”
In Garreau’s view of the world, the naturals will be those who eschew enhancements for higher reasons, just as vegetarians forgo meat and fundamentalists forgo what they see as illicit pleasures. Then there are the rest, who go without enhancements only because they can’t afford them. “They loathe and despise the people who do, and they also envy them,” Garreau said.
Scientists acknowledge that some of the medical enhancements on the horizon could engender a “have vs. have not” attitude.
“But I could be a smart ass and ask how that’s different from what we have now,” said Brown University’s Ken Miller.
Medical advances as equalizers
Miller went on to point out that in the past, “advances in medical science have actually been great levelers of social equality.” For example, age-old scourges such as smallpox have been eradicated, and polio nearly so, thanks to public health efforts in poorer as well as richer countries. That trend is likely to continue as scientists learn more about the genetic roots of disease, he said.
“In terms of making genetic modifications to ourselves, it’s much more likely we’ll start to tinker with genes for disease susceptibility. … Maybe there would be a long-term health project to breed HIV-resistant people,” he said.
When it comes to discussing ways to enhance humans, rather than simply make up for disabilities, the traits targeted most often are longevity and memory. Scientists have already found ways to enhance those traits in mice.
Imagine improvements that could keep you in peak working condition past the age of 100. Those are the sorts of enhancements you might want to pass on to your descendants — and that could set the stage for reproductive isolation and an eventual species split-off.
“In that scenario, why would you want your kid to marry somebody who would not pass on the genes that allowed your grandchildren to have longevity, too?” the University of Washington’s Peter Ward asked.
But that would require crossing yet another technological and ethical frontier.
Instant superhumans — or monsters?
To date, genetic medicine has focused on therapies that work on only one person at a time. The effects of those therapies aren’t carried on to future generations. For example, if you take muscle-enhancing drugs, or even undergo gene therapy for bigger muscles, that doesn’t mean your children will have similarly big muscles.
In order to make an enhancement inheritable, you’d have to have new code spliced into your germline stem cells — creating an ethical controversy of transcendent proportions.
Tinkering with the germline could conceivably produce a superhuman species in a single generation — but could also conceivably create a race of monsters. “It is totally unpredictable,” Ward said. “It’s a lot easier to understand evolutionary happenstance.”
Even then, there are genetic traits that are far more difficult to produce than big muscles or even super-longevity — for instance, the very trait that defines us as humans.
“It’s very, very clear that intelligence is a pretty subtle thing, and it’s clear that we don’t have a single gene that turns it on or off,” Miller said.
When it comes to intelligence, some scientists say, the most likely route to our future enhancement — and perhaps our future competition as well — just might come from our own machines.
Cyborgs: Merging with the machines
Will intelligent machines be assimilated, or will humans be eliminated?
Until a few years ago, that question was addressed only in science-fiction plot lines, but today the rapid pace of cybernetic change has led some experts to worry that artificial intelligence may outpace Homo sapiens’ natural smarts.
The pace of change is often stated in terms of Moore’s Law, which says that the number of transistors packed into a square inch should double every 18 months. “Moore’s Law is now on its 30th doubling. We have never seen that sort of exponential increase before in human history,” said Joel Garreau, author of the book “Radical Evolution.”
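The scale of that claim is easy to check with back-of-the-envelope arithmetic. A minimal sketch, assuming the common "doubling every 18 months" formulation quoted above:

```python
# Cumulative growth implied by Moore's Law as stated in the article:
# transistor density doubles every 18 months (1.5 years).
DOUBLING_PERIOD_YEARS = 1.5
DOUBLINGS = 30  # "Moore's Law is now on its 30th doubling"

growth_factor = 2 ** DOUBLINGS
years_elapsed = DOUBLINGS * DOUBLING_PERIOD_YEARS

print(f"{growth_factor:,}")  # 1,073,741,824 (roughly a billionfold)
print(years_elapsed)         # 45.0 years of sustained doubling
```

Thirty doublings compound to about a billionfold increase, which is the kind of exponential run-up Garreau is pointing to.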
In some fields, artificial intelligence has already bested humans — with Deep Blue’s 1997 victory over world chess champion Garry Kasparov providing a vivid example.
Three years later, computer scientist Bill Joy argued in an influential Wired magazine essay that we would soon face challenges from intelligent machines as well as from other technologies ranging from weapons of mass destruction to self-replicating nanoscale “gray goo.”
Joy speculated that a truly intelligent robot may arise by the year 2030. “And once an intelligent robot exists, it is only a small step to a robot species — to an intelligent robot that can make evolved copies of itself,” he wrote.
Assimilating the robots
To others, it seems more likely that we could become part-robot ourselves: We’re already making machines that can be assimilated — including prosthetic limbs, mechanical hearts, cochlear implants and artificial retinas. Why couldn’t brain augmentation be added to the list?
“The usual suggestions are that we’ll design improvements to ourselves,” said Seth Shostak, senior astronomer at the SETI Institute. “We’ll put additional chips in our head, and we won’t get lost, and we’ll be able to do all those math problems that used to befuddle us.”
Shostak, who writes about the possibilities for cybernetic intelligence in his book “Sharing the Universe,” thinks that’s likely to be a transitional step at best.
“My usual response is that, well, you can improve horses by putting four-cylinder engines in them. But eventually you can do without the horse part,” he said. “These hybrids just don’t strike me as having a tremendous advantage. It just means the machines aren’t good enough.”
Back to biology
University of Washington paleontologist Peter Ward also believes human-machine hybrids aren’t a long-term option, but for different reasons.
“When you talk to people in the know, they think cybernetics will become biology,” he said. “So you’re right back to biology, and the easiest way to make changes is by manipulating genomes.”
It’s hard to imagine that robots would ever be given enough free rein to challenge human dominance, but even if they did break free, Shostak has no fear of a “Terminator”-style battle for the planet.
“I’ve got a couple of goldfish, and I don’t wake up in the morning and say, ‘I’m gonna kill these guys.’ … I just leave ’em alone,” Shostak said. “I suspect the machines would very quickly get to a level where we were kind of irrelevant, so I don’t fear them. But it does mean that we’re no longer No. 1 on the planet, and we’ve never had that happen before.”
Astrans: Turning into an alien race
If humans survive long enough, there’s one sure way to grow new branches on our evolutionary family tree: by spreading out to other planets.
Habitable worlds beyond Earth could be a 23rd century analog to the Galapagos Islands, Charles Darwin’s evolutionary laboratory: just barely close enough for travelers to get to, but far enough away that there'd be little gene-mixing with the parent species.
“If we get off to the stars, then yes, we will have speciation,” said University of Washington paleontologist Peter Ward. “But can we ever get off the Earth?”
Currently, the closest star system thought to have a planet is Epsilon Eridani, 10.5 light-years away. Even if spaceships could travel at 1 percent the speed of light — an incredible 6.7 million mph — it would take more than a millennium to get there.
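Both figures follow from simple arithmetic. A quick sketch, using the distance given in the article and the standard value of the speed of light (about 670.6 million mph):

```python
# Travel time to Epsilon Eridani at 1 percent of light speed.
DISTANCE_LY = 10.5          # light-years, per the article
SPEED_FRACTION_OF_C = 0.01  # 1 percent of light speed
C_MPH = 670_616_629         # speed of light in miles per hour

# At a fixed fraction of c, covering one light-year takes
# 1/fraction years, so the trip time in years is:
travel_time_years = DISTANCE_LY / SPEED_FRACTION_OF_C
print(travel_time_years)    # 1050.0, i.e. more than a millennium

# Cross-check the "6.7 million mph" figure:
print(round(C_MPH * SPEED_FRACTION_OF_C / 1e6, 1))  # 6.7
```

At that speed, even the nearest candidate planetary system is a 40-plus-generation voyage.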
Even Mars might be far enough: If humans established a permanent settlement there, the radically different living conditions would change the evolutionary equation. For example, those who are born and raised in one-third of Earth’s gravity could never feel at home on the old “home planet.” It wouldn’t take long for the new Martians to become a breed apart.
As for distant stars, the SETI Institute’s Seth Shostak has already been thinking through the possibilities:
· Build a big ark: a spaceship big enough to carry an entire civilization to the destination star system. The problem is, that environment might be just too unnatural for natural humans. “If you talk to the sociologists, they’ll say that it will not work. … You’ll be lucky if anybody’s still alive after the third generation,” Shostak said.
· Go to warp speed: Somehow we discover a wormhole or find a way to travel at relativistic speeds. “That sounds OK, except for the fact that nobody knows how to do it,” Shostak said.
· Enter the Astrans: Humans are genetically engineered to tolerate ultra long-term hibernation aboard robotic ships. Once the ship reaches its destination, these “Astrans” are awakened to start the work of settling a new world. “That’s one possibility,” Shostak said.
The ultimate approach would be to send the instructions for making humans rather than the humans themselves, Shostak said. “We’re not going to put anything in a rocket, we’re just going to beam ourselves to the stars,” he explained. “The only trouble is, if there’s nobody on the other end to put you back together, there’s no point.”
So are we back to square one? Not necessarily, Shostak said. Setting up the receivers on other stars is no job for a human, “but the machines could make it work.”
In fact, if any other society is significantly further along than ours, such a network might be up and running by now. “The machines really could develop large tracts of galactic real estate, whereas it’s really hard for biology to travel,” Shostak said.
It all seems inconceivable, but if humans really are extinction-proof — if they manage to survive global catastrophes, genetic upheavals and cybernetic challenges — who’s to say what will be inconceivable millions of years from now? Two intelligent species, human and machine, just might work together to spread life through the universe.
“If you were sufficiently motivated,” Shostak said, “you could in fact keep it going forever.”
© 2009 msnbc.com. URL: http://www.msnbc.msn.com/id/7103668/
Friday, September 18, 2009
New York Times
This is a story about a nearly 100-year-old book, bound in red leather, which has spent the last quarter century secreted away in a bank vault in Switzerland. The book is big and heavy and its spine is etched with gold letters that say “Liber Novus,” which is Latin for “New Book.” Its pages are made from thick cream-colored parchment and filled with paintings of otherworldly creatures and handwritten dialogues with gods and devils. If you didn’t know the book’s vintage, you might confuse it for a lost medieval tome.
And yet between the book’s heavy covers, a very modern story unfolds. It goes as follows: Man skids into midlife and loses his soul. Man goes looking for soul. After a lot of instructive hardship and adventure — taking place entirely in his head — he finds it again.
Some people feel that nobody should read the book, and some feel that everybody should read it. The truth is, nobody really knows. Most of what has been said about the book — what it is, what it means — is the product of guesswork, because from the time it was begun in 1914 in a smallish town in Switzerland, it seems that only about two dozen people have managed to read or even have much of a look at it.
Of those who did see it, at least one person, an educated Englishwoman who was allowed to read some of the book in the 1920s, thought it held infinite wisdom — “There are people in my country who would read it from cover to cover without stopping to breathe scarcely,” she wrote — while another, a well-known literary type who glimpsed it shortly after, deemed it both fascinating and worrisome, concluding that it was the work of a psychotic.
So for the better part of the past century, despite the fact that it is thought to be the pivotal work of one of the era’s great thinkers, the book has existed mostly just as a rumor, cosseted behind the skeins of its own legend — revered and puzzled over only from a great distance.
Which is why one rainy November night in 2007, I boarded a flight in Boston and rode the clouds until I woke up in Zurich, pulling up to the airport gate at about the same hour that the main branch of the United Bank of Switzerland, located on the city’s swanky Bahnhofstrasse, across from Tommy Hilfiger and close to Cartier, was opening its doors for the day. A change was under way: the book, which had spent the past 23 years locked inside a safe deposit box in one of the bank’s underground vaults, was just then being wrapped in black cloth and loaded into a discreet-looking padded suitcase on wheels. It was then rolled past the guards, out into the sunlight and clear, cold air, where it was loaded into a waiting car and whisked away.
THIS COULD SOUND, I realize, like the start of a spy novel or a Hollywood bank caper, but it is rather a story about genius and madness, as well as possession and obsession, with one object — this old, unusual book — skating among those things. Also, there are a lot of Jungians involved, a species of thinkers who subscribe to the theories of Carl Jung, the Swiss psychiatrist and author of the big red leather book. And Jungians, almost by definition, tend to get enthused anytime something previously hidden reveals itself, when whatever’s been underground finally makes it to the surface.
Carl Jung founded the field of analytical psychology and, along with Sigmund Freud, was responsible for popularizing the idea that a person’s interior life merited not just attention but dedicated exploration — a notion that has since propelled tens of millions of people into psychotherapy. Freud, who started as Jung’s mentor and later became his rival, generally viewed the unconscious mind as a warehouse for repressed desires, which could then be codified and pathologized and treated. Jung, over time, came to see the psyche as an inherently more spiritual and fluid place, an ocean that could be fished for enlightenment and healing.
Whether or not he would have wanted it this way, Jung — who regarded himself as a scientist — is today remembered more as a countercultural icon, a proponent of spirituality outside religion and the ultimate champion of dreamers and seekers everywhere, which has earned him both posthumous respect and posthumous ridicule. Jung’s ideas laid the foundation for the widely used Myers-Briggs personality test and influenced the creation of Alcoholics Anonymous. His central tenets — the existence of a collective unconscious and the power of archetypes — have seeped into the larger domain of New Age thinking while remaining more at the fringes of mainstream psychology.
A big man with wire-rimmed glasses, a booming laugh and a penchant for the experimental, Jung was interested in the psychological aspects of séances, of astrology, of witchcraft. He could be jocular and also impatient. He was a dynamic speaker, an empathic listener. He had a famously magnetic appeal with women. Working at Zurich’s Burghölzli psychiatric hospital, Jung listened intently to the ravings of schizophrenics, believing they held clues to both personal and universal truths. At home, in his spare time, he pored over Dante, Goethe, Swedenborg and Nietzsche. He began to study mythology and world cultures, applying what he learned to the live feed from the unconscious — claiming that dreams offered a rich and symbolic narrative coming from the depths of the psyche. Somewhere along the way, he started to view the human soul — not just the mind and the body — as requiring specific care and development, an idea that pushed him into a province long occupied by poets and priests but not so much by medical doctors and empirical scientists.
Jung soon found himself in opposition not just to Freud but also to most of his field, the psychiatrists who constituted the dominant culture at the time, speaking the clinical language of symptom and diagnosis behind the deadbolts of asylum wards. Separation was not easy. As his convictions began to crystallize, Jung, who was at that point an outwardly successful and ambitious man with a young family, a thriving private practice and a big, elegant house on the shores of Lake Zurich, felt his own psyche starting to teeter and slide, until finally he was dumped into what would become a life-altering crisis.
What happened next to Carl Jung has become, among Jungians and other scholars, the topic of enduring legend and controversy. It has been characterized variously as a creative illness, a descent into the underworld, a bout with insanity, a narcissistic self-deification, a transcendence, a midlife breakdown and an inner disturbance mirroring the upheaval of World War I. Whatever the case, in 1913, Jung, who was then 38, got lost in the soup of his own psyche. He was haunted by troubling visions and heard inner voices. Grappling with the horror of some of what he saw, he worried in moments that he was, in his own words, “menaced by a psychosis” or “doing a schizophrenia.”
He later would compare this period of his life — this “confrontation with the unconscious,” as he called it — to a mescaline experiment. He described his visions as coming in an “incessant stream.” He likened them to rocks falling on his head, to thunderstorms, to molten lava. “I often had to cling to the table,” he recalled, “so as not to fall apart.”
Had he been a psychiatric patient, Jung might well have been told he had a nervous disorder and encouraged to ignore the circus going on in his head. But as a psychiatrist, and one with a decidedly maverick streak, he tried instead to tear down the wall between his rational self and his psyche. For about six years, Jung worked to prevent his conscious mind from blocking out what his unconscious mind wanted to show him. Between appointments with patients, after dinner with his wife and children, whenever there was a spare hour or two, Jung sat in a book-lined office on the second floor of his home and actually induced hallucinations — what he called “active imaginations.” “In order to grasp the fantasies which were stirring in me ‘underground,’ ” Jung wrote later in his book “Memories, Dreams, Reflections,” “I knew that I had to let myself plummet down into them.” He found himself in a liminal place, as full of creative abundance as it was of potential ruin, believing it to be the same borderlands traveled by both lunatics and great artists.
Jung recorded it all. First taking notes in a series of small, black journals, he then expounded upon and analyzed his fantasies, writing in a regal, prophetic tone in the big red-leather book. The book detailed an unabashedly psychedelic voyage through his own mind, a vaguely Homeric progression of encounters with strange people taking place in a curious, shifting dreamscape. Writing in German, he filled 205 oversize pages with elaborate calligraphy and with richly hued, staggeringly detailed paintings.
What he wrote did not belong to his previous canon of dispassionate, academic essays on psychiatry. Nor was it a straightforward diary. It did not mention his wife, or his children, or his colleagues, nor for that matter did it use any psychiatric language at all. Instead, the book was a kind of phantasmagoric morality play, driven by Jung’s own wish not just to chart a course out of the mangrove swamp of his inner world but also to take some of its riches with him. It was this last part — the idea that a person might move beneficially between the poles of the rational and irrational, the light and the dark, the conscious and the unconscious — that provided the germ for his later work and for what analytical psychology would become.
The book tells the story of Jung trying to face down his own demons as they emerged from the shadows. The results are humiliating, sometimes unsavory. In it, Jung travels the land of the dead, falls in love with a woman he later realizes is his sister, gets squeezed by a giant serpent and, in one terrifying moment, eats the liver of a little child. (“I swallow with desperate efforts — it is impossible — once again and once again — I almost faint — it is done.”) At one point, even the devil criticizes Jung as hateful.
He worked on his red book — and he called it just that, the Red Book — on and off for about 16 years, long after his personal crisis had passed, but he never managed to finish it. He actively fretted over it, wondering whether to have it published and face ridicule from his scientifically oriented peers or to put it in a drawer and forget it. Regarding the significance of what the book contained, however, Jung was unequivocal. “All my works, all my creative activity,” he would recall later, “has come from those initial fantasies and dreams.”
Jung evidently kept the Red Book locked in a cupboard in his house in the Zurich suburb of Küsnacht. When he died in 1961, he left no specific instructions about what to do with it. His son, Franz, an architect and the third of Jung’s five children, took over running the house and chose to leave the book, with its strange musings and elaborate paintings, where it was. Later, in 1984, the family transferred it to the bank, where it has since sat as both an asset and a liability.
Anytime someone did ask to see the Red Book, family members said, without hesitation and sometimes without decorum, no. The book was private, they asserted, an intensely personal work. In 1989, an American analyst named Stephen Martin, who was then the editor of a Jungian journal and now directs a Jungian nonprofit foundation, visited Jung’s son (his other four children were daughters) and inquired about the Red Book. The question was met with a vehemence that surprised him. “Franz Jung, an otherwise genial and gracious man, reacted sharply, nearly with anger,” Martin later wrote in his foundation’s newsletter, saying “in no uncertain terms” that Martin could not “see the Red Book, nor could he ever imagine that it would be published.”
And yet, Carl Jung’s secret Red Book — scanned, translated and footnoted — will be in stores early next month, published by W. W. Norton and billed as the “most influential unpublished work in the history of psychology.” Surely it is a victory for someone, but it is too early yet to say for whom.
STEPHEN MARTIN IS a compact, bearded man of 57. He has a buoyant, irreverent wit and what feels like a fully intact sense of wonder. If you happen to have a conversation with him anytime before, say, 10 a.m., he will ask his first question — “How did you sleep?” — and likely follow it with a second one — “Did you dream?” Because for Martin, as it is for all Jungian analysts, dreaming offers a barometric reading of the psyche. At his house in a leafy suburb of Philadelphia, Martin keeps five thick books filled with notations on and interpretations of all the dreams he had while studying to be an analyst 30 years ago in Zurich, under the tutelage of a Swiss analyst then in her 70s named Liliane Frey-Rohn. These days, Martin stores his dreams on his computer, but his dream life is — as he says everybody’s dream life should be — as involving as ever.
Even as some of his peers in the Jungian world are cautious about regarding Carl Jung as a sage — a history of anti-Semitic remarks and his sometimes patriarchal views of women have caused some to distance themselves — Martin is unapologetically reverential. He keeps Jung’s 20 volumes of collected works on a shelf at home. He rereads “Memories, Dreams, Reflections” at least twice a year. Many years ago, when one of his daughters interviewed him as part of a school project and asked what his religion was, Martin, a nonobservant Jew, answered, “Oh, honey, I’m a Jungian.”
The first time I met him, at the train station in Ardmore, Pa., Martin shook my hand and thoughtfully took my suitcase. “Come,” he said. “I’ll take you to see the holy hankie.” We then walked several blocks to the office where Martin sees clients. The room was cozy and cavelike, with a thick rug and walls painted a deep, handsome shade of blue. There was a Mission-style sofa and two upholstered chairs and an espresso machine in one corner.
Several mounted vintage posters of Zurich hung on the walls, along with framed photographs of Carl Jung, looking wise and white-haired, and Liliane Frey-Rohn, a round-faced woman smiling maternally from behind a pair of severe glasses.
Martin tenderly lifted several first-edition books by Jung from a shelf, opening them so I could see how they had been inscribed to Frey-Rohn, who later bequeathed them to Martin. Finally, we found ourselves standing in front of a square frame hung on the room’s far wall, another gift from his former analyst and the centerpiece of Martin’s Jung arcana. Inside the frame was a delicate linen square, its crispness worn away by age — a folded handkerchief with the letters “CGJ” embroidered neatly in one corner in gray. Martin pointed. “There you have it,” he said with exaggerated pomp, “the holy hankie, the sacred nasal shroud of C. G. Jung.”
In addition to practicing as an analyst, Martin is the director of the Philemon Foundation, which focuses on preparing the unpublished works of Carl Jung for publication, with the Red Book as its central project. He has spent the last several years aggressively, sometimes evangelistically, raising money in the Jungian community to support his foundation. The foundation, in turn, helped pay for the translating of the book and the addition of a scholarly apparatus — a lengthy introduction and vast network of footnotes — written by a London-based historian named Sonu Shamdasani, who serves as the foundation’s general editor and who spent about three years persuading the family to endorse the publication of the book and to allow him access to it.
Given the Philemon Foundation’s aim to excavate and make public C. G. Jung’s old papers — lectures he delivered at Zurich’s Psychological Club or unpublished letters, for example — both Martin and Shamdasani, who started the foundation in 2003, have worked to develop a relationship with the Jung family, the owners and notoriously protective gatekeepers of Jung’s works. Martin echoed what nearly everybody I met subsequently would tell me about working with Jung’s descendants. “It’s sometimes delicate,” he said, adding by way of explanation, “They are very Swiss.”
What he likely meant by this was that the members of the Jung family who work most actively on maintaining Jung’s estate tend to do things carefully and with an emphasis on privacy and decorum and are on occasion taken aback by the relatively brazen and totally informal way that American Jungians — who it is safe to say are the most ardent of all Jungians — inject themselves into the family’s business. There are Americans knocking unannounced on the door of the family home in Küsnacht; Americans scaling the fence at Bollingen, the stone tower Jung built as a summer residence farther south on the shore of Lake Zurich. Americans pepper Ulrich Hoerni, one of Jung’s grandsons who manages Jung’s editorial and archival matters through a family foundation, almost weekly with requests for various permissions. The relationship between the Jungs and the people who are inspired by Jung is, almost by necessity, a complex symbiosis. The Red Book — which on one hand described Jung’s self-analysis and became the genesis for the Jungian method and on the other was just strange enough to possibly embarrass the family — held a certain electrical charge. Martin recognized the descendants’ quandary. “They own it, but they haven’t lived it,” he said, describing Jung’s legacy. “It’s very consternating for them because we all feel like we own it.” Even the old psychiatrist himself seemed to recognize the tension. “Thank God I am Jung,” he is rumored once to have said, “and not a Jungian.”
“This guy, he was a bodhisattva,” Martin said to me that day. “This is the greatest psychic explorer of the 20th century, and this book tells the story of his inner life.” He added, “It gives me goose bumps just thinking about it.” He had at that point yet to lay eyes on the book, but for him that made it all the more tantalizing. His hope was that the Red Book would “reinvigorate” Jungian psychology, or at the very least bring himself personally closer to Jung. “Will I understand it?” he said. “Probably not. Will it disappoint? Probably. Will it inspire? How could it not?” He paused a moment, seeming to think it through. “I want to be transformed by it,” he said finally. “That’s all there is.”
IN ORDER TO UNDERSTAND and decode the Red Book — a process he says required more than five years of concentrated work — Sonu Shamdasani took long, rambling walks on London’s Hampstead Heath. He would translate the book in the morning, then walk miles in the park in the afternoon, his mind trying to follow the rabbit’s path Jung had forged through his own mind.
Shamdasani is 46. He has thick black hair, a punctilious eye for detail and an understated, even somnolent, way of speaking. He is friendly but not particularly given to small talk. If Stephen Martin is — in Jungian terms — a “feeling type,” then Shamdasani, who teaches at the University College London’s Wellcome Trust Center for the History of Medicine and keeps a book by the ancient Greek playwright Aeschylus by his sofa for light reading, is a “thinking type.” He has studied Jungian psychology for more than 15 years and is particularly drawn to the breadth of Jung’s psychology and his knowledge of Eastern thought, as well as the historical richness of his era, a period when visionary writing was more common, when science and art were more entwined and when Europe was slipping into the psychic upheaval of war. He tends to be suspicious of interpretive thinking that’s not anchored by hard fact — and has, in fact, made a habit of attacking anybody he deems guilty of sloppy scholarship — and also maintains a generally unsentimental attitude toward Jung. Both of these qualities make him, at times, awkward company among both Jungians and Jungs.
The relationship between historians and the families of history’s luminaries is, almost by nature, one of mutual disenchantment. One side works to extract; the other to protect. One pushes; one pulls. Stephen Joyce, James Joyce’s literary executor and last living heir, has compared scholars and biographers to “rats and lice.” Vladimir Nabokov’s son Dmitri recently told an interviewer that he considered destroying his father’s last known novel in order to rescue it from the “monstrous nincompoops” who had already picked over his father’s life and works. T. S. Eliot’s widow, Valerie Fletcher, has actively kept his papers out of the hands of biographers, and Anna Freud was, during her lifetime, notoriously selective about who was allowed to read and quote from her father’s archives.
Even against this backdrop, the Jungs, led by Ulrich Hoerni, the chief literary administrator, have distinguished themselves with their custodial vigor. Over the years, they have tried to interfere with the publication of books perceived to be negative or inaccurate (including one by the award-winning biographer Deirdre Bair), engaged in legal standoffs with Jungians and other academics over rights to Jung’s work and maintained a state of high agitation concerning the way C. G. Jung is portrayed. Shamdasani was initially cautious with Jung’s heirs. “They had a retinue of people coming to them and asking to see the crown jewels,” he told me in London this summer. “And the standard reply was, ‘Get lost.’ ”
Shamdasani first approached the family with a proposal to edit and eventually publish the Red Book in 1997, which turned out to be an opportune moment. Franz Jung, a vehement opponent of exposing Jung’s private side, had recently died, and the family was reeling from the publication of two controversial and widely discussed books by an American psychologist named Richard Noll, who proposed that Jung was a philandering, self-appointed prophet of a sun-worshiping Aryan cult and that several of his central ideas were either plagiarized or based upon falsified research.
While the attacks by Noll might have normally propelled the family to more vociferously guard the Red Book, Shamdasani showed up with the right bargaining chips — two partial typed draft manuscripts (without illustrations) of the Red Book he had dug up elsewhere. One was sitting on a bookshelf in a house in southern Switzerland, at the home of the elderly daughter of a woman who once worked as a transcriptionist and translator for Jung. The second he found at Yale University’s Beinecke Library, in an uncataloged box of papers belonging to a well-known German publisher. The fact that there were partial copies of the Red Book signified two things — one, that Jung had distributed it to at least a few friends, presumably soliciting feedback for publication; and two, that the book, so long considered private and inaccessible, was in fact findable. The specter of Richard Noll and anybody else who, they feared, might want to taint Jung by quoting selectively from the book loomed large. With or without the family’s blessing, the Red Book — or at least parts of it — would likely become public at some point soon, “probably,” Shamdasani wrote ominously in a report to the family, “in sensationalistic form.”
For about two years, Shamdasani flew back and forth to Zurich, making his case to Jung’s heirs. He had lunches and coffees and delivered a lecture. Finally, after what were by all accounts tense deliberations inside the family, Shamdasani was given a small salary and a color copy of the original book and was granted permission to proceed in preparing it for publication, though he was bound by a strict confidentiality agreement. When money ran short in 2003, the Philemon Foundation was created to finance Shamdasani’s research.
Having lived more or less alone with the book for almost a decade, Shamdasani — who is a lover of fine wine and the intricacies of jazz — these days has the slightly stunned aspect of someone who has only very recently found his way out of an enormous maze. When I visited him this summer in the book-stuffed duplex overlooking the heath, he was just adding his 1,051st footnote to the Red Book.
The footnotes map both Shamdasani’s journey and Jung’s. They include references to Faust, Keats, Ovid, the Norse gods Odin and Thor, the Egyptian deities Isis and Osiris, the Greek goddess Hecate, ancient Gnostic texts, Greek Hyperboreans, King Herod, the Old Testament, the New Testament, Nietzsche’s Zarathustra, astrology, the artist Giacometti and the alchemical formulation of gold. And that’s just naming a few. The central premise of the book, Shamdasani told me, was that Jung had become disillusioned with scientific rationalism — what he called “the spirit of the times” — and over the course of many quixotic encounters with his own soul and with other inner figures, he comes to know and appreciate “the spirit of the depths,” a field that makes room for magic, coincidence and the mythological metaphors delivered by dreams.
“It is the nuclear reactor for all his works,” Shamdasani said, noting that Jung’s better-known concepts — including his belief that humanity shares a pool of ancient wisdom that he called the collective unconscious and the thought that personalities have both male and female components (animus and anima) — have their roots in the Red Book. Creating the book also led Jung to reformulate how he worked with clients, as evidenced by an entry Shamdasani found in a self-published book written by a former client, in which she recalls Jung’s advice for processing what went on in the deeper and sometimes frightening parts of her mind.
“I should advise you to put it all down as beautifully as you can — in some beautifully bound book,” Jung instructed. “It will seem as if you were making the visions banal — but then you need to do that — then you are freed from the power of them. . . . Then when these things are in some precious book you can go to the book & turn over the pages & for you it will be your church — your cathedral — the silent places of your spirit where you will find renewal. If anyone tells you that it is morbid or neurotic and you listen to them — then you will lose your soul — for in that book is your soul.”
ZURICH IS, IF NOTHING ELSE, one of Europe’s more purposeful cities. Its church bells clang precisely; its trains glide in and out on a flawless schedule. There are crowded fondue restaurants and chocolatiers and rosy-cheeked natives breezily pedaling their bicycles over the stone bridges that span the Limmat River. In summer, white-sailed yachts puff around Lake Zurich; in winter, the Alps glitter on the horizon. And during the lunch hour year-round, squads of young bankers stride the Bahnhofstrasse in their power suits and high-end watches, appearing eternally mindful of the fact that beneath everyone’s feet lie labyrinthine vaults stuffed with a dazzling and disproportionate amount of the world’s wealth.
But there, too, ventilating the city’s material splendor with their devotion to dreams, are the Jungians. Some 100 Jungian analysts practice in and around Zurich, examining their clients’ dreams in sessions held in small offices tucked inside buildings around the city. Another few hundred analysts in training can be found studying at one of the two Jungian institutes in the area. More than once, I have been told that, in addition to being a fantastic tourist destination and a good place to hide money, Zurich is an excellent city for dreaming.
Jungians are accustomed to being in the minority pretty much everywhere they go, but here, inside a city of 370,000, they have found a certain quiet purchase. Zurich, for Jungians, is spiritually loaded. It’s a kind of Jerusalem, the place where C. G. Jung began his career, held seminars, cultivated an inner circle of disciples, developed his theories of the psyche and eventually grew old. Many of the people who enroll in the institutes are Swiss, American, British or German, but some are from places like Japan and South Africa and Brazil. Though there are other Jungian institutes in other cities around the world offering diploma programs, learning the techniques of dream analysis in Zurich is a little bit like learning to hit a baseball in Yankee Stadium. For a believer, the place alone conveys a talismanic grace.
Just as I had, Stephen Martin flew to Zurich the week the Red Book was taken from its bank-vault home and moved to a small photo studio near the opera house to be scanned, page by page, for publication. (A separate English translation along with Shamdasani’s introduction and footnotes will be included at the back of the book.) Martin already made a habit of visiting Zurich a few times a year for “bratwurst and renewal” and to attend to Philemon Foundation business. My first morning there, we walked around the older parts of Zurich, before going to see the book. Zurich made Martin nostalgic. It was here that he met his wife, Charlotte, and here that he developed the almost equally important relationship with his analyst, Frey-Rohn, carrying himself and his dreams to her office two or three times weekly for several years.
Undergoing analysis is a central, learn-by-doing part of Jungian training, which usually takes about five years and also involves taking courses in folklore, mythology, comparative religion and psychopathology, among others. It is, Martin says, very much a “mentor-based discipline.” He is fond of pointing out his own conferred pedigree, because Frey-Rohn was herself analyzed by C. G. Jung. Most analysts seem to know their bloodlines. That morning, Martin and I were passing a cafe when he spotted another American analyst, someone he knew in school and who has since settled in Switzerland. “Oh, there’s Bob,” Martin said merrily, making his way toward the man. “Bob trained with Liliane,” he explained to me, “and that makes us kind of like brothers.”
Jungian analysis revolves largely around writing down your dreams (or drawing them) and bringing them to the analyst — someone who is patently good with both symbols and people — to be scoured for personal and archetypal meaning. Borrowing from Jung’s own experiences, analysts often encourage clients to experiment on their own with active imagination, to summon a waking dreamscape and to interact with whatever, or whoever, surfaces there. Analysis is considered to be a form of psychotherapy, and many analysts are in fact trained also as psychotherapists, but in its purist form, a Jungian analyst eschews clinical talk of diagnoses and recovery in favor of broader (and some might say fuzzier) goals of self-discovery and wholeness — a maturation process Jung himself referred to as “individuation.” Perhaps as a result, Jungian analysis has a distinct appeal to people in midlife. “The purpose of analysis is not treatment,” Martin explained to me. “That’s the purpose of psychotherapy. The purpose of analysis,” he added, a touch grandly, “is to give life back to someone who’s lost it.”
Later that day, we went to the photo studio where the work on the book was already under way. The room was a charmless space with concrete floors and black walls. Its hushed atmosphere and glaring lights added a slightly surgical aspect. There was the editor from Norton in a tweedy sport coat. There was an art director hired by Norton and two technicians from a company called DigitalFusion, who had flown to Zurich from Southern California with what looked to be a half-ton of computer and camera equipment.
Shamdasani arrived ahead of us. And so did Ulrich Hoerni, who, along with his cousin Peter Jung, had become a cautious supporter of Shamdasani, working to build consensus inside the family to allow the book out into the world. Hoerni was the one to fetch the book from the bank and was now standing by, his brow furrowed, appearing somewhat tortured. To talk to Jung’s heirs is to understand that nearly five decades after his death, they continue to reel inside the psychic tornado Jung created during his lifetime, caught between the opposing forces of his admirers and critics and between their own filial loyalties and history’s pressing tendency to judge and rejudge its own protagonists. Hoerni would later tell me that Shamdasani’s discovery of the stray copies of the Red Book surprised him, that even today he’s not entirely clear about whether Carl Jung ever intended for the Red Book to be published. “He left it an open question,” he said. “One might think he would have taken some of his children aside and said, ‘This is what it is and what I want done with it,’ but he didn’t.” It was a burden Hoerni seemed to wear heavily. He had shown up at the photo studio not just with the Red Book in its special padded suitcase but also with a bedroll and a toothbrush, since after the day’s work was wrapped, he would be spending the night curled up near the book — “a necessary insurance measure,” he would explain.
And finally, there sunbathing under the lights, sat Carl Jung’s Red Book, splayed open to Page 37. One side of the open page showed an intricate mosaic painting of a giant holding an ax, surrounded by winged serpents and crocodiles. The other side was filled with a cramped German calligraphy that seemed at once controlled and, given the sheer number of words crowded onto the page, feverish, cathartic. Above the book a 10,200-pixel scanner suspended on a dolly clicked and whirred, capturing the book one-tenth of a millimeter at a time and uploading the images into a computer.
The Red Book had an undeniable beauty. Its colors seemed almost to pulse, its writing almost to crawl. Shamdasani’s relief was palpable, as was Hoerni’s anxiety. Everyone in the room seemed frozen in a kind of awe, especially Stephen Martin, who stood about eight feet away from the book but then finally, after a few minutes, began to inch closer to it. When the art director called for a break, Martin leaned in, tilting his head to read some of the German on the page. Whether he understood it or not, he didn’t say. He only looked up and smiled.
ONE AFTERNOON I took a break from the scanning and visited Andreas Jung, who lives with his wife, Vreni, in C. G. Jung’s old house at 228 Seestrasse in the town of Küsnacht. The house — a 5,000-square-foot, 1908 baroque-style home, designed by the psychiatrist and financed largely with his wife, Emma’s, inheritance — sits on an expanse between the road and the lake. Two rows of trimmed, towering topiary trees create a narrow passage to the entrance. The house faces the white-capped lake, a set of manicured gardens and, in one corner, an anomalous, unruly patch of bamboo.
Andreas is a tall man with a quiet demeanor and a gentlemanly way of dressing. At 64, he resembles a thinner, milder version of his famous grandfather, whom he refers to as “C. G.” Among Jung’s five children (all but one are dead) and 19 grandchildren (all but five are still living), he is one of the youngest and also known as the most accommodating to curious outsiders. It is an uneasy kind of celebrity. He and Vreni make tea and politely serve cookies and dispense little anecdotes about Jung to those courteous enough to make an advance appointment. “People want to talk to me and sometimes even touch me,” Andreas told me, seeming both amused and a little sheepish. “But it is not at all because of me, of course. It is because of my grandfather.” He mentioned that the gardeners who trim the trees are often perplexed when they encounter strangers — usually foreigners — snapping pictures of the house. “In Switzerland, C. G. Jung is not thought to be so important,” he said. “They don’t see the point of it.”
Jung, who was born in the lakeside village of Kesswil, was a lifelong outsider in Zurich, even as in his adult years he seeded the city with his followers and became — along with Paul Klee and Karl Barth — one of the best-known Swiss men of his era. Perhaps his marginalization stemmed in part from the offbeat nature of his ideas. (He was mocked, for example, for publishing a book in the late 1950s that examined the psychological phenomenon of flying saucers.) Maybe it was his well-documented abrasiveness toward people he found uninteresting. Or maybe it was connected to the fact that he broke with the established ranks of his profession. (During the troubled period when he began writing the Red Book, Jung resigned from his position at Burghölzli, never to return.) Most likely, too, it had something to do with the unconventional, unhidden, 40-something-year affair he conducted with a shy but intellectually forbidding woman named Toni Wolff, one of Jung’s former analysands who went on to become an analyst as well as Jung’s close professional collaborator and a frequent, if not fully welcome, fixture at the Jung family dinner table.
“The life of C. G. Jung was not easy,” Andreas said. “For the family, it was not easy at all.” As a young man, Andreas had sometimes gone and found his grandfather’s Red Book in the cupboard and paged through it, just for fun. Knowing its author personally, he said, “It was not strange to me at all.”
For the family, C. G. Jung became more of a puzzle after his death, having left behind a large amount of unpublished work and an audience eager to get its hands on it. “There were big fights,” Andreas told me when I visited him again this summer. Andreas, who was 19 when his grandfather died, recalled family debates over whether or not to allow some of Jung’s private letters to be published. When the extended family gathered for the annual Christmas party in Küsnacht, Jung’s children would disappear into a room and have heated discussions about what to do with what he had left behind while his grandchildren played in another room. “My cousins and brothers and I, we thought they were silly to argue over these things,” Andreas said, with a light laugh. “But later when our parents died, we found ourselves having those same arguments.”
Even Jung’s great-grandchildren felt his presence. “He was omnipresent,” Daniel Baumann, whose grandmother was Jung’s daughter Gret, would tell me when I met him later. He described his own childhood with a mix of bitterness and sympathy directed at the older generations. “It was, ‘Jung said this,’ and ‘Jung did that,’ and ‘Jung thought that.’ When you did something, he was always present somehow. He just continued to live on. He was with us. He is still with us,” Baumann said. Baumann is an architect and also the president of the board of the C. G. Jung Institute in Küsnacht. He deals with Jungians all the time, and for them, he said, it was the same. Jung was both there and not there. “It’s sort of like a hologram,” he said. “Everyone projects something in the space, and Jung begins to be a real person again.”
ONE NIGHT DURING the week of the scanning in Zurich, I had a big dream. A big dream, the Jungians tell me, is a departure from all your regular dreams, which in my case meant this dream was not about falling off a cliff or missing an exam. This dream was about an elephant — a dead elephant with its head cut off. The head was on a grill at a suburban-style barbecue, and I was holding the spatula. Everybody milled around with cocktails; the head sizzled over the flames. I was angry at my daughter’s kindergarten teacher because she was supposed to be grilling the elephant head at the barbecue, but she hadn’t bothered to show up. And so the job fell to me. Then I woke up.
At the hotel breakfast buffet, I bumped into Stephen Martin and a Californian analyst named Nancy Furlotti, who is the vice president on the board of the Philemon Foundation and was at that moment having tea and muesli.
“How are you?” Martin said.
“Did you dream?” Furlotti asked.
“What do elephants mean to you?” Martin asked after I relayed my dream.
“I like elephants,” I said. “I admire elephants.”
“There’s Ganesha,” Furlotti said, more to Martin than to me. “Ganesha is an Indian god of wisdom.”
“Elephants are maternal,” Martin offered, “very caring.”
They spent a few minutes puzzling over the archetypal role of the kindergarten teacher. “How do you feel about her?” “Would you say she is more like a mother figure or more like a witch?”
Giving a dream to a Jungian analyst is a little bit like feeding a complex quadratic equation to someone who really enjoys math. It takes time. The process itself is to be savored. The solution is not always immediately evident. In the following months, I told my dream to several more analysts, and each one circled around similar symbolic concepts about femininity and wisdom. One day I was in the office of Murray Stein, an American analyst who lives in Switzerland and serves as the president of the International School of Analytical Psychology, talking about the Red Book. Stein was telling me about how some Jungian analysts he knew were worried about the publication — worried specifically that it was a private document and would be apprehended as the work of a crazy person, which then reminded me of my crazy dream. I related it to him, saying that the very thought of eating an elephant’s head struck me as grotesque and embarrassing and possibly a sign there was something deeply wrong with my psyche. Stein assured me that eating is a symbol for integration. “Don’t worry,” he said soothingly. “It’s horrifying on a naturalistic level, but symbolically it is good.”
It turned out that nearly everybody around the Red Book was dreaming that week. Nancy Furlotti dreamed that we were all sitting at a table drinking amber liquid from glass globes and talking about death. (Was the scanning of the book a death? Wasn’t death followed by rebirth?) Sonu Shamdasani dreamed that he came upon Hoerni sleeping in the garden of a museum. Stephen Martin was sure that he had felt some invisible hand patting him on the back while he slept. And Hugh Milstein, one of the digital techs scanning the book, passed a tormented night watching a ghostly, white-faced child flash on a computer screen. (Furlotti and Martin debated: could that be Mercurius? The god of travelers at a crossroads?)
Early one morning we were standing around the photo studio discussing our various dreams when Ulrich Hoerni trudged through the door, having deputized his nephew Felix to spend the previous night next to the Red Book. Felix had done his job; the Red Book lay sleeping with its cover closed on the table. But Hoerni, appearing weary, seemed to be taking an extra hard look at the book. The Jungians greeted him. “How are you? Did you dream last night?”
“Yes,” Hoerni said quietly, not moving his gaze from the table. “I dreamed the book was on fire.”
ABOUT HALFWAY THROUGH the Red Book — after he has traversed a desert, scrambled up mountains, carried God on his back, committed murder, visited hell; and after he has had long and inconclusive talks with his guru, Philemon, a man with the horns of a bull and a long beard who flaps around on kingfisher wings — Jung is feeling understandably tired and insane. This is when his soul, a female figure who surfaces periodically throughout the book, shows up again. She tells him not to fear madness but to accept it, even to tap into it as a source of creativity. “If you want to find paths, you should also not spurn madness, since it makes up such a great part of your nature.”
The Red Book is not an easy journey — it wasn’t for Jung, it wasn’t for his family or for Shamdasani, and it won’t be for readers. The book is bombastic, baroque and, like so much else about Carl Jung, a willful oddity, synced with an antediluvian and mystical reality. The text is dense, often poetic, always strange. The art is arresting and also strange. Even today, its publication feels risky, like an exposure. But then again, it is possible Jung intended it as such. In 1959, after having left the book more or less untouched for 30 or so years, he penned a brief epilogue, acknowledging the central dilemma in considering the book’s fate. “To the superficial observer,” he wrote, “it will appear like madness.” Yet the very fact he wrote an epilogue seems to indicate that he trusted his words would someday find the right audience.
Shamdasani figures that the Red Book’s contents will ignite both Jung’s fans and his critics. Already there are Jungians planning conferences and lectures devoted to the Red Book, something that Shamdasani finds amusing. Recalling that it took him years to feel as if he understood anything about the book, he’s curious to know what people will be saying about it just months after it is published. As far as he is concerned, once the book sees daylight, it will become a major and unignorable piece of Jung’s history, the gateway into Carl Jung’s most inner of inner experiences. “Once it’s published, there will be a ‘before’ and ‘after’ in Jungian scholarship,” he told me, adding, “it will wipe out all the biographies, just for starters.” What about the rest of us, the people who aren’t Jungians, I wondered. Was there something in the Red Book for us? “Absolutely, there is a human story here,” Shamdasani said. “The basic message he’s sending is ‘Value your inner life.’ ”
After it was scanned, the book went back to its bank-vault home, but it will move again — this time to New York, accompanied by a number of Jung’s descendants. For the next few months it will be on display at the Rubin Museum of Art. Ulrich Hoerni told me this summer that he assumed the book would generate “criticism and gossip,” but by bringing it out they were potentially rescuing future generations of Jungs from some of the struggles of the past. If another generation inherited the Red Book, he said, “the question would again have to be asked, ‘What do we do with it?’ ”
Stephen Martin too will be on hand for the book’s arrival in New York. He is already sensing that it will shed positive light on Jung — this thanks to a dream he had recently about an “inexpressibly sublime” dawn breaking over the Swiss Alps — even as others are not so certain.
In the Red Book, after Jung’s soul urges him to embrace the madness, Jung is still doubtful. Then suddenly, as happens in dreams, his soul turns into “a fat, little professor,” who expresses a kind of paternal concern for Jung.
Jung says: “I too believe that I’ve completely lost myself. Am I really crazy? It’s all terribly confusing.”

The professor responds: “Have patience, everything will work out. Anyway, sleep well.”
Saturday, September 5, 2009
San Bernardino authorities arrested a woman Friday who they say boarded 22 mentally ill, elderly and other people in prison-like conditions, housing some in converted chicken coops behind razor-wire fences.

Pensri Sophar Dalton, 61, was arrested on 16 counts of suspected elder abuse, according to City Atty. James F. Penman. Some of the people appeared to have mental health issues, he said.
“The stench was pretty horrific,” Penman said. “These were very squalid conditions.”
The people were being held in dilapidated buildings, some without running water or toilets. The facility is not licensed with the state or the city, Penman said. Two milk box crates containing prescription medicines were found at the site. At least two of the residents were in wheelchairs.

Police arrested Dalton after going to her home in the 2800 block of North Golden Avenue to arrest a man with outstanding warrants for drunk driving, Penman said.
Penman said that while some residents were held in converted chicken coops, others lived in shared bedrooms, the doors of which had padlocks on them. Residents ate on two picnic tables beneath a metal roof outside on a dirt floor.
Wednesday, August 26, 2009
May 29, 2009
A recent post on "real psychology"--as opposed to all the fake or unreal psychology out there--got me thinking. The day we all decide on what real psychology is, is the day psychology dies. Real psychology equals dead-ended, myopic, oversimplification--of subject matter and of methodology. Real psychology is a means-centered approach. That is to say: only psychologists making use of prescribed, narrowly-defined "scientistic" methods are allowed into the fold. All others are touchy-feely, hopelessly subjective trespassers. Such a stance is 1) naïve, 2) unhistorical, and 3) regressive.
The study of the mind goes way back, of course, but let's just look at the 20th century. We had Wundt and his "experimental introspection," research into things like reaction time. We had the wonderfully overreaching brilliance of William James, who was into the same things as Wundt--attention, memory, sensation--but also psychic phenomena, religious experience, and philosophy and art. We had Freud and psychoanalysis. We had Jung and his association experiments. Then there were the biologically reductionistic doings of psychiatry that led, by the 1950s, to seizure therapies and lobotomy. Skinner's radical behaviorism had its day, followed by the cognitive revolution and, in time, by neuroscience. Lots always going on, in other words, from lots of different angles. Methodologically speaking, there were case studies, experimentation, introspection, animal behavior, surveys, projective techniques, dream analysis, phenomenology, lesion studies--the list goes on and on. Methodological pluralism was/is the norm. But still today, let's face it, psychology is more or less in the Stone Age. No doubt much has been accomplished. Powerful mid-level theories do exist that are promisingly predictive. But as for the great big questions, those enduring mysteries, we've taken only very small steps. We still don't know why we dream. We still don't know what causes schizophrenia. We still can't make solid sense of the function of consciousness. So let's not start proclaiming what real psychology is. Better to keep that question helpfully unanswered.
Psychology's disorder now is multiple personality, and in a way that's fine. What we've got is something like 60 sub-disciplines leaving in their wake a farrago of sub sub-disciplines. Each sub-discipline is pretty insular, there is little harmony overall (far more cacophony), and what's especially funny is this: every sub-discipline tends to believe--according to an in-group, out-group dynamic--that it is THE ONLY ONE DOING REAL PSYCHOLOGY. In fact, each is focusing on its own little hiccup of mind, its own pet variables, while mainly neglecting the questions other sub-disciplines find so essential. So each sub-discipline inflates the importance of its methods/questions while devaluing the methods/questions of other sub-disciplines. That attitude was on hair-raising display in the post cited at the top of this one.
Take my situation. I have a PhD in Personality from UC Davis. Presently, with some important exceptions, Social Psychologists sometimes devalue Personality Science while Personality Psychologists sometimes devalue Social Psychology. I like to think of this as the narcissism of minor differences, but that's another subject altogether.
I also do qualitative case study research that in my case goes by the name of Psychobiography. According to some, that's not real psychology because it is not experimental. Well, someone should have clued in Piaget, Erikson, Maslow, Freud, Jung, James, Skinner (who also used single-subject design), RD Laing, Henry Murray, Silvan Tomkins, etc etc etc, ALL OF WHOM DID CASE STUDY AND ALL OF WHOM ARE REGARDED AS SEMINAL FIGURES IN THE FIELD. I don't know, it's a strangely territorial neurotic mind-set that 1) believes itself in possession of true knowledge and 2) feels a need to tell lowly others that what they are up to is BS.
I say this: we psychologists know a lot less than we think we do, and at this very early stage of the game in the study of mind, all promising approaches and questions are welcome. The more the merrier. Does anything go? No. But is there one real psychology? Double no.
Published on Psychology Today (http://www.psychologytoday.com)
Friday, August 14, 2009
We're all familiar with the stereotype of the tortured artist. Salvador Dali's various disorders and Sylvia Plath's depression spring to mind. Now new research hints at why creativity and mental illness so often go together: a genetic mutation linked to psychosis and schizophrenia also influences creativity.
The finding could help to explain why mutations that increase a person's risk of developing mental illnesses such as schizophrenia and bipolar disorder have been preserved, even preferred, during human evolution, says Szabolcs Kéri, a researcher at Semmelweis University in Budapest, Hungary, who carried out the study.
Kéri examined a gene involved in brain development called neuregulin 1, which previous studies have linked to a slightly increased risk of schizophrenia. Moreover, a single DNA letter mutation that affects how much of the neuregulin 1 protein is made in the brain has been linked to psychosis, poor memory and sensitivity to criticism.
About 50 per cent of healthy Europeans have one copy of this mutation, while 15 per cent possess two copies.
To determine how these variations affect creativity, Kéri genotyped 200 adults who responded to adverts seeking creative and accomplished volunteers. He also gave the volunteers two tests of creative thinking, and devised an objective score of their creative achievements, such as filing a patent or writing a book.
People with two copies of the neuregulin 1 mutation – about 12 per cent of the study participants – tended to score notably higher on these measures of creativity, compared with other volunteers with one or no copy of the mutation. Those with one copy were also judged to be more creative, on average, than volunteers without the mutation. All told, the mutation explained between 3 and 8 per cent of the differences in creativity, Kéri says.
Exactly how neuregulin 1 affects creativity isn't clear. Volunteers with two copies of the mutation were no more likely than others to possess so-called schizotypal traits, such as paranoia, odd speech patterns and inappropriate emotions. This would suggest that the mutation's connection to mental illness does not entirely explain its link to creativity, Kéri says.
Dampening the brain
Rather, Kéri speculates that the mutation dampens a brain region that reins in mood and behaviour, called the prefrontal cortex. This change could unleash creative potential in some people and psychotic delusions in others.
Intelligence could be one factor that determines whether the neuregulin 1 mutation boosts creativity or contributes to psychosis. Kéri's volunteers tended to be smarter than average. In contrast, another study of families with a history of schizophrenia found that the same mutation was associated with lower intelligence and psychotic symptoms.
"My clinical experience is that high-IQ people with psychosis have more intellectual capacity to deal with psychotic experiences," Kéri says. "It's not enough to experience those feelings, you have to communicate them."
Jeremy Hall, a geneticist at the University of Edinburgh in the UK who uncovered the link between the neuregulin 1 mutation and psychosis, agrees that the gene's effects are probably influenced by cognitive factors such as intelligence.
This doesn't mean that psychosis and creativity are the same, though. "There's always been this slightly romantic idea that madness and genius are the flipside to the same coin. How much is that true? Madness is often madness and doesn't have as much genetic association with intelligence," Hall says.
Bernard Crespi, a behavioural geneticist at Simon Fraser University in Burnaby, British Columbia, Canada, is holding his applause for now. "This is a very interesting study with remarkably strong results, though it must be replicated in an independent population before the results can be accepted with confidence," he says.
Sunday, August 9, 2009
By Mason Inman
It had been four years since 13-year-old Mohamed Abdul escaped civil war in Somalia, but he still had nightmares and flashbacks. When he was nine years old, a crowd fleeing a street shooting trampled him, putting him in the hospital for two weeks. A month later he saw the aftermath of an apparent massacre: about 20 corpses floating in the ocean. Soon after, militiamen shot him in the leg, knocked him unconscious, then raped his best friend, a girl named Halimo.
Recovering in the hospital, Abdul (not his real name) was overwhelmed by fear—and guilt, for not having helped Halimo. He felt unprovoked fury: he mistook people he knew well for the rapist and threatened to kill them. A few months later Abdul fled his homeland and landed in the Nakivale refugee settlement in Uganda. “I felt as if there were two personalities living inside me,” he said at the time. “One was smart and kind and normal; the other one was crazy and violent.”
Abdul had post-traumatic stress disorder (PTSD), an ailment characterized by fear, hyperarousal and vivid replays of the traumatic event. Fortunately, this refugee camp had an extraordinary resource. Psychologist Frank Neuner of Bielefeld University in Germany was offering “narrative exposure therapy” to its 14,400 Africans, mostly Rwandans. The approach coaxes trauma survivors to assimilate their troubling memories into their life stories and thereby regain some emotional balance.
After four 60- to 90-minute therapy sessions, Abdul’s flashbacks and nightmares disappeared; he was still easily startled but no longer felt out of control. His doctors deemed him “cured.”
Researchers and aid workers have historically overlooked mental health in developing countries, focusing instead on issues such as malnutrition, disease and high infant mortality, but that is changing. “What’s changed in the past 10 years is the realization that mental health is not separate from general health,” explains child psychiatrist Atif Rahman of the University of Liverpool in England.
Recent psychotherapy trials have achieved remarkable success in improving the lives of war survivors such as Abdul, poor mothers with postpartum depression and others victimized by the stresses of extreme poverty. The keys to a workable program for the impoverished include training ordinary citizens to be counselors and, in some cases, disguising the remedy as something other than a fix for emotional troubles.
Although many people think of mental illness as a plague of fast-paced modern life, some psychiatric ailments are actually more prevalent in the developing world, according to the World Health Organization. Of the several dozen wars and armed conflicts around the globe, nearly all are in developing countries, and this violence is leading to PTSD, which hinders recovery after the conflicts subside. Across South Asia, new mothers suffer from depression more frequently than they do in richer countries, according to a 2003 report by Rahman and his colleagues.
People in underprivileged nations also experience more severe economic stresses. “This pileup of adversities is associated with low mental health,” says sociologist Ronald Kessler of Harvard Medical School. For individuals living on the edge of survival, the economic ramifications of a mental illness can be especially devastating. When someone has a major mental illness, “you’ve lost their labor and their input,” notes mental health researcher Paul Bolton of Johns Hopkins University.
To make up for the deficit of mental health care professionals in the developing world, Neuner and his team recruited refugees from the camp. Anybody who could read, write and be empathetic was a candidate. Because nearly one third of the Rwandan refugees and half of the Somalis suffered from PTSD, many of the would-be counselors needed to be treated first.
For a PTSD sufferer, distressing experiences are divorced from time or place and out of sync with the person’s life story. “Once these memories are activated, usually the interpretation of the brain of what’s happening is that there’s a danger right now, because the brain is not really aware that it’s just a memory,” Neuner points out. “We want to nail down this vivid emotional representation. We want to bring it where it belongs and connect it with your life history.”
Accordingly, refugee therapists spent six weeks learning to help patients shape their lives into a coherent story, incorporating major traumas into the narrative. The strategy worked. Seventy percent of those who received the therapy no longer displayed significant PTSD symptoms at a nine-month follow-up assessment compared to a 37 percent recovery rate among a group of untreated refugees.
In Rawalpindi, a largely rural district of Pakistan, nearly 30 percent of new mothers become depressed—about twice the rate in the developed world. In addition to its toll on mothers, postpartum depression can harm babies’ emotional and, in South Asia, physical development. Most of these women consider their symptoms the fate of poor folk or believe that they are caused by tawiz, or black magic. Many are anxious about talking about their problems and being labeled as ill. What is more, Rawalpindi has only three psychiatrists for its more than 3.5 million residents.
To get around such stigmas and barriers, Rahman and his colleagues recruited government employees known as lady health workers to integrate mental health therapy into their home visits to mothers. Ordinarily, these workers visit homes 16 times a year to give advice on infant nutrition and child rearing.
A two-day course enabled these health workers to add mental health to their curriculum. Rahman’s approach is based on cognitive-behavior therapy, in which a counselor tries to correct distorted and negative ways of thinking either by discussing them openly or by suggesting more adaptive behaviors. If a mother said she could not afford to feed her baby healthful food, for example, the lady health worker would question that assumption and suggest incremental improvements to the baby’s diet. A year after giving birth, mothers given this psychologically sensitive advice showed half the rate of major depression of those who received traditional health visits. The strategy worked by empowering the women to solve problems, Rahman believes.
More efforts to bring psychiatry to the poor are under way, such as a trial in Pakistan in which community health workers help to ensure that schizophrenics take their medications. But the biggest hurdle is scaling up these treatments to meet the great need.
Note: This article was originally printed with the title, "Psychotherapy for the Poor".
By David Dobbs
April 6, 2009
Depression’s Wiring Diagram
When Helen Mayberg started curing depression by stimulating a previously unknown neural junction box in a brain area called Brodmann’s area 25—discovered through 20 years of dogged research—people asked her where she was going to look next. Her reaction was, “What do you mean, Where am I going to look next? I’m going to look more closely here!”
Her closer look is now paying off. In a series of papers last year, Mayberg and several of her colleagues used diffusion tensor imaging (DTI) to reveal the neural circuitry of depression at new levels of precision. This MRI technique illuminates the connective tracts in the brain. For depression, the resulting map may allow a better understanding of what drives the disorder—and much better targeting and patient selection for treatments such as deep-brain stimulation (DBS) that seek to tweak this circuitry.
In the early 2000s Mayberg and Wayne C. Drevets, then at Washington University Medical School, separately established that area 25, which appeared to connect several brain regions involved in mood, thought and emotion, is hyperactive in depressed patients. The area’s significance was confirmed when Mayberg and her colleagues at the University of Toronto—neurosurgeon Andres Lozano and psychiatrist Sidney Kennedy—used DBS devices to bring relief to 12 out of 20 intractably depressed patients [see “Turning Off Depression,” by David Dobbs; Scientific American Mind, August/September 2006]. “That confirmed my hypothesis that area 25 is an important crossroads,” Mayberg says. “But exactly what circuits were we affecting?”
The recent papers take her much closer to answering this question. Working with fellow imaging experts Heidi Johansen-Berg and Tim Behrens of the University of Oxford and others, Mayberg used DTI to produce detailed images of area 25’s “tractography,” the layout of the white matter tracts that connect disparate brain regions. They identified five connective tracts that run through this pea-size region, carrying neural traffic among five vital areas: the amygdala, a deep-brain area that moderates fear and other emotions; the orbitofrontal and medial frontal cortices, two poorly understood areas that appear to be significant in expectation, reward processing, error assessment, learning and decision making; the hippocampus, vital to memory; and the hypothalamus, which helps to regulate stress and arousal.
The refined imaging of these tracts does more than just confirm Mayberg’s previous work identifying area 25 as a junction box. It also gives her a map that provides diagnostic and targeting information for DBS treatments of the area. As she expected, the locations of those tracts vary among individuals. “And this variation,” Mayberg says, “along with variations in the nature of different patients’ depression, probably explains why some patients respond better than others. Because the location varies, we’re not hitting all five tracts the same way in every patient.”
In a new study of 20 more patients she began at Emory University, Mayberg plans to analyze the tractography and electrode placement to see which of the tracts seems to be most essential to the treatment’s success. That investigation may reveal yet more about the nature of depression—and it might help Mayberg identify which patients will benefit from surgery so she can spare those it will not help.
Meanwhile a kind of DBS gold rush has developed as other scientists slide neuromodulators into different brain areas to try to treat depression, obsessive-compulsive disorder, eating disorders, Tourette’s syndrome, headaches and chronic pain [see “Sparking Recovery with Brain ‘Pacemakers,’ ” by Morton L. Kringelbach and Tipu Z. Aziz; Scientific American Mind, December 2008/January 2009].
Although DBS treatment for depression might receive FDA approval in as little as four or five years, Mayberg does not think it will become common. She is following closely the work of researchers who are seeking ways to modulate tightly defined brain areas such as area 25 with tools less intrusive than electrodes. Stanford University bioengineer Karl Deisseroth, for instance, is having luck stimulating targeted brain areas in mice with proteins called opsins (relatives of the light-sensing proteins in retinal cells) that can be placed noninvasively and then stimulated with light via a very thin fiber-optic cable rather than electricity from a bulky electrode. He and others hope to develop these or similar tools to create less invasive “switches” that modulate brain areas more cleanly than electrodes do. “There may come a time,” Mayberg says, “when we can work these circuits some other way.”
Note: This article was originally printed with the title, "Insights into the Brain's Circuitry".
"A Wiring Diagram for Depression," Scientific American Mind, April/May 2009, by David Dobbs.
Friday, August 7, 2009
A few years ago a single mother who had recently moved to town came to my office asking me to prescribe the stimulant drug Adderall for her sixth-grade son. The boy had been taking the medication for several years, and his mother had liked its effects: it made homework time easier and improved her son’s grades.
At the time of this visit, the boy was off the medication, and I conducted a series of cognitive and behavioral tests on him. He performed wonderfully. I also noticed that off the medication he was friendly and playful.
On a previous casual encounter, when the boy had been on Adderall, he had seemed reserved and quiet. His mother acknowledged this was a side effect of the Adderall. I told her that I did not think her son had attention-deficit hyperactivity disorder (ADHD) and that he did not need medication. That was the last time I saw her.
Attention-deficit hyperactivity disorder afflicts about 5 percent of U.S. children—twice as many boys as girls—age six to 17, according to a recent survey conducted by the Centers for Disease Control and Prevention. As its name implies, people with the condition have trouble focusing and often are hyperactive or impulsive. An estimated 9 percent of boys and 4 percent of girls in the U.S. are taking stimulant medications as part of their therapy for ADHD, the CDC reported in 2005. The majority of patients take methylphenidate (Ritalin, Concerta), whereas most of the rest are prescribed an amphetamine such as Adderall.
Although it sounds counterintuitive to give stimulants to a person who is hyperactive, these drugs are thought to boost activity in the parts of the brain responsible for attention and self-control. Indeed, the pills can improve attention, concentration and productivity and also suppress impulsive behavior, producing significant improvements in some people’s lives. Severe inattention and impulsivity put individuals at risk for substance abuse, unemployment, crime and car accidents. Thus, appropriate medication might keep a person out of prison, away from addictive drugs or in a job.
Over the past 15 years, however, doctors have been pinning the ADHD label on—and prescribing stimulants for—a rapidly rising number of patients, including those with moderate to mild inattention, some of whom, like the sixth grader I saw, have a normal ability to focus. This trend may be fueled in part by a relaxation of official diagnostic criteria for the disorder, combined with a lower tolerance in society for mild behavioral or cognitive problems.
In addition, patients are no longer just taking the medicines for a few years during grade school but are encouraged to stay on them into adulthood. In 2008 two stimulant formulations—Vyvanse (lisdexamfetamine, an amphetamine prodrug) and Concerta (an extended-release methylphenidate)—received U.S. Food and Drug Administration indications for treating adults, and pharmaceutical firms are pushing awareness of the adult forms of the disorder. What is more, many people who have no cognitive deficits are opting to take these drugs to boost their academic performance. A number of my patients—doctors, lawyers and other professionals—have asked me for stimulants in hopes of boosting their productivity. As a result of these developments, prescriptions for methylphenidate and amphetamine rose by almost 12 percent a year between 2000 and 2005, according to a 2007 study.
With the expanded and extended use of stimulants comes mounting concern that the drugs might take a toll on the brain over the long run. Indeed, a smattering of recent studies, most of them involving animals, hint that stimulants could alter the structure and function of the brain in ways that may depress mood, boost anxiety and, contrary to their short-term effects, lead to cognitive deficits. Human studies already indicate the medications can adversely affect areas of the brain that govern growth in children, and some researchers worry that additional harms have yet to be unearthed.
Medicine for the Mind

To appreciate why stimulants could have negative effects over time, it helps to first understand what they do in the brain. One hallmark of ADHD is an underactive frontal cortex, a brain region that lies just behind the forehead and controls such “executive” functions as decision making, predicting future events, and suppressing emotions and urges. This area may, in some cases, be smaller than average in ADHD patients, compromising their executive abilities. Frontal cortex function depends greatly on a signaling chemical, or neurotransmitter, called dopamine, which is released in this structure by neurons that originate in deeper brain structures. Less dopamine in the prefrontal cortex is linked, for example, with cognitive difficulty in old age. Another set of dopamine-releasing neurons extends to the nucleus accumbens, a critical mediator of motivation, pleasure and reward whose function may also be impaired in ADHD.
Stimulants enhance communication in these dopamine-controlled brain circuits by binding to so-called dopamine transporters—the proteins on nerve endings that suck up excess dopamine—thereby deactivating them. As a result, dopamine accumulates outside the neurons, and the additional neurotransmitter is thought to improve the operation of neuronal circuits critical for motivation and impulse control.
Not only can methylphenidate and amphetamine ameliorate a mental deficit, they also can enhance cognitive performance. In studies dating back to the 1970s, researchers have shown that normal children who do not have ADHD also become more attentive—and often calmer—after taking stimulants. In fact, the drugs can lead to higher test scores in students of average and above-average intellectual ability.
Since the 1950s, when doctors first started prescribing stimulants to treat behavior problems, millions of people have taken them without obvious incident. A number of studies have even cleared them of suspected adverse effects. For example, researchers have failed to find differences between stimulant-treated children and those not on meds in overall brain growth. In January 2009 child psychiatrist Philip Shaw of the National Institute of Mental Health and his colleagues used MRI scans to measure the change in the thickness of the cerebral cortex (the outer covering of the brain) of 43 youths between the ages of 12 and 16 who had ADHD.
The researchers found no evidence that stimulants slowed cortical growth. In fact, only the unmedicated adolescents showed more thinning of the cerebrum than was typical for their age, hinting that the drugs might facilitate normal cortical development in kids with ADHD.
Altering Mood

Despite such positive reports, traces of a sinister side to stimulants have also surfaced. In February 2007 the FDA issued warnings about side effects such as growth stunting and psychosis, among other mental disorders. Indeed, the vast majority of adults with ADHD experience at least one additional psychiatric illness—often an anxiety disorder or drug addiction—in their lifetime. Having ADHD is itself a risk factor for other mental health problems, but the possibility also exists that stimulant treatment during childhood might contribute to these high rates of accompanying diagnoses.
After all, stimulants activate the brain’s reward pathways, which are part of the neural circuitry that controls mood under normal conditions. And at least three studies using animals hint that exposure to methylphenidate during childhood may alter mood in the long run, perhaps raising the risk of depression and anxiety in adulthood.
In an experiment published in 2003 psychiatrist Eric Nestler of the University of Texas Southwestern Medical Center and his colleagues injected juvenile rats twice a day with a low dose of methylphenidate similar to that prescribed for children with ADHD. When the rats became adults, the scientists observed the rodents’ responses to various emotional stimuli. The rodents that had received methylphenidate were significantly less responsive to natural rewards such as sugar, sex and novel, engaging environments than were untreated rats, suggesting that the drug-exposed animals find such stimuli less pleasurable. In addition, the stimulants apparently made the rats more sensitive to stressful situations such as being forced to swim inside a large tube. Similarly, in the same year psychiatrist William Carlezon of Harvard Medical School and his colleagues reported that methylphenidate-treated preadolescent rats displayed a muted response to a cocaine reward as adults as well as unusual apathy in a forced-swim test, a sign of depression.
In 2008 psychopharmacologist Leandro F. Vendruscolo and his co-workers at Federal University of Santa Catarina in Brazil echoed these results using spontaneously hypertensive rats, which—like children with ADHD—sometimes show attention deficits, hyperactivity and motor impulsiveness. The researchers injected these young rats with methylphenidate for 16 days at doses approximating those used to treat ADHD in young people. Four weeks later, when the rats were young adults, those that had been exposed to methylphenidate were unusually anxious: they avoided traversing the central area of an open, novel space more than did rats not exposed to the drug. Adverse effects of this stimulant, the authors speculate, could contribute to the high rates of anxiety disorders among ADHD patients.
Copying Cocaine?
The long-term use of any drug that affects the brain’s reward circuitry also raises the specter of addiction. Methylphenidate has a chemical structure similar to that of cocaine and acts on the brain in a very similar way. Both cocaine and methamphetamine (also called “speed” or “meth”)—another highly addictive stimulant—block dopamine transporters just as ADHD drugs do [see “New Weapons against Cocaine Addiction,” by Peter Sergo; Scientific American Mind, April/May 2008]. In the case of the illicit drugs, the dopamine surge is so sudden that in addition to making a person unusually energetic and alert, it produces a “high.”
Recent experiments in animals have sounded the alarm that methylphenidate may alter the brain in ways similar to that of more powerfully addictive stimulants such as cocaine.
In February 2009 neuroscientists Yong Kim and Paul Greengard, along with their colleagues at the Rockefeller University, reported cocainelike structural and chemical alterations in the brains of mice given methylphenidate. The researchers injected the mice with either methylphenidate or cocaine daily for two weeks. Both treatments increased the density of tiny extensions called spines at the ends of neurons bearing dopamine receptors in the rodent nucleus accumbens. Compared with cocaine, methylphenidate had a somewhat more localized influence; it also had a greater effect on longer spines and a weaker effect on shorter ones. Otherwise, the drugs’ effects were strikingly similar.
Furthermore, the scientists found that methylphenidate boosted the amount of a protein called ΔFosB, which turns genes on and off, even more than cocaine did. That result could be a chemical warning of future problems: excess ΔFosB heightens an animal’s sensitivity to the rewarding effects of cocaine and makes the animal more likely to ingest the drug. Many former cocaine addicts struggle with depression, anxiety and cognitive problems. Researchers have found that cocaine has remodeled the brains of such ex-users. Similar problems—principally, perhaps, difficulty experiencing joy and excitement in life—could occur after many years of Ritalin or Adderall use.
Amphetamine and methylphenidate can also be addictive if abused by, say, crushing or snorting the pills. In a classic study published in 1995 research psychiatrist Nora Volkow, then at Stony Brook University, and her colleagues showed that injections of methylphenidate produced a cocainelike high in volunteers. More than seven million people in the U.S. have abused methylphenidate, and as many as 750,000 teenagers and young adults show signs of addiction, according to a 2006 report.
Typical oral doses of ADHD meds rarely produce such euphoria and are not usually addictive. Furthermore, the evidence to date, including two 2008 studies from the National Institute on Drug Abuse, indicates that children treated with stimulants early in life are not more likely than other children to become addicted to drugs as adults. In fact, for severe cases of ADHD, the risk may run in the opposite direction. (A low addiction risk also jibes with Carlezon’s earlier findings, which indicated that methylphenidate use in early life mutes adult rats’ response to cocaine.)
Corrupting Cognition
Amphetamines such as Adderall could alter the mind in other ways. A team led by psychologist Stacy A. Castner of the Yale University School of Medicine has documented long-lasting behavioral oddities, such as hallucinations, and cognitive impairment in rhesus monkeys that received escalating injected doses of amphetamine over either six or 12 weeks.
Compared with monkeys given inactive saline, the drug-treated monkeys displayed deficits in working memory—the short-term buffer that allows us to hold several items in mind—which persisted for at least three years after exposure to the drug. The researchers connected these cognitive problems to a significantly lower level of dopamine activity in the frontal cortex of the drug-treated monkeys as compared with that of the monkeys not given amphetamine.
Underlying such cognitive and behavioral effects may be subtle structural changes too small to show up on brain scans. In a 1997 study psychologists Terry E. Robinson and Bryan Kolb of the University of Michigan at Ann Arbor found that high injected doses of amphetamine in rats cause the major output neurons of the nucleus accumbens to sprout longer branches, or dendrites, as well as additional spines on those dendrites. A decade later Castner’s team linked lower doses of amphetamine to subtle atrophy of neurons in the prefrontal cortex of monkeys.
A report published in 2005 by neurologist George A. Ricaurte and his team at the Johns Hopkins University School of Medicine is even more damning to ADHD meds because the researchers used realistic doses and drug delivery by mouth instead of by injection. Ricaurte’s group trained baboons and squirrel monkeys to self-administer an oral formulation of amphetamine similar to Adderall: the animals drank an amphetamine-laced orange cocktail twice a day for four weeks, mimicking the dosing schedule in humans. Two to four weeks later the researchers detected evidence of amphetamine-induced brain damage, encountering lower levels of dopamine and fewer dopamine transporters on nerve endings in the striatum—a trio of brain regions that includes the nucleus accumbens—in amphetamine-treated primates than in untreated animals. The authors believe these observations reflect a drug-related loss of dopamine-releasing nerve fibers that reach the striatum from the brain stem.
One possible consequence of a loss of dopamine and its associated molecules is Parkinson’s disease, a movement disorder that can also lead to cognitive deficits. A study in humans published in 2006 hints at a link between Parkinson’s and prolonged exposure to amphetamine in any form (not just that prescribed for ADHD). Before Parkinson’s symptoms such as tremors and muscle rigidity appear, however, dopamine’s function in the brain must decline by 80 to 90 percent, or by about twice as much as what Ricaurte and his colleagues saw in baboons that were drinking a more moderate dose of the drug. And some studies have found no connection between stimulant use and Parkinson’s.
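The comparison above can be made explicit with a little arithmetic. The sketch below uses only the figures quoted in the text (the 80 to 90 percent symptom threshold and the "about twice" ratio); the implied 40 to 45 percent figure for the baboons is a back-of-envelope inference, not a number reported by the study itself.

```python
# Dopamine function must fall by roughly 80-90% before Parkinson's
# symptoms such as tremor and rigidity emerge.
parkinsons_threshold = (0.80, 0.90)

# The text says that threshold is about twice the decline Ricaurte's
# team measured, implying the baboons' observed drop was roughly
# 40-45% -- substantial, but well short of the symptomatic range.
observed_decline = tuple(round(x / 2, 2) for x in parkinsons_threshold)

print(observed_decline)  # (0.4, 0.45)
```

In other words, even a pronounced drug-related dopamine loss in these animals would leave a wide margin before reaching the level at which Parkinson's symptoms appear, which is consistent with the mixed epidemiological findings the article cites.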
Stimulants do seem to stunt growth in children. Otherwise, however, studies in humans have largely failed to demonstrate any clear indications of harm from taking ADHD medications as prescribed. Whether the drugs alter the human brain in the same way they alter that of certain animals is unknown, because so far little clinical data exist on their long-term neurological effects. Even when the dosing is similar or the animals have something resembling ADHD, different species’ brains may have varying sensitivities to stimulant medications.
Nevertheless, in light of the emerging evidence, many doctors and researchers are recommending a more cautious approach to the medical use of stimulants. Some are urging the adoption of strict diagnostic criteria for ADHD and a policy restricting prescriptions for individuals who fit those criteria. Others are advocating behavior modification—which can be as effective as stimulants over the long run—as a first-line approach to combating the disorder. Certain types of mental exercises may also ease ADHD symptoms [see “Train Your Brain,” by Ulrich Kraft; Scientific American Mind, February/March 2006]. For patients who require stimulants, some neurologists and psychiatrists have also suggested using the lowest dose needed or monitoring the blood levels of these drugs as a way of keeping concentrations below those shown to be problematic in other mammals. Without these or similar measures, large numbers of people who regularly take stimulants may ultimately struggle with a new set of problems spawned by the treatments themselves.
Growing Problems
So far the best-documented problem associated with the stimulants used to treat attention-deficit hyperactivity disorder (ADHD) concerns growth. Human growth is controlled at least in part through the hypothalamus and pituitary at the base of the brain. Studies in mice hint that stimulants may increase levels of the neurotransmitter dopamine in the hypothalamus as well as in the striatum (a three-part brain structure that includes part of the brain’s reward circuitry) and that the excess dopamine may reach the pituitary by way of the bloodstream and act to retard growth.
Recent work strongly indicates that the drugs can stunt growth in children. In a 2007 analysis of a National Institute of Mental Health study of ADHD treatments involving 579 children, research psychiatrist Nora Volkow, who directs the National Institute on Drug Abuse, and her colleagues compared growth rates of unmedicated seven- to 10-year-olds over three years with those of kids who took stimulants throughout that period. Relative to the unmedicated youths, the drug-treated youths showed a decrease in growth rate, gaining, on average, two fewer centimeters in height and 2.7 kilograms less in weight. Although this growth-stunting effect came to a halt by the third year, the kids on the meds never caught up to their counterparts.