How far can science advance brain-machine interface technology? Will we one day pipe the latest blog entry or NASCAR highlights directly into the human brain as if the organ were an outsize flash drive?
By Gary Stix
The cyberpunk science fiction that emerged in the 1980s routinely paraded “neural implants” for hooking a computing device directly to the brain: “I had hundreds of megabytes stashed in my head,” proclaimed the protagonist of “Johnny Mnemonic,” a William Gibson story that later became a wholly forgettable movie starring Keanu Reeves.
The genius of the then emergent genre (back in the days when a megabyte could still wow) was its juxtaposition of low-life retro culture with technology that seemed only barely beyond the capabilities of the deftest biomedical engineer. Although the implants could not have been replicated at the Massachusetts Institute of Technology or the California Institute of Technology, the best cyberpunk authors gave the impression that these inventions might yet materialize one day, perhaps even in the reader’s own lifetime.
In the past 10 years, however, more realistic approximations of technologies originally evoked in the cyberpunk literature have made their appearance. A person with electrodes implanted inside his brain has used neural signals alone to control a prosthetic arm, a prelude to allowing a human to bypass limbs immobilized by amyotrophic lateral sclerosis or stroke. Researchers are also investigating how to send electrical messages in the other direction, providing feedback that lets a primate sense what a robotic arm is touching.
But how far can we go in fashioning replacement parts for the brain and the rest of the nervous system? Besides controlling a computer cursor or robot arm, will the technology somehow actually enable the brain’s roughly 100 billion neurons to function as a clandestine repository for pilfered industrial espionage data or another plot element borrowed from Gibson?
Will Human Become Machine?
Today’s Hollywood scriptwriters and futurists, less skilled heirs of the original cyberpunk tradition, have embraced these neurotechnologies. The Singularity Is Near, scheduled for release next year, is a film based on the ideas of computer scientist Ray Kurzweil, who has posited that humans will eventually achieve a form of immortality by transferring a digital blueprint of their brain into a computer or robot.
Yet the dream of eternity as a Max Headroom–like avatar trapped inside a television set (or as a copy-and-paste job into the latest humanoid bot) remains only slightly less distant than when René Descartes ruminated on mind-body dualism in the 17th century. The wholesale transfer of self—a machine-based facsimile of the perception of the ruddy hues of a sunrise, the constantly shifting internal emotional palette and the rest of the mix that combines to evoke the uniquely subjective sense of the world that constitutes the essence of conscious life—is still nothing more than a prop for fiction writers.
Hoopla over thought-controlled prostheses, moreover, obscures the lack of knowledge of the underlying mechanisms of neural functioning needed to feed information into the brain to re-create a real-life cyberpunk experience. “We know very little about brain circuits for higher cognition,” says Richard A. Andersen, a neuroscientist at Caltech.
What, then, might realistically be achieved by interactions between brains and machines? Do the advances from the first EEG experiment to brain-controlled arms and cursors suggest an inevitable, deterministic progression, if not toward a Kurzweilian singularity, then perhaps toward the possibility of inputting at least some high-level cognitive information into the brain? Could we perhaps download War and Peace or, with a nod to The Matrix, a manual of how to fly a helicopter? How about inscribing the sentence “See Spot run” into the memory of someone who is unconscious of the transfer? How about just the word “see”?
These questions are not entirely academic, although some wags might muse that it would be easier just to buy a pair of reading glasses and do things the old-fashioned way. Even if a pipeline to the cortex remains forever a figment of science fiction, an understanding of how photons, sound waves, scent molecules and pressure on the skin get translated into lasting memories will be more than mere cyberpunk entertainment. A neural prosthesis built from knowledge of these underlying processes could help stroke victims or Alzheimer’s patients form new memories.
Primitive means of jacking in already reside inside the skulls of thousands of people. Deaf or profoundly hearing-impaired individuals carry cochlear implants that stimulate the auditory nerve with sounds picked up by a microphone—a device that neuroscientist Michael S. Gazzaniga of the University of California, Santa Barbara, has characterized as the first successful neuroprosthesis in humans. Arrays of electrodes that serve as artificial retinas are in the laboratory. If they work, they might be tweaked to give humans night vision.
The more ambitious goal of linking Amazon.com directly to the hippocampus, a neural structure involved with forming memories, requires technology that has yet to be invented. The bill of particulars would include ways of establishing reliable connections between neurons and the extracranial world—and a means to translate a digital version of War and Peace into the language that neurons use to communicate with one another. An inkling of how this might be done can be sought by examining leading work on brain-machine interfaces.
Your Brain on Text
Jacking text into the brain requires consideration of whether to insert electrodes directly into tissue, an impediment that might make neural implants impractical for anyone but the disabled. As has been known for nearly a century, the brain’s electrical activity can be detected without cracking bone. What looks like a swimming cap studded with electrodes can transmit signals from a paralyzed patient, enabling him or her to type letters on a screen or even surf the Web. Niels Birbaumer of the University of Tübingen in Germany, a leading developer of the technology, asserts that trial-and-error stimulation of the cortex using a magnetic signal from outside the skull, along with the electrode cap to record which neurons are activated, might be able to locate the words “see” or “run.” Once mapped, these areas could be fired up again to evoke those memories—at least in theory.
Some neurotechnologists think that if particular words reside in specific spots in the brain (which is debatable), finding those spots would probably require greater precision than is afforded by a wired swim cap. One of the ongoing experiments with invasive implants could possibly lead to the needed fine-level targeting. Philip R. Kennedy of Neural Signals and his colleagues designed a device that records the output of neurons. The hookup lets a stroke victim send a signal, through thought alone, to a computer that interprets it as, say, a vowel, which can then be vocalized by a speech synthesizer, a step toward forming whole words. This type of brain-machine interface might also eventually be used for activating individual neurons.
Still more precise hookups might be furnished by nanoscale fibers, measuring 100 nanometers or less in diameter, which could easily tap into single neurons because of their dimensions and their electrical and mechanical properties. Jun Li of Kansas State University and his colleagues have crafted a brushlike structure in which nanofiber bristles serve as electrodes for stimulating or receiving neural signals. Li foresees it as a way to stimulate neurons to allay Parkinson’s disease or depression, to control a prosthetic arm or even to flex astronauts’ muscles during long spaceflights to prevent the inevitable muscle wasting that occurs in zero gravity.
Learning the Language
Fulfilling the fantasy of inputting a calculus text—or even plugging in Traveler’s French before going on vacation—would require far deeper insight into the brain signals that encode language and other neural representations.
Unraveling the neural code is one of the most imposing challenges in neuroscience—and, to misappropriate Freud, would likely pave a royal road to an understanding of consciousness. Theorists have advanced many differing ideas to explain how the billions of neurons and trillions of synapses that connect them can ping meaningful messages to one another. The oldest is that the code corresponds to the rate of firing of the voltage spikes generated by a neuron.
Whereas the rate code may suffice for some stimuli, it might not be enough for booting a Marcel Proust or a Richard Feynman, supplying a mental screen capture of a madeleine cake or the conceptual abstraction of a textbook of differential equations. More recent work has focused on the precise timing of the intervals between each spike (temporal codes) and the constantly changing patterns of how neurons fire together (population codes).
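The difference among these three coding schemes can be illustrated with toy spike trains. This is a minimal sketch on invented data, meant only to show what each scheme treats as the message; real neural decoding is vastly harder:

```python
import numpy as np

# Toy spike trains: each row is one neuron, each column a 1-millisecond
# time bin (1 = spike fired). The data are invented for illustration.
spikes = np.array([
    [0, 1, 0, 0, 1, 0, 1, 0, 0, 1],
    [1, 0, 0, 1, 0, 0, 0, 1, 0, 0],
    [0, 0, 1, 0, 0, 1, 0, 0, 1, 0],
])

# Rate code: only the number of spikes per unit time carries information.
duration_s = spikes.shape[1] / 1000.0          # 10 bins = 10 ms
rates_hz = spikes.sum(axis=1) / duration_s     # firing rate per neuron

# Temporal code: the precise intervals between spikes carry information.
def interspike_intervals(train):
    times = np.flatnonzero(train)   # bin indices where spikes occurred
    return np.diff(times)           # gaps between successive spikes

isis = [interspike_intervals(row) for row in spikes]

# Population code: the pattern of which neurons fire together in each
# time bin is the message, not any single neuron's activity.
population_pattern = spikes.T       # one row per time bin, one column per neuron

print(rates_hz)   # two neurons with identical rates...
print(isis[1], isis[2])  # ...can still differ in their spike timing
```

Note how neurons 2 and 3 have the same firing rate yet different interspike intervals: a rate code cannot tell them apart, while a temporal code can, which is one reason researchers suspect rates alone may not suffice for rich content.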
Some help toward downloading to the brain might come from a decadelong endeavor to build an artificial hippocampus to help people with memory deficits, which may have the corollary benefit of helping researchers gain insights into the coding process. A collaboration between the University of Southern California and Wake Forest University has worked to fashion a replacement body part for this memory-forming brain structure. The hippocampus, seated deep within the brain’s temporal lobe, sustains damage in stroke or Alzheimer’s. An electronic bypass of a damaged hippocampus could restore the ability to create new memories. The project, funded by the National Science Foundation and the Defense Advanced Research Projects Agency, might eventually go further, enhancing normal memory or helping to deduce the particular codes needed for high-level cognition.
The two groups—led by Theodore W. Berger at U.S.C. and Samuel Deadwyler at Wake Forest—are preparing a technical paper showing that an artificial hippocampus took over from the biological organ the task of consolidating a rat’s memory of pressing a lever to receive a drop of water. Normally the hippocampus emits signals that are relayed to cortical areas responsible for storing the long-term memory of an experience. For the experiment, a chemical temporarily incapacitated the hippocampus. When the rat pressed the correct bar, electrical input from sensory and other areas of the cortex was channeled through a microchip, which, the scientists say, dispatched the same signals the hippocampus would have sent. A demonstration that an artificial device mimicked hippocampal output would mark a step toward deducing the underlying code that could be used to create a memory in the motor cortex—and perhaps one day to unravel ciphers for even higher-level behaviors.
If the codes for the sentence “See Spot run”—or perhaps an entire technical manual—could be ascertained, it might, in theory, be possible to input them directly to an electrode array in the hippocampus (or cortical areas), evoking the scene in The Matrix in which instructions for flying a helicopter are downloaded by cell phone. Artificial hippocampus research postulates a scenario only slightly more prosaic. “The kinds of examples [the U.S. Department of Defense] likes to typically use are coded information for flying an F-15,” says Berger.
The seeming simplicity of the model of neural input envisaged by artificial hippocampus-related studies may raise more questions than it answers. Would such an implant overwrite existing memories? Would the code for the sentence “See Spot run” be the same for me as it is for you or, for that matter, a native Kurdish speaker? Would the hippocampal codes merge cleanly with other circuitry that provides the appropriate context, a semantic framework, for the sentence? Would “See Spot run” be misinterpreted as a laundry mishap instead of a trotting dog?
Some neuroscientists think the language of the brain may not be deciphered until understanding moves beyond the reading of mere voltage spikes. “Just getting a lot of signals and trying to understand what these signals mean and correlating them with particular behavior is not going to solve it,” notes Henry Markram, director of neuroscience and technology at the Swiss Federal Institute of Technology in Lausanne. A given input into a neuron or groups of neurons can produce a particular output—conversion of sensory inputs to long-term memory by the hippocampus, for instance—through many different pathways. “As long as there are lots of different ways to do it, you’re not even close,” he says.
The Blue Brain Project, which Markram heads, is an attempt that began in 2005 to use supercomputer-based simulations to reverse-engineer the brain at the molecular and cellular levels—modeling first the simpler rat organ and then the human version to unravel the underlying function of neural processes. The latter task awaits a computer that boasts a more than 1,000-fold improvement over the processing power of current supercomputers. The actual code, when it does emerge, may be structured very differently from what appears in today’s textbooks. “I think there will be a conceptual breakthrough that will have significant implications for how we think of reality,” Markram says. “It will be quite a profound thing. That’s probably why it’s such an intractable problem.”
The challenge involved in figuring out how to move information into the brain suggests a practical foreseeable limit for how far neurotechnology might be advanced. The task of forming the multitude of connections that make a memory is vastly different from magnetizing a set of bits on a hard disk. “Complex information like the contents of a book would require the interactions of a very large number of brain cells over a very large area of the nervous system,” observes neuroscientist John P. Donoghue of Brown University. “Therefore, you couldn’t address all of them, getting them to store in their connections the correct kind of information. So I would say based on current knowledge, it’s not possible.”
Writing to the brain may remain a dream lost in cyberspace. But the seeming impossibility does not make Donoghue less sanguine about ultimate expectations for feeding information the other way and developing brain-controlled prostheses for the severely disabled. He has been a leader in studies to implant an array of multiple electrodes into the brain that can furnish a direct line from the cortex to a prosthetic arm or even a wheelchair.
Donoghue predicts that in the next five years brain-machine interfaces will let a paralyzed person pick up a cup and take a drink of water. In some distant future, these systems might be refined so that a person with an upper spinal cord injury could accomplish the unthinkable, perhaps even playing a game of basketball with prosthetics that would make a reality of The Six Million Dollar Man, the 1970s television series. Even without an information pipeline into the brain, disabled patients and basic researchers might still reap the benefits of lesser substitutes. Gert Pfurtscheller of the Graz University of Technology in Austria and his colleagues reported last year on a patient with a spinal cord injury who was able, merely by thinking, to traverse a virtual environment, moving from one end of a simulated street to the other. Duke University’s Miguel A. L. Nicolelis, another pioneer in brain-machine interfaces, has begun to explore how monkeys connected to brain-controlled prosthetic devices develop a kinesthetic awareness, a sense of movement and touch, that is completely separate from sensory inputs into their biological bodies. “There’s some physiological evidence that during the experiment they feel more connected to the robots than to their own bodies,” he says.
The most important consequences of these investigations may be something other than neural implants and robotic arms. An understanding of central nervous system development acquired by the Blue Brain Project or another simulation may let educators understand the best ways to teach children and determine at what point a given pedagogical technique should be applied. “You can build an educational development program that is engineered to, in the shortest possible time, allow you to acquire certain capabilities,” Markram says. If he is right, research on neural implants and brain simulations will produce more meaningful practical benefits than dreams of the brain as a flash drive drawn from 20th-century science-fiction literature.
Note: This article was originally published with the title, "Jacking Into the Brain".
WW II vet held in Nazi slave camp breaks silence: 'Let it be known'
* World War II vet held in slave camp witnessed Nazi atrocities first-hand
* Anthony Acevedo, 84, was one of 350 U.S. soldiers held at Buchenwald subcamp
* Only about 165 survived captivity and their subsequent death march, he says
* Survivors signed documents never to speak; Acevedo says now people "must know"
By Wayne Drash, Thelma Gutierrez and Sara Weisfeldt
LOMA LINDA, California (CNN) -- Anthony Acevedo thumbs through the worn, yellowed pages of his diary emblazoned with the words "A Wartime Log" on its cover. It's a catalog of deaths and atrocities he says were carried out on U.S. soldiers held by Nazis at a slave labor camp during World War II -- a largely forgotten legacy of the war.
Acevedo pauses when he comes across a soldier with the last name of Vogel.
"He died in my arms. He wouldn't eat. He didn't want to eat," says Acevedo, now 84 years old. "He said, 'I want to die! I want to die! I want to die!' "
The memories are still fresh, some 60 years later. Acevedo keeps reading his entries, scrawled on the pages with a Schaeffer fountain pen he held dear.
He was one of 350 U.S. soldiers held at Berga am Elster, a satellite camp of the Nazis' notorious Buchenwald concentration camp. The soldiers, working 12-hour days, were used by the German army to dig tunnels and hide equipment in the final weeks of the war. Less than half of the soldiers survived their captivity and a subsequent death march, he says.
Acevedo shows few emotions as he scans the pages of his diary. But when he gets to one of his final entries, the decades of pent-up pain, the horror witnessed by a 20-year-old medic, are too much.
"We were liberated today, April the 23, 1945," he reads.
His body shakes, and he begins sobbing. "Sorry," he says, tears rolling down his face. "I'm sorry."
Acevedo's story is one that was never supposed to be told. "We had to sign an affidavit ... [saying] we never went through what we went through. We weren't supposed to say a word," he says.
The U.S. Army Center of Military History provided CNN a copy of the document signed by soldiers at the camp before they were sent back home. "You must be particularly on your guard with persons representing the press," it says. "You must give no account of your experience in books, newspapers, periodicals, or in broadcasts or in lectures."
The document ends with: "I understand that disclosure to anyone else will make me liable to disciplinary action."
The information was kept secret "to protect escape and evasion techniques and the names of personnel who helped POW escapees," said Frank Shirer, the chief historian at the U.S. Army Center of Military History.
Acevedo sees it differently. For a soldier who survived one of the worst atrocities of mankind, the military's reaction is still painful to accept. "My stomach turned to acid, and the government didn't care. They didn't give a hullabaloo."
It took more than 50 years, he says, before he received 100 percent disability benefits from the U.S. Department of Veterans Affairs.
Despite everything Acevedo endured during the war, little had prepared him for his own father's attitude toward his capture. "My dad told me I was a coward," he says.
"I turned around and got my duffel bag, my luggage, and said, 'This is it, Father. I'm not coming back.' So I took the train the following day, and I didn't see my parents for years, because I didn't want to see them. I felt belittled."
For decades, Acevedo followed the rules and kept his mouth shut. His four children didn't know the extent of his war experience. He says he felt stymied because of the document he signed. "You never gave it a thought because of that paper."
Now, he says it's too important to be forgotten. In recent years, he's attended local high schools to tell his story to today's generation.
"Let it be known," he says. "People have to know what happened."
Born July 31, 1924, in San Bernardino, California, Anthony C. Acevedo is what is known in today's parlance as a "citizen child" -- one who was born in the United States to parents from Mexico.
A Mexican-American, he was schooled in Pasadena, California, but couldn't attend the same classes as his white peers. "We couldn't mix with white people," he says. Both of his parents were deported to Mexico in 1937, and he went with them.
Acevedo returned to the States when he was 17, he says, because he wanted to enlist in the U.S. Army. He received medical training in Illinois before being sent to the European theater.
A corporal, he served as a medic for the 275th Infantry Regiment of the 70th Infantry Division. Acevedo was captured at the Battle of the Bulge after days of brutal firefights with Nazis who surrounded them. He recalls seeing another medic, Murry Pruzan, being gunned down.
"When I saw him stretched out there in the snow, frozen," Acevedo says, shaking his head. "God, that's the only time I cried when I saw him. He was stretched out, just massacred by a machine gun with his Red Cross band."
He pauses. "You see all of them dying out there in the fields. You have to build a thick wall."
Acevedo was initially taken to a prison camp known as Stalag IX-B in Bad Orb, Germany, where thousands of American, French, Italian and Russian soldiers were held as prisoners of war. Acevedo's diary entry reads simply: "Was captured the 6th of January 1945."
For the next several months, he would be known by the Germans only as Prisoner Number 27016. One day while in Stalag IX-B, he says, a German commander gathered American soldiers and asked all Jews "to take one step forward." Few willingly did so.
Jewish soldiers wearing Star of David necklaces began yanking them off, he says. About 90 Jewish soldiers and another 260 U.S. soldiers deemed "undesirables" -- those who "looked like Jews" -- were selected. Acevedo, who is not Jewish, was among them.
They were told they were being sent to "a beautiful camp" with a theater and live shows.
"It turned out to be the opposite," he says. "They put us on a train, and we traveled six days and six nights. It was a boxcar that would fit heads of cattle. They had us 80 to a boxcar. You couldn't squat. And there was little tiny windows that you could barely see through."
It was February 8, 1945, when they arrived. The new camp was known as Berga am Elster, a subcamp of Buchenwald, the Nazi concentration camp where tens of thousands of Jews and other political prisoners were killed under Adolf Hitler's regime.
Acevedo says he was one of six medics among the 350 U.S. soldiers at Berga. Political prisoners from other countries were held at Berga separate from the Americans. "We didn't mingle with them at all," he says, adding that the U.S. soldiers worked in the same tunnels as the other political prisoners.
"We were all just thin as a rail."
The U.S. prisoners, Acevedo says, were given 100 grams of bread per week made of redwood sawdust, ground glass and barley. Soup was made from cats and rats, he says. Eating dandelion leaves was considered a "gourmet meal."
If soldiers tried to escape, they would be shot and killed. If they were captured alive, they would be executed with gunshots to their foreheads, Acevedo says. Wooden bullets, he says, were used to shatter the inside of their brains. Medics were always asked to fill the execution holes with wax, he says.
"Prisoners were being murdered and tortured by the Nazis. Many of our men died, and I tried keeping track of who they were and how they died."
The soldiers were forced to sleep naked, two to a bunk, with no blankets. As the days and weeks progressed, his diary catalogs it all. The names, prisoner numbers and causes of death are listed by the dozens in his diary. He felt it was his duty as a medic to keep track of everyone.
"I'm glad I did it," he says.
As a medic, he says, he heard of other more horrific atrocities committed by the Nazis at camps around them. "We heard about experiments that they were doing -- peeling the skins of people, humans, political prisoners, making lampshades."
He and the other soldiers were once taken to what Acevedo believes was the main camp of Buchenwald, about 30 miles (48 kilometers) from Berga. They noticed large pipes coming from one building.
"We thought we were going to be gassed when we were told to take our clothes off," he says. "We were scared. We were stripped."
"Rumors were around that this was where the political prisoners would be suffocated with gas." It turned out to be a shower, the only time during their captivity they were allowed to bathe.
The main Buchenwald camp was officially liberated on April 11, 1945. But the camp and its subcamps were emptied of tens of thousands of prisoners as American troops neared. The U.S. troops held at the Berga compound were no exception.
"Very definite that we are moving away from here and on foot. This isn't very good for our sick men. No drinking water and no latrines," Acevedo wrote in his diary on April 4, 1945.
He says they began a death march of 217 miles (349 kilometers) that would last three weeks. More than 300 U.S. soldiers were alive at the start of the march, he says; about 165 were left by the end, when they were finally liberated.
Lines of political prisoners in front of them during the march caught the full brunt of angry Nazi soldiers.
"We saw massacres of people being slaughtered off the highway. Women, children," he says. "You could see people of all ages, hanging on barbed wire."
One of his diary entries exemplifies an extraordinary patriotism among soldiers, even as they were being marched to their deaths. "Bad news for us. President Roosevelt's death. We all felt bad about it. We held a prayer service for the repose of his soul," Acevedo wrote on April 13, 1945.
It adds, "Burdeski died today."
To this day, Acevedo still remembers that soldier. He wanted to perform a tracheotomy using his diary pen to save Burdeski, a 41-year-old father of six children. A German commander struck Acevedo in the jaw with a rifle when he asked.
"I'll never forget," he says.
On a recent day, about a dozen prisoners of war held during World War II and their liberators gathered at the Jerry L. Pettis Memorial Veterans Medical Center in Loma Linda, California. Many applauded Acevedo for his heroics.
"Those of us in combat have our own heroes, and those are the medics. And that's Antonio. Thank you, Antonio," one of the men said.
The men gathered there nodded their heads. Two stood to shake Acevedo's hand.
"The people that are in this room really are an endangered species," another man said. "When they're gone, they're gone. ... That is why they should be honored and put in history for generations to come, because there are not that many of them left."
Donald George sat next to Acevedo. The two were captured about a half-mile apart during the Battle of the Bulge. "It's hard to explain how it is to be sitting with a bunch of people that you know they've been through the same thing you've been through," George said.
"Some of us want to talk about it, and some of us don't. Some of us want to cry about it once in a while, and some of us won't. But it's all there," he said.
"We still like to come and be together a couple times a month," George added, before Acevedo finished his sentence: "To exchange what you are holding back inside."
Acevedo says the world must never forget the atrocities of World War II and that for killing 6 million Jews, Hitler was the worst terrorist of all time. He doesn't want the world to ever slide backward.
His message on this Veterans Day, he says, is never to hold animosity toward anybody.
"You only live once. Let's keep trucking. If we don't do that, who's going to do it for us? We have to be happy. Why hate?" he says. "The world is full of hate, and yet they don't know what they want."
* Anthony Acevedo, 84, was one of 350 U.S. soldiers held at Buchenwald subcamp
* Only about 165 survived captivity and their subsequent death march, he says
* Survivors signed documents never to speak; Acevedo says now people "must know"
By Wayne Drash, Thelma Gutierrez and Sara Weisfeldt
LOMA LINDA, California (CNN) -- Anthony Acevedo thumbs through the worn, yellowed pages of his diary emblazoned with the words "A Wartime Log" on its cover. It's a catalog of deaths and atrocities he says were carried out on U.S. soldiers held by Nazis at a slave labor camp during World War II -- a largely forgotten legacy of the war.
Acevedo pauses when he comes across a soldier with the last name of Vogel.
"He died in my arms. He wouldn't eat. He didn't want to eat," says Acevedo, now 84 years old. "He said, 'I want to die! I want to die! I want to die!' "
The memories are still fresh, some 60 years later. Acevedo keeps reading his entries, scrawled on the pages with a Schaeffer fountain pen he held dear. See inside Acevedo's diary »
He was one of 350 U.S. soldiers held at Berga am Elster, a satellite camp of the Nazis' notorious Buchenwald concentration camp. The soldiers, working 12-hour days, were used by the German army to dig tunnels and hide equipment in the final weeks of the war. Less than half of the soldiers survived their captivity and a subsequent death march, he says.
Acevedo shows few emotions as he scans the pages of his diary. But when he gets to one of his final entries, the decades of pent-up pain, the horror witnessed by a 20-year-old medic, are too much.
"We were liberated today, April the 23, 1945," he reads.
His body shakes, and he begins sobbing. "Sorry," he says, tears rolling down his face. "I'm sorry." VideoWatch Acevedo's emotional account of being freed »
Acevedo's story is one that was never supposed to be told. "We had to sign an affidavit ... [saying] we never went through what we went through. We weren't supposed to say a word," he says.
The U.S. Army Center of Military History provided CNN a copy of the document signed by soldiers at the camp before they were sent back home. "You must be particularly on your guard with persons representing the press," it says. "You must give no account of your experience in books, newspapers, periodicals, or in broadcasts or in lectures."
The document ends with: "I understand that disclosure to anyone else will make me liable to disciplinary action."
The information was kept secret "to protect escape and evasion techniques and the names of personnel who helped POW escapees," said Frank Shirer, the chief historian at the U.S. Army Center for Military History.
Acevedo sees it differently. For a soldier who survived one of the worst atrocities of mankind, the military's reaction is still painful to accept. "My stomach turned to acid, and the government didn't care. They didn't give a hullabaloo."
It took more than 50 years, he says, before he received 100 percent disability benefits from the U.S. Department of Veterans Affairs.
Despite everything Acevedo endured during the war, little had prepared him for his own father's attitude toward his capture. "My dad told me I was a coward," he says.
"I turned around and got my duffel bag, my luggage, and said, 'This is it, Father. I'm not coming back.' So I took the train the following day, and I didn't see my parents for years, because I didn't want to see them. I felt belittled."
For decades, Acevedo followed the rules and kept his mouth shut. His four children didn't know the extent of his war experience. He says he felt stymied because of the document he signed. "You never gave it a thought because of that paper."
Now, he says it's too important to be forgotten. In recent years, he's attended local high schools to tell his story to today's generation.
"Let it be known," he says. "People have to know what happened."
Born July 31, 1924, in San Bernardino, California, Anthony C. Acevedo is what is known in today's parlance as a "citizen child" -- one who was born in the United States to parents from Mexico.
A Mexican-American, he was schooled in Pasadena, California, but couldn't attend the same classes as his white peers. "We couldn't mix with white people," he says. Both of his parents were deported to Mexico in 1937, and he went with them.
Acevedo returned to the States when he was 17, he says, because he wanted to enlist in the U.S. Army. He received medical training in Illinois before being sent to the European theater.
A corporal, he served as a medic for the 275th Infantry Regiment of the 70th Infantry Division. Acevedo was captured at the Battle of the Bulge after days of brutal firefights with Nazis who surrounded them. He recalls seeing another medic, Murry Pruzan, being gunned down.
"When I saw him stretched out there in the snow, frozen," Acevedo says, shaking his head. "God, that's the only time I cried when I saw him. He was stretched out, just massacred by a machine gun with his Red Cross band."
He pauses. "You see all of them dying out there in the fields. You have to build a thick wall."
Acevedo was initially taken to a prison camp known as Stalag IX-B in Bad Orb, Germany, where thousands of American, French, Italian and Russian soldiers were held as prisoners of war. Acevedo's diary entry reads simply: "Was captured the 6th of January 1945."
For the next several months, he would be known by the Germans only as Prisoner Number 27016. One day while in Stalag IX-B, he says, a German commander gathered American soldiers and asked all Jews "to take one step forward." Few willingly did so.
Jewish soldiers wearing Star of David necklaces began yanking them off, he says. About 90 Jewish soldiers and another 260 U.S. soldiers deemed "undesirables" -- those who "looked like Jews" -- were selected. Acevedo, who is not Jewish, was among them.
They were told they were being sent to "a beautiful camp" with a theater and live shows.
"It turned out to be the opposite," he says. "They put us on a train, and we traveled six days and six nights. It was a boxcar that would fit heads of cattle. They had us 80 to a boxcar. You couldn't squat. And there was little tiny windows that you could barely see through."
It was February 8, 1945, when they arrived. The new camp was known as Berga am Elster, a subcamp of Buchenwald, the Nazi concentration camp where tens of thousands of Jews and other political prisoners were killed under Adolf Hitler's regime.
Acevedo says he was one of six medics among the 350 U.S. soldiers at Berga. Political prisoners from other countries were held at Berga separate from the Americans. "We didn't mingle with them at all," he says, adding that the U.S. soldiers worked in the same tunnels as the other political prisoners.
"We were all just thin as a rail."
The U.S. prisoners, Acevedo says, were given 100 grams of bread per week made of redwood sawdust, ground glass and barley. Soup was made from cats and rats, he says. Eating dandelion leaves was considered a "gourmet meal."
If soldiers tried to escape, they would be shot and killed. If they were captured alive, they would be executed with gunshots to their foreheads, Acevedo says. Wooden bullets, he says, were used to shatter the inside of their brains. Medics were always asked to fill the execution holes with wax, he says.
"Prisoners were being murdered and tortured by the Nazis. Many of our men died, and I tried keeping track of who they were and how they died."
The soldiers were forced to sleep naked, two to a bunk, with no blankets. As the days and weeks progressed, his diary catalogs it all. The names, prisoner numbers and causes of death are listed by the dozens in his diary. He felt it was his duty as a medic to keep track of everyone.
"I'm glad I did it," he says.
As a medic, he says, he heard of other more horrific atrocities committed by the Nazis at camps around them. "We heard about experiments that they were doing -- peeling the skins of people, humans, political prisoners, making lampshades."
He and the other soldiers were once taken to what Acevedo believes was the main camp of Buchenwald, about 30 miles (48 kilometers) from Berga. They noticed large pipes coming from one building.
"We thought we were going to be gassed when we were told to take our clothes off," he says. "We were scared. We were stripped."
"Rumors were around that this was where the political prisoners would be suffocated with gas." It turned out to be a shower, the only time during their captivity they were allowed to bathe.
The main Buchenwald camp was officially liberated on April 11, 1945. But the camp and its subcamps were emptied of tens of thousands of prisoners as American troops neared. The U.S. troops held at the Berga compound were no exception.
"Very definite that we are moving away from here and on foot. This isn't very good for our sick men. No drinking water and no latrines," Acevedo wrote in his diary on April 4, 1945.
He says they began a death march of 217 miles (349 kilometers) that would last three weeks. More than 300 U.S. soldiers were alive at the start of the march, he says; about 165 were left by the end, when they were finally liberated.
Lines of political prisoners in front of them during the march caught the full brunt of angry Nazi soldiers.
"We saw massacres of people being slaughtered off the highway. Women, children," he says. "You could see people of all ages, hanging on barbed wire."
One of his diary entries exemplifies an extraordinary patriotism among soldiers, even as they were being marched to their deaths. "Bad news for us. President Roosevelt's death. We all felt bad about it. We held a prayer service for the repose of his soul," Acevedo wrote on April 13, 1945.
It adds, "Burdeski died today."
To this day, Acevedo still remembers that soldier. He wanted to perform a tracheotomy using his diary pen to save Burdeski, a 41-year-old father of six children. A German commander struck Acevedo in the jaw with a rifle when he asked.
"I'll never forget," he says.
On a recent day, about a dozen prisoners of war held during World War II and their liberators gathered at the Jerry L. Pettis Memorial Veterans Medical Center in Loma Linda, California. Many applauded Acevedo for his heroics.
"Those of us in combat have our own heroes, and those are the medics. And that's Antonio. Thank you, Antonio," one of the men said.
The men gathered there nodded their heads. Two stood to shake Acevedo's hand.
"The people that are in this room really are an endangered species," another man said. "When they're gone, they're gone. ... That is why they should be honored and put in history for generations to come, because there are not that many of them left."
Donald George sat next to Acevedo. The two were captured about a half-mile apart during the Battle of the Bulge. "It's hard to explain how it is to be sitting with a bunch of people that you know they've been through the same thing you've been through," George said.
"Some of us want to talk about it, and some of us don't. Some of us want to cry about it once in a while, and some of us won't. But it's all there," he said.
"We still like to come and be together a couple times a month," George added, before Acevedo finished his sentence: "To exchange what you are holding back inside."
Acevedo says the world must never forget the atrocities of World War II and that for killing 6 million Jews, Hitler was the worst terrorist of all time. He doesn't want the world to ever slide backward.
His message on this Veterans Day, he says, is never to hold animosity toward anybody.
"You only live once. Let's keep trucking. If we don't do that, who's going to do it for us? We have to be happy. Why hate?" he says. "The world is full of hate, and yet they don't know what they want."
Thursday, November 6, 2008
Why Do We Forget Things?
The brain can store a vast number of memories, so why can't we find these memories when we need to? A new study provides insights into this question.
By Edward K. Vogel and Trafton Drew
Our brains are crammed with a massive amount of memories that we have formed over a lifetime of experiences. These memories range from the profound (who am I and how did I get here?) to the most trivial (the license plate of the car at a stoplight). Furthermore, our memories also vary considerably in their precision. Parents, for instance, often know the perils of a fuzzy memory when shopping for a birthday gift for their child: remembering that their son wanted the G.I. Joe with Kung Fu Grip rather than the regular G.I. Joe could make an enormous difference in how well the gift is received. Thus, the “fuzziness” of our memory can often be just as important in our daily lives as being able to remember lots and lots of information in the first place.
Different Levels of Detail for Different Types of Memory?
In the past several decades, cognitive psychologists have determined that there are two primary memory systems in the human mind: a short-term, or “working,” memory that temporarily holds information about just a few things that we are currently thinking about; and a long-lasting memory that can hold massive amounts of information gained through a lifetime of thoughts and experiences. These two memory systems are also thought to differ in the level of detail they provide: working memory provides sharp detail about the few things we are presently thinking about, whereas long-term memory provides a much fuzzier picture about lots of different things we have seen or experienced. That is, although we can hold lots of things in long-term memory, the details of the memory aren’t always crystal-clear and are often limited to just the gist of what we saw or what happened.
A recently published study by Timothy F. Brady, a cognitive neuroscientist at the Massachusetts Institute of Technology, and colleagues suggests that these long-term memories may not be nearly as fuzzy as once thought, however. In their work, the researchers asked subjects to try to remember 3,000 pictures of common objects—including items such as backpacks, remote controls and toasters—that were presented one at a time for just a few seconds each. At the end of this viewing phase, the researchers tested subjects’ memory for each object by showing them two objects and asking which one they had seen before. Not surprisingly, subjects were exceptionally good (more than 90 percent correct) even though there were thousands of objects to remember. This high success rate attests to the massive storage ability of long-term memory. What was most surprising, however, was the amazing level of detail that the subjects had for all of these memories. The subjects were just as good at telling the difference between two pictures of the same object even when the objects differed in an extremely subtle manner, such as a pair of toasters with slightly different slices of bread.
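The test described above is a two-alternative forced choice: on each trial, the subject sees the studied object next to a similar foil and picks the one seen before. A toy simulation makes clear how high memory fidelity translates into the reported accuracy; the model and its `fidelity` parameter are illustrative assumptions, not the study's actual analysis.

```python
import random

def run_2afc_memory_test(n_items=3000, fidelity=0.9, seed=42):
    """Toy simulation of a two-alternative forced-choice (2AFC)
    recognition test over n_items studied objects.

    fidelity: assumed probability that a stored memory retains enough
    detail to distinguish the studied object from a similar foil.
    When the trace lacks detail, the subject guesses between the two.
    """
    rng = random.Random(seed)
    correct = 0
    for _ in range(n_items):
        if rng.random() < fidelity:
            correct += 1                    # detailed trace: correct choice
        else:
            correct += rng.random() < 0.5   # no usable detail: coin flip
    return correct / n_items
```

With `fidelity=0.9`, expected accuracy is 0.9 + 0.1 x 0.5 = 0.95, in the same "more than 90 percent correct" range the study reports, which shows why above-chance 2AFC accuracy implies detailed (not merely gist-level) storage.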
If It’s Not Fuzzy, Why Do We Still Forget Things?
This new work provides compelling evidence that the enormous amount of information we hold in long-term memory is not so uncertain after all. It seems that we actually hold representations of things we’ve seen in a fairly detailed and precise form.
Of course, this finding raises the obvious question: if our memories aren’t all that fuzzy, then why do we often forget the details of things we want to remember? One explanation is that, although the brain contains detailed representations of lots of different events and objects, we can’t always find that information when we want it. As this study reveals, if we’re shown an object, we can often be very accurate and precise at being able to say whether we’ve seen it before. If we’re in a toy store and trying to remember what it was that our son wanted for his birthday, however, we need to be able to voluntarily search our memory for the right answer—without being prompted by a visual reminder. It seems that it is this voluntary searching mechanism that’s prone to interference and forgetfulness. At least that’s our story when we come home without the Kung Fu Grip G.I. Joe.
Are you a scientist? Have you recently read a peer-reviewed paper that you want to write about? Then contact Mind Matters editor Jonah Lehrer, the science writer behind the blog The Frontal Cortex and the book Proust Was a Neuroscientist.
Tuesday, October 28, 2008
Are You Evil? Profiling That Which Is Truly Wicked
A cognitive scientist employs malevolent logic to define the dark side of the human psyche
By Larry Greenemeier
TROY, N.Y.—The hallowed halls of academia are not the place you would expect to find someone obsessed with evil (although some students might disagree). But it is indeed evil—or rather trying to get to the roots of evil—that fascinates Selmer Bringsjord, a logician, philosopher and chairman of Rensselaer Polytechnic Institute's Department of Cognitive Science here. He's so intrigued, in fact, that he has developed a sort of checklist for determining whether someone is demonic, and is working with a team of graduate students to create a computerized representation of a purely sinister person.
"I've been working on what is evil and how to formally define it," says Bringsjord, who is also director of the Rensselaer AI & Reasoning Lab (RAIR). "It's creepy, I know it is."
To be truly evil, someone must have sought to do harm by planning to commit some morally wrong action with no prompting from others (whether this person successfully executes his or her plan is beside the point). The evil person must have tried to carry out this plan with the hope of "causing considerable harm to others," Bringsjord says. Finally, "and most importantly," he adds, if this evil person were willing to analyze his or her reasons for wanting to commit this morally wrong action, these reasons would either prove to be incoherent, or they would reveal that the evil person knew he or she was doing something wrong and regarded the harm caused as a good thing.
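As reported, the definition is effectively a conjunction of conditions with one disjunction at the end. A minimal sketch of that checklist as a predicate (the field names and structure are my paraphrase of the article's summary, not Bringsjord's actual formalism):

```python
from dataclasses import dataclass

@dataclass
class Agent:
    """Hypothetical fields paraphrasing the criteria described above."""
    planned_harm_unprompted: bool      # planned a morally wrong act with no prompting
    attempted_hoping_for_harm: bool    # tried to carry it out, hoping to cause harm
    reasons_incoherent: bool           # self-analysis yields incoherent reasons...
    saw_harm_as_good: bool             # ...or shows they regarded the harm as good

def is_truly_evil(a: Agent) -> bool:
    # Conjunction of the first two conditions, then the disjunction:
    # the stated reasons are either incoherent, or reveal the agent
    # knew the act was wrong and welcomed the harm.
    return (a.planned_harm_unprompted
            and a.attempted_hoping_for_harm
            and (a.reasons_incoherent or a.saw_harm_as_good))
```

Note that success is deliberately absent from the predicate, matching the point that whether the plan is executed successfully "is beside the point."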
Bringsjord's research builds on earlier definitions put forth by San Diego State University philosophy professor J. Angelo Corlett as well as the late sociopolitical philosophers and psychologists, Joel Feinberg and Erich Fromm, but most significantly by psychiatrist and author M. Scott Peck in his 1983 book, People of the Lie: The Hope for Healing Human Evil. After reading Peck's tome about clinically evil people, "I thought it would be interesting to come up with formal structures that define evil," Bringsjord says, "and, ultimately, to create a purely evil character the way a creative writer would."
He and his research team began developing their computer representation of evil by posing a series of questions beginning with the basics—name, age, sex, etcetera—and progressing to inquiries about this fictional person's beliefs and motivations.
This exercise resulted in "E," a computer character first created in 2005 to meet the criteria of Bringsjord's working definition of evil. Whereas the original E was simply a program designed to respond to questions in a manner consistent with Bringsjord's definition, the researchers have since given E a physical identity: It's a relatively young, white man with short black hair and dark stubble on his face. Bringsjord calls E's appearance "a meaner version" of the character Mr. Perry in the 1989 movie Dead Poets Society. "He is a great example of evil," Bringsjord says, adding, however, that he is not entirely satisfied with this personification and may make changes.
The researchers have placed E in his own virtual world and written a program depicting a scripted interview between one of the researcher's avatars and E. In this example, E is programmed to respond to questions based on a case study in Peck's book that involves a boy whose parents gave him a gun that his older brother had used to commit suicide.
The researchers programmed E with a degree of artificial intelligence to make "him" believe that he (and not the parents) had given the pistol to the distraught boy, and then asked E a series of questions designed to glean his logic for doing so. The result is a surreal simulation during which Bringsjord's diabolical incarnation attempts to produce a logical argument for its actions: The boy wanted a gun, E had a gun, so E gave the boy the gun.
Bringsjord and his team by the end of the year hope to have completed the fourth generation of E, which will be able to use artificial intelligence and a limited set of straightforward English (no slang, for example) to "speak" with computer users.
Following the path of a true logician, Bringsjord's interest in the portrayal of virtuousness and evil in literature led to his interest in software that helps writers develop ideas and create stories; this, in turn, spurred him to develop his own software for simulating human behavior, both good and odious, says Barry Smith, a distinguished professor of bioinformatics and ontology at the State University of New York at Buffalo who is familiar with Bringsjord's work. "He's known as someone on the fringe of philosophy and computer science."
Bringsjord and Smith both have an interest in finding ways to better understand human behavior, and their work has attracted the attention of the intelligence community, which is seeking ways to successfully analyze the information they gather on potential terrorists. "To solve problems in intelligence analysis, you need more accurate representations of people," Smith says. "Selmer is trying to build really good representations of human beings in all of their subtlety."
Bringsjord acknowledges that the endeavor to create pure evil, even in a software program, does raise ethical questions, such as, how researchers could control an artificially intelligent character like E if "he" was placed in a virtual world such as Second Life, a Web-based program that allows people to create digital representations of themselves and have those avatars interact in a number of different ways.
"I wouldn't release E or anything like it, even in purely virtual environments, without engineered safeguards," Bringsjord says. These safeguards would be a set of ethics written into the software, something akin to author Isaac Asimov's "Three Laws of Robotics" that prevent a robot from harming humans, requires a robot to obey humans, and instructs a robot to protect itself—as long as that does not violate either or both of the first two laws.
Monday, October 27, 2008
Why can't I remember that thing, person, task?
By Cathryn Jakobson Ramin
I'd barely crossed the threshold of middle age. As a journalist, I was invested in staying smart and quick, mistress of my good brain and sardonic tongue. But almost overnight, I found that I was missing critical information -- the names of people and places, the titles of books and movies.
Worse, I had the attention span of a flea. I was having trouble keeping track of my calendar, and my sense of direction had disappeared. The change was so dramatic that sometimes I felt foreign to myself.
Over the course of a few years, as friends and relatives moved into their 40s and 50s, I realized that I was part of a large group of people who were struggling to keep up. I was determined to find a plausible explanation for what was happening to my brain and, by extension, to middle-aged minds in general.
As a first step, I began to study and categorize midlife mental lapses as if they were so many butterflies.
• There was Colliding-Planets Syndrome, which occurs when you fail to grasp, until too late, that you've scheduled a child's orthodontist appointment in the suburbs for the same hour as a business meeting in the city.
• Quick-Who-Is-She Dysfunction surfaces when you are face-to-face with someone whose name stubbornly refuses to come to mind.
• What-Am-I-Doing-Here Paranoia leaves you standing empty-handed in a doorway, trying to figure out what you've come for.
• The Damn-It-They-Were-Just-in-My-Hand Affliction leads to panicky moments spent looking for your favorite new sunglasses, when all the while they're on top of your head.
• And Wrong-Vessel Disorder results in placing the ice cream in the pantry rather than the freezer.
In the past decade, cognitive neuroscientists have learned that much of what we blame on fading memory in midlife can be more accurately attributed to failing attention. Physiological changes in the brain's frontal lobes make it harder to maintain attention in the face of distractions, explains Cheryl Grady, Ph.D., a neuroscientist and assistant director of the Rotman Research Institute in Toronto.
When the frontal lobes are in top form, they're adept at figuring out what's important for the job at hand and what's irrelevant blather; a sort of neural "bouncer" automatically keeps out unnecessary information. In middle age, that bouncer takes a lot of coffee breaks. Instead of focusing on the report that's due, you find yourself wondering what's for dinner. Even background noise -- the phone chatter of the co-worker in the next cubicle -- can impair your ability to concentrate on the task before you.
When the neural bouncer slacks off, the cognitive scratch pad called working memory (which allows us to manipulate and prioritize information, and remember the thread of an argument) is quickly overwhelmed. You know the feeling: You can't absorb one more shred of information, so you erect a sturdy wall, neatly deflecting your husband's announcement that he'll be working late -- an announcement you later swear he never made. "Metaphorically speaking," writes social theorist David Shenk in his book "Data Smog," "we plug up our ears, pinch our noses, cover our eyes ... and step into a bodysuit lined with protective padding."
As you age, you may also notice that information that once popped into your head in milliseconds now shows up in its own sweet time. Denise Park, Ph.D., a cognitive neuroscientist at the University of Illinois at Urbana-Champaign, has found that while processing speed begins to decline in your late 20s, typically you don't feel the effect until your 40s or 50s. And then you feel as though you're wading through mental Jell-O. It's tough to acknowledge that your brain is aging right along with your abs, but in both cases you can put up a fight.
Quick-Who-Is-She Dysfunction
One type of forgetfulness is so prevalent, not to mention demoralizing, that just about everyone over 40 complains about it. I refer to the very public cognitive failure known as blocking, or blanking, when names refuse to come to mind and words dart in and out of consciousness, hiding in dark closets just when you need them.
In his landmark book, "The Seven Sins of Memory," the eminent Harvard memory expert Daniel Schacter, Ph.D., notes that the concept of blocking exists in at least 45 languages. The Cheyenne used an expression, "Navonotootse'a," which translates as "I have lost it on my tongue." In Korean it is "Hyeu kkedu-te mam-dol-da," which in English means "sparkling at the end of my tongue."
In midlife, resolving the "tip of the tongue" dilemma grows increasingly challenging. In the split second between your query -- "What do you call that sleek, dark purple vegetable?" -- and the response -- "eggplant" -- your aging brain delivers quantities of unsolicited information.
Often, notes Schacter, "people can produce virtually everything they know about a person ... nearly everything they know about a word except its label." The brain volunteers words that begin with the same letter, items that are the same color or shape, and, my favorite, words with the same number of syllables -- all of which gum up the works.
Unfortunately, blocking is most common in social situations, when anxiety and distraction combine to kidnap a chunk of your already challenged working memory. Roman aristocrats avoided the problem by always traveling with a nomenclator, an alert slave whose duty it was to supply his master with the names of acquaintances as they were encountered.
In the film "The Devil Wears Prada", magazine editor Miranda Priestly relies on her young assistant, Andy Sachs, to produce the names of party guests. Absent such a companion, Barbara Wallraff, senior editor and columnist for The Atlantic, sought suggestions from her readers on how to describe what transpires when you're introducing two people but have blocked their names. One reader suggested "whomnesia." Another proposed "mumbleduction."
With planning, many instances of Quick-Who-Is-She Dysfunction can be eradicated. Before you go to see the eighth-grade play, where you will sit among people you've known since your kids were in kindergarten, take 15 minutes to look over the school directory. You may avoid the embarrassment suffered by my friend Victor, an economist, when he introduced himself to a woman at Back to School Night who reminded him that the year before, at the same event, they'd spent a pleasant hour chatting about their shared alma mater.
Writing down a few key phrases on an index card before putting yourself in a cognitively challenging situation can ward off word loss. Before heading to your book group, take a moment to review the names of the characters and the plot of the fat novel you finished two weeks ago and barely remember. The other members will thank you. If words go missing anyway, grab for a synonym. Staying on the trail like a bloodhound only exacerbates the problem.
Colliding-Planets Syndrome
To your distress, you discover that you agreed to attend your friend Sarah's 50th birthday party on the same night you're supposed to be at a convention in Las Vegas. Now, how did that happen? If I had to guess, I'd say that you said yes to Sarah's birthday ("Of course, I wouldn't miss it!") when you were nowhere near your calendar. If you want to eliminate Colliding-Planets Syndrome, that calendar must be your new best friend.
Don't get cocky and put off entering a date, even if it's just for coffee the following day. Mark A. McDaniel, Ph.D., a professor of psychology at Washington University in St. Louis and an expert in human learning and memory, found that in the face of even a brief delay, older adults have much more difficulty than younger ones keeping in mind a task to be accomplished in the future.
Refuse to agree to anything, ever, without a calendar in front of you.
And don't write down cryptic things like "Starbucks," because you'll draw a blank on which café you meant, and sit for a long time in the wrong one. Where you write things down matters: Multiple calendars -- home, work, school -- can only lead to trouble.
But what about things you must remember to do in the short term, like returning the nurse-practitioner's call in 15 minutes or putting money in the parking meter in a half hour? These are what Daniel Schacter calls time-based commitments, and putting them on your calendar isn't likely to help unless you habitually check it every five minutes.
On less than an hour's notice, my most responsible friend, Jane, agreed to pick up her neighbor's son at school when she collected her own brood. Knowing she had to make it to soccer and ballet in Los Angeles traffic, she was the first in the carpool line, where she efficiently loaded her kids and took off. The neighbor's child sat waiting on a bench until teachers phoned his mother, who had nothing nice to say when she got in touch with Jane.
In midlife we have trouble remembering to do things at specific times because we're at the mercy of a million environmental distractions. One of Denise Park's studies demonstrated that elderly subjects were more likely to remember to take their medication on schedule than middle-aged subjects, because in midlife the crush of extenuating circumstances often got in the way.
To remember to make that call to the nurse practitioner, Schacter told me, you're going to need an unmistakable cue, one that will be both available and informative. An alarm clock on the desk in front of you can do the job, but under no circumstances should you permit yourself to switch off the clock and finish just one more thing before you pick up the phone. And don't count on your PDA: You've heard those bleeps and blurps so often, you've learned to ignore them.
Damn-It-They-Were-Just-in-My-Hand Affliction
Even the most meticulously managed PDA won't work if you misplace it. And as luck would have it, the items we lose most often -- keys, glasses, wallets, cell phones, planners -- are the ones that are crucial to our survival.
This abject failure to keep track of our belongings may emerge from the brain's talent for forecasting the future. The neocortex, a long-term storage facility, constantly predicts how we'll behave in specific situations, explains Jeff Hawkins in his book "On Intelligence." Instead of reinventing the wheel every time we do something familiar, the brain chooses from a library of existing patterns, based on choices we've made before. A novel event -- a man with a gun -- gets the brain's full attention, but when we're merely lugging groceries into the house, we shift into autopilot. And autopilot is the mode in which we're likely to misplace things.
The problem can be remedied, but only with a preemptive strike. Awareness is essential: When the phone rings as you're entering the house loaded down with groceries, don't drop your keys on the counter, where they will be buried in the day's mail, making you frantically late for your dinner engagement. If you can't immediately hang the keys on the hook where they belong, keep them on your person until you can; one woman I know slips them into her bra, creating a silhouette so inelegant that she can't possibly forget where she put them.
Give up your habit of tucking important items into indiscriminate pockets of your purse or briefcase. Choose one secure zone -- front, zippered -- where you always keep your boarding pass or passport, and never alter it. You'll save yourself the discomfort of searching high and low under the stern surveillance of security personnel.
Wrong-Vessel Disorder
When you lose track of what you intended to say or do, you've had what cognitive psychologists call a prospective lapse. Wrong-Vessel Disorder is a manifestation of this problem: With the best intentions, you absentmindedly place your cell phone in your briefcase, which has many of the same attributes as your purse. Saturday morning, when you reach into your bag and come up empty, you're mystified. Because you're barely conscious when it strikes, it's hard to fend off Wrong-Vessel Disorder. You just have to laugh.
But prospective failures also show up as What-Am-I-Doing-Here Paranoia: Suddenly, as if someone depressed the power button on the remote, you go blank. The minigaps, where you march purposefully to the kitchen, only to stand there and scratch your head, are irritating; the yawning caverns can really shake your confidence. Fran, the marketing director of a local bank, was bright-eyed and ready to give her quarterly presentation before the board -- until somewhere in midsentence, three out of six points eluded her, an experience that made her realize that her days of winging it were over.
Mark McDaniel observes that younger adults make use of robust working memory, relying on a little voice that automatically whispers "get milk, get milk, get milk," all the way home. In midlife that voice is easily interrupted ("Oh, look, it's raining! Now, where did I put that umbrella?") -- at least until you're in the driveway. If you can send the voice back into the game, you'll avoid a lot of extra trips to the store. I've stuck Post-it notes on the steering wheel, which makes driving awkward, but at least I don't return home with the FedEx package still beside me on the front seat.
What-Am-I-Doing-Here Paranoia
When what you forget is not a grocery item but an idea, you've no alternative but to backtrack mentally. It's vaguely amusing to do this with a friend at lunch -- "What on earth were we talking about?" -- but in a professional situation it hurts. With a little digging, you can often extract a key idea that lingers in your working memory and, from there, reconstruct the context of the discussion. In such cases, it is helpful to have a stockpile of useful phrases, conversation fillers that buy you time. "Do you see what I mean?" works well, as does my friend Jeff's old standby, delivered with the greatest sincerity: "Now that's very interesting," even when it isn't.
When a colleague stood me up for breakfast, after exchanging no fewer than nine e-mails about where and when earlier in the week, I wasn't upset -- I was as curious as a botanist who has come upon a valuable specimen. How had it happened? Had planets collided yet again? In a classic demonstration of autopilot, he'd exited the commuter train, jumped on the subway, and gone straight to work, failing to stop at the café across the street from the station where we'd planned to meet. When I phoned his cell, it took him several seconds to realize his mistake, at which point he howled in dismay.
He didn't want to talk about it, but nevertheless I probed. "Wait," I said, "let's dissect it. How did it start?"
As was his habit, he had carefully printed out his schedule the previous night before leaving work, he explained. Then he packed up his briefcase and departed, leaving the piece of paper in the printer. From that moment on, our breakfast appointment never crossed his mind. "Is this normal?" he asked. It was normal, I assured him, in that it happened regularly to people in midlife. But that didn't mean he had to sit back and take it. It was time to make a stand.
Monday, October 20, 2008
Imaging the Unconscious
Functional magnetic resonance imaging could bring psychoanalysis into the 21st century.
By Emily Singer
More than one hundred years ago, Sigmund Freud proposed his pioneering theory that hidden desires in our subconscious drive much of human behavior. While those theories have fallen out of favor in recent decades, scientists are now revisiting some of them -- with new brain imaging tools. The hope is that having a direct window into the brain's hidden processes will shed new light on anxiety disorders, and perhaps help to assess how well behavioral therapies, such as psychoanalysis, target the intricacies of the unconscious mind.
"One of the reasons people departed from Freudian concepts was because they weren't very testable," says Ronald Cohen, professor of psychiatry at Brown University in Providence, RI. "These types of [imaging] experiments would potentially be a more direct way of testing ideas that rose out of traditional psychoanalytic theory."
One of Freud's theories held that after a traumatic event, people might unconsciously associate a normally benign stimulus, say, a friendly golden retriever, with a previously fearful event, such as getting bitten by a Rottweiler. This theory seems to be borne out in post-traumatic stress disorder (PTSD). Harmless sights and sounds -- a bus traveling down a street, for instance -- can trigger a panic attack in someone with PTSD who was once involved in a bus crash. Furthermore, the sufferer may not immediately be able to pinpoint the cause of his or her anxiety attack.
Now scientists are using brain imaging techniques to explore how the unconscious fear signal may be turned up in people with PTSD and other anxiety disorders. To study the brain processes underlying anxiety, researchers use functional magnetic resonance imaging (fMRI) to measure a person's brain activity while he or she looks at threatening signals, such as a picture of a fearful face. These frightening pictures spark activity in the amygdala, a part of the evolutionarily ancient brain involved in processing emotion and fear. To study the unconscious aspects of fear and anxiety, the researchers flash the ominous picture so quickly that subjects don't consciously notice it -- the brain reacts to the image even though the person cannot say whether he or she actually saw it.
Last year, Amit Etkin and collaborators at Columbia University showed that people who score high on anxiety tests have a stronger amygdala response to fearful faces when those images are presented below the level of conscious perception than people who score lower on the tests. Their findings suggest that the way people respond unconsciously to the world around them could also affect their daily anxiety levels.
Now the Columbia researchers want to determine if this lab observation can be used therapeutically. To do so, they plan to study 25 people with generalized anxiety disorder, first to determine whether this exaggerated amygdala response is present in people with the disorder, then to see if cognitive behavioral therapy -- one of the best-established forms of talk therapy -- can reduce the exaggerated unconscious response.
"We can use imaging as a way of evaluating the outcome of therapy," says Eric Kandel, a Nobel Prize-winning neuroscientist at Columbia University who's collaborating on the project. "Maybe we can take people who have a large [anxiety] signal and turn it down as the result of a therapeutic experience."
People with PTSD show a similar exaggerated amygdala response to fearful faces. Jorge Armony, the Canada Research Chair in Affective Neuroscience at McGill University in Montreal, is studying both PTSD patients and people who have recently experienced a traumatic event and may develop PTSD. Armony and his team want to see if they can use the amygdala signal and other factors to predict who is vulnerable to the disorder and who will be resistant to therapy. "After 6 to 12 months, some people recover -- what's the difference between people who recover and people who don't?" Armony asks.
While fMRI measures of unconscious processes are useful for studying populations of people with an illness, they're not yet precise enough to diagnose an individual with a particular disorder, says Armony. "We can say that [statistically] a person with PTSD will have an exaggerated amygdala response, but that doesn't mean that everyone will have it."
Hans Breiter, a neuroscientist at Harvard Medical School, one of the first researchers to study amygdala activity with fMRI in the mid-1990s, agrees that a more extensive evaluation of the neurological changes in psychiatric disorders is necessary before the technique can have clinical applications. "This approach is promising and is the right first step, but scientists will need to study larger numbers of people with fMRI to get a better sense of the variability in brain functions that underlie anxiety and depression," he says. "They may have very different [brain activity patterns] and may have very different therapeutic needs." He predicts those larger-scale studies will happen within the next five years.
Breiter and other scientists are optimistic that fMRI can one day be used to evaluate the benefits of therapy, but they say it's unclear what brain signals, conscious or unconscious ones, will be the most effective measure.
"The question still remains, how important are these subconscious phenomena?" says Cohen at Brown. "From a cognitive behavioral perspective, the conscious aspects of depression and anxiety are more important."
Both Etkin at Columbia and Armony at McGill are also using fMRI to study conscious processes, such as attention, in people with anxiety disorders; and they plan to examine how these different factors may be important in different anxiety-related diseases, such as depression and eating disorders.
"There's information processing going on in the brain that's completely outside of awareness, which previously we could only investigate with psychoanalysis," says Tom Insel, director of the National Institutes of Mental Health in Bethesda, MD. "Now you can track [those processes] with neuro-imaging -- a tool that may be much more compelling."
By Emily Singer
More than one hundred years ago, Sigmund Freud proposed his pioneering theory that hidden desires in our subconscious drive much of human behavior. While those theories have fallen out of favor in recent decades, scientists are now revisiting some of them -- with new brain imaging tools. The hope is that having a direct window into the brain's hidden processes will shed new light on anxiety disorders, and perhaps help to assess how well behavioral therapies, such as psychoanalysis, target the intricacies of the unconscious mind.
"One of the reasons people departed from Freudian concepts was because they weren't very testable," says Ronald Cohen, professor of psychiatry at Brown University in Providence, RI. "These types of [imaging] experiments would potentially be a more direct way of testing ideas that rose out of traditional psychoanalytic theory."
One of Freud's theories held that after a traumatic event, people might unconsciously associate a normally benign stimulus, say, a friendly golden retriever, with a previously fearful event, such as getting bitten by a Rottweiler. This theory seems to hold true in the case of post-traumatic stress disorder (PTSD). Harmless sights and sounds, such as a bus traveling down a street, can trigger a panic attack in someone with PTSD who was once involved in a bus crash. Furthermore, the sufferer may not immediately be able to pinpoint the cause of his or her anxiety attack.
Now scientists are using brain imaging techniques to explore how this unconscious fear signal may be turned up in people with PTSD and other anxiety disorders. To study the brain processes underlying anxiety, researchers use functional magnetic resonance imaging (fMRI) to measure a person's brain activity while he or she looks at threatening stimuli, such as a picture of a fearful face. These frightening pictures spark activity in the part of the brain known as the amygdala, an evolutionarily ancient structure involved in processing emotion and fear. To study the unconscious aspects of fear and anxiety, the researchers flash the ominous picture so quickly that subjects don't consciously notice it -- the brain reacts to the image even though subjects cannot say whether they actually saw it.
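The masking procedure described above depends on precise display timing. As an illustrative sketch (the study's actual parameters are not given here), a stimulus on a standard 60 Hz monitor can only appear for whole refresh frames, so a "subliminal" face is typically shown for one or two frames (roughly 17-33 milliseconds) and immediately replaced by a mask:

```python
# Illustrative sketch only, not the study's protocol: rounding requested
# backward-masking durations to whole refresh frames on a 60 Hz display.

REFRESH_HZ = 60
FRAME_MS = 1000 / REFRESH_HZ  # ~16.7 ms per frame

def masked_trial(face_ms: float, mask_ms: float) -> dict:
    """Round requested durations to whole frames and report actual timing."""
    face_frames = max(1, round(face_ms / FRAME_MS))
    mask_frames = max(1, round(mask_ms / FRAME_MS))
    return {
        "face_frames": face_frames,
        "face_actual_ms": face_frames * FRAME_MS,
        "mask_frames": mask_frames,
        "mask_actual_ms": mask_frames * FRAME_MS,
    }

# A nominally 17 ms face followed by a ~183 ms mask
trial = masked_trial(face_ms=17, mask_ms=183)
```

On this hypothetical display, the face lands on a single frame (about 16.7 ms), below the threshold at which most viewers can report seeing it.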
Last year, Amit Etkin and collaborators at Columbia University showed that people who score high on anxiety tests have a stronger amygdala response to fearful faces when those images are presented below the level of conscious perception than people who score lower on the tests. Their findings suggest that the way people respond unconsciously to the world around them could also affect their daily anxiety levels.
Now the Columbia researchers want to determine if this lab observation can be used therapeutically. To do so, they plan to study 25 people with generalized anxiety disorder, first to determine whether this exaggerated amygdala response is present in people with the disorder, then to see if cognitive behavioral therapy -- one of the best-established forms of talk therapy -- can reduce the exaggerated unconscious response.
"We can use imaging as a way of evaluating the outcome of therapy," says Eric Kandel, a Nobel-prize winning neuroscientist at Columbia University who's collaborating on the project. "Maybe we can take people who have a large [anxiety] signal and turn it down as the result of a therapeutic experience," he says.
People with PTSD show a similar exaggerated amygdala response to fearful faces. Jorge Armony, the Canada Research Chair in Affective Neuroscience at McGill University in Montreal, is studying both PTSD patients and people who have recently experienced a traumatic event and may develop PTSD. Armony and his team want to see if they can use the amygdala signal and other factors to predict who is vulnerable to the disorder and who will be resistant to therapy. "After 6 to 12 months, some people recover -- what's the difference between people who recover and people who don't?" Armony asks.
While fMRI measures of unconscious processes are useful for studying populations of people with an illness, they're not yet precise enough to diagnose an individual with a particular disorder, says Armony. "We can say that [statistically] a person with PTSD will have an exaggerated amygdala response, but that doesn't mean that everyone will have it."
Hans Breiter, a neuroscientist at Harvard Medical School, one of the first researchers to study amygdala activity with fMRI in the mid-1990s, agrees that a more extensive evaluation of the neurological changes in psychiatric disorders is necessary before the technique can have clinical applications. "This approach is promising and is the right first step, but scientists will need to study larger numbers of people with fMRI to get a better sense of the variability in brain functions that underlie anxiety and depression," he says. "They may have very different [brain activity patterns] and may have very different therapeutic needs." He predicts those larger-scale studies will happen within the next five years.
Breiter and other scientists are optimistic that fMRI can one day be used to evaluate the benefits of therapy, but they say it's unclear what brain signals, conscious or unconscious ones, will be the most effective measure.
"The question still remains, how important are these subconscious phenomena?" says Cohen at Brown. "From a cognitive behavioral perspective, the conscious aspects of depression and anxiety are more important."
Both Etkin at Columbia and Armony at McGill are also using fMRI to study conscious processes, such as attention, in people with anxiety disorders, and they plan to examine how these different factors may play out in different anxiety-related conditions, such as depression and eating disorders.
"There's information processing going on in the brain that's completely outside of awareness, which previously we could only investigate with psychoanalysis," says Tom Insel, director of the National Institutes of Mental Health in Bethesda, MD. "Now you can track [those processes] with neuro-imaging -- a tool that may be much more compelling."
Man 'roused from coma' by a magnetic field
NewScientist.com
A volunteer models the TMS device, which is worn over the front of the head to stimulate the underlying brain tissue
JOSH VILLA was 26 and driving home after a drink with a friend on 28 August 2005 when his car mounted the kerb and flipped over. Villa was thrown through the windscreen, suffered massive head injuries and fell into a coma.
Almost a year later, there was little sign of improvement. "He would open his eyes, but he was not responsive to any external stimuli in his environment," says Theresa Pape of the US Department of Veterans Affairs in Chicago, who helped treat him.
Usually there is little more that can be done for people in this condition. Villa was to be sent home to Rockford, Illinois, where his mother, Laurie McAndrews, had volunteered to care for him.
But Pape had a different suggestion. She enrolled him in a six-week study in which an electromagnetic coil was held over the front of his head to stimulate the underlying brain tissue. Such transcranial magnetic stimulation (TMS) has been investigated as a way of treating migraine, stroke, Parkinson's disease and depression, with some promising results, but this is the first time it has been used as a potential therapy for someone in a coma-like state.
The rapidly changing magnetic fields that the coil creates can be used either to excite or inhibit brain cells - making it easier or harder for them to communicate with one another. In Villa's case, the coil was used to excite brain cells in the right dorsolateral prefrontal cortex. This area has strong connections to the brainstem, which sends out pulses to the rest of the brain that tell it to pay attention. "It's like an 'OK, I'm awake' pulse," says Pape.
At first, there was little change in Villa's condition, but after around 15 sessions something happened. "You started talking to him and he would turn his head and look at you," says McAndrews. "That was huge."
Villa started obeying one-step commands, such as following the movement of a thumb and speaking single words. "They were very slurred but they were there," says Pape, who presented her findings this month at an international meeting on brain stimulation at the University of Göttingen, Germany. "He'd say like 'erm', 'help', 'help me'."
After the 30 planned sessions the TMS was stopped. Without it, Villa became very tired and his condition declined a little, but he was still much better than before. Six weeks later he was given another 10 sessions, but there were no further improvements and he was sent home, where he remains today.
Villa is by no means cured. But he is easier to care for and can interact with visitors such as his girlfriend, who has stuck by him following the accident. "When you talk to him he will move his mouth to show he is listening," McAndrews says. "If I ask him, 'Do you love me?' he'll do two slow eye blinks, yes. Some people would say it's not much, but he's improving and that's the main thing."
John Whyte of the Moss Rehabilitation Research Institute in Philadelphia, Pennsylvania, cautions that as intriguing as Villa's case is, it alone does not show that TMS is a useful treatment. "Even after eight months, it is not uncommon for patients to transition from the vegetative to the minimally conscious state without any particular intervention," he points out. He says TMS merits further investigation, along with other experimental treatments such as drugs which have temporarily roused three men from a coma, and deep brain stimulation, an invasive technique that roused a man out of a minimally conscious state.
"This is the first and very interesting use of repetitive TMS in coma," says Steven Laureys of the Coma Research Group at the University of Liège in Belgium. Our understanding of disorders of consciousness is so limited that even a single study can provide new insights, he says.
Pape acknowledges that further studies are needed to demonstrate that TMS really is beneficial, though she is convinced that it helped Villa. He had only been given a 20 to 40 per cent chance of long-term recovery, and until he was given TMS his functioning had not improved since about four months after the accident. What's more, after the 15th TMS session, he improved incrementally with each session - further evidence that TMS was the cause.
Pape hopes to begin treating a second patient in a coma-like state later this year. This time she plans to adjust the number of pulses of TMS in each train, and to alter the gap between pulses to see if there is an optimum interval.
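The two parameters Pape plans to vary - the number of pulses per train and the gap between pulses - can be sketched as a simple schedule generator. The figures below are illustrative only, not the study's actual protocol:

```python
# Hypothetical sketch of a repetitive-TMS pulse train: each train delivers
# n_pulses pulses separated by a fixed inter-pulse interval.

def pulse_train(n_pulses: int, inter_pulse_ms: float, start_ms: float = 0.0):
    """Return the onset time (in ms) of each pulse in one train."""
    return [start_ms + i * inter_pulse_ms for i in range(n_pulses)]

# e.g. a 5-pulse train with 100 ms between pulses
train = pulse_train(5, 100.0)  # [0.0, 100.0, 200.0, 300.0, 400.0]
```

Searching for an optimum interval then amounts to comparing patient responses across trains generated with different `inter_pulse_ms` values.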
McAndrews is also in no doubt that her son's quality of life has improved as a result of TMS. "Before I felt like he was not responsive, that he was depressed almost. Now you move him around and he complains - he can show emotions on that level."
See "Editorial: improving the lot of coma patients"
A gentle current helps when words are hard to find
People with Alzheimer's disease got better at a word-recognition task after their brains were stimulated with an electric current.
Like transcranial magnetic stimulation or TMS (see main story), transcranial direct current stimulation (tDCS) aims to activate or inhibit areas of the brain by making it easier or harder for the brain cells to fire. While TMS involves holding a current-carrying coil over the subject's head, tDCS, which has previously shown promise in treating pain and depression (New Scientist, 5 April 2006, p 34), uses electrodes to send a current of 1 to 2 milliamps through the skull.
In Alzheimer's, the temporoparietal areas of the brain, which are involved in memory and recognition, are known to be less active than in healthy people. So Alberto Priori at the University of Milan, Italy, and his colleagues used tDCS to stimulate these areas. They asked 10 people with mild to moderate Alzheimer's to perform a word-recognition and a visual-attention task, before and after receiving tDCS or a sham treatment.
With tDCS, word recognition improved by 17 per cent, but there was no improvement in visual attention. Word recognition worsened when tDCS was used to inhibit neurons, and there was no change when the sham treatment was applied (Neurology, vol 71, p 493).
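For concreteness, a "17 per cent improvement" of this kind is a relative change in task performance. The before-and-after accuracy scores below are invented for illustration, since the article reports only the percentage:

```python
# Illustrative arithmetic only: how a relative improvement in word
# recognition might be computed. The accuracy values are hypothetical.

def percent_change(before: float, after: float) -> float:
    """Relative change from 'before' to 'after', as a percentage."""
    return (after - before) / before * 100

# hypothetical proportions correct on the word-recognition task
real_tdcs = percent_change(before=0.60, after=0.702)  # ~ +17 per cent
sham = percent_change(before=0.60, after=0.60)        # no change
```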
"Our findings are consistent with evidence that tDCS improves cognitive functions in healthy subjects and in patients with neurological disorders," Priori says. He is now running a larger study to confirm the results, and to find out how long the improvement lasts.
From issue 2678 of New Scientist magazine, 15 October 2008, pages 8-9