Tuesday, October 28, 2008

Are You Evil? Profiling That Which Is Truly Wicked

A cognitive scientist employs malevolent logic to define the dark side of the human psyche

By Larry Greenemeier

TROY, N.Y.—The hallowed halls of academia are not the place you would expect to find someone obsessed with evil (although some students might disagree). But it is indeed evil—or rather trying to get to the roots of evil—that fascinates Selmer Bringsjord, a logician, philosopher and chairman of Rensselaer Polytechnic Institute's Department of Cognitive Science here. He's so intrigued, in fact, that he has developed a sort of checklist for determining whether someone is demonic, and is working with a team of graduate students to create a computerized representation of a purely sinister person.

"I've been working on what is evil and how to formally define it," says Bringsjord, who is also director of the Rensselaer AI & Reasoning Lab (RAIR). "It's creepy, I know it is."

To be truly evil, someone must have sought to do harm by planning to commit some morally wrong action with no prompting from others (whether this person successfully executes his or her plan is beside the point). The evil person must have tried to carry out this plan with the hope of "causing considerable harm to others," Bringsjord says. Finally, "and most importantly," he adds, if this evil person were willing to analyze his or her reasons for wanting to commit this morally wrong action, these reasons would either prove to be incoherent, or they would reveal that the evil person knew he or she was doing something wrong and regarded the harm caused as a good thing.
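Purely as an illustration of how such a checklist might be formalized (the field names and structure below are my own invention, not the RAIR lab's actual logic), the three criteria can be read as a conjunction of conditions:

```python
# Hypothetical sketch of the three-part working definition of evil
# described above -- NOT Bringsjord's actual formalism.
from dataclasses import dataclass

@dataclass
class Agent:
    planned_wrong_act: bool          # planned a morally wrong action
    unprompted: bool                 # with no prompting from others
    hoped_for_considerable_harm: bool
    reasons_incoherent: bool         # on analysis, the reasons fall apart...
    regarded_harm_as_good: bool      # ...or the harm was seen as a good thing

def is_evil(a: Agent) -> bool:
    """True only if all three criteria of the working definition hold.

    Note that actually succeeding in the plan is not a condition --
    per the definition, execution is beside the point.
    """
    sought_harm = a.planned_wrong_act and a.unprompted
    tried_with_hope = a.hoped_for_considerable_harm
    failed_scrutiny = a.reasons_incoherent or a.regarded_harm_as_good
    return sought_harm and tried_with_hope and failed_scrutiny
```

An agent missing any one criterion, such as one whose reasons survive scrutiny, would not qualify under this reading.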

Bringsjord's research builds on earlier definitions put forth by San Diego State University philosophy professor J. Angelo Corlett as well as the late sociopolitical philosophers and psychologists Joel Feinberg and Erich Fromm, but most significantly by psychiatrist and author M. Scott Peck in his 1983 book, "People of the Lie: The Hope for Healing Human Evil." After reading Peck's tome about clinically evil people, "I thought it would be interesting to come up with formal structures that define evil," Bringsjord says, "and, ultimately, to create a purely evil character the way a creative writer would."

He and his research team began developing their computer representation of evil by posing a series of questions beginning with the basics—name, age, sex, etcetera—and progressing to inquiries about this fictional person's beliefs and motivations.

This exercise resulted in "E," a computer character first created in 2005 to meet the criteria of Bringsjord's working definition of evil. Whereas the original E was simply a program designed to respond to questions in a manner consistent with Bringsjord's definition, the researchers have since given E a physical identity: It's a relatively young, white man with short black hair and dark stubble on his face. Bringsjord calls E's appearance "a meaner version" of the character Mr. Perry in the 1989 movie Dead Poets Society. "He is a great example of evil," Bringsjord says, adding, however, that he is not entirely satisfied with this personification and may make changes.

The researchers have placed E in his own virtual world and written a program depicting a scripted interview between one of the researchers' avatars and E. In this example, E is programmed to respond to questions based on a case study in Peck's book that involves a boy whose parents gave him a gun that his older brother had used to commit suicide.

The researchers programmed E with a degree of artificial intelligence to make "him" believe that he (and not the parents) had given the pistol to the distraught boy, and then asked E a series of questions designed to glean his logic for doing so. The result is a surreal simulation during which Bringsjord's diabolical incarnation attempts to produce a logical argument for its actions: The boy wanted a gun, E had a gun, so E gave the boy the gun.

By the end of the year, Bringsjord and his team hope to have completed the fourth generation of E, which will be able to use artificial intelligence and a limited set of straightforward English (no slang, for example) to "speak" with computer users.

Following the path of a true logician, Bringsjord's interest in the portrayal of virtuousness and evil in literature led to his interest in software that helps writers develop ideas and create stories; this, in turn, spurred him to develop his own software for simulating human behavior, both good and odious, says Barry Smith, a distinguished professor of bioinformatics and ontology at the State University of New York at Buffalo who is familiar with Bringsjord's work. "He's known as someone on the fringe of philosophy and computer science."

Bringsjord and Smith both have an interest in finding ways to better understand human behavior, and their work has attracted the attention of the intelligence community, which is seeking ways to analyze the information it gathers on potential terrorists. "To solve problems in intelligence analysis, you need more accurate representations of people," Smith says. "Selmer is trying to build really good representations of human beings in all of their subtlety."

Bringsjord acknowledges that the endeavor to create pure evil, even in a software program, raises ethical questions, such as how researchers could control an artificially intelligent character like E if "he" were placed in a virtual world such as Second Life, a Web-based program that allows people to create digital representations of themselves and have those avatars interact in a number of ways.

"I wouldn't release E or anything like it, even in purely virtual environments, without engineered safeguards," Bringsjord says. These safeguards would be a set of ethics written into the software, something akin to author Isaac Asimov's "Three Laws of Robotics," which prevent a robot from harming humans, require it to obey humans, and instruct it to protect itself—as long as doing so does not violate either of the first two laws.

Monday, October 27, 2008

Why can't I remember that thing, person, task?

By Cathryn Jakobson Ramin

I'd barely crossed the threshold of middle age. As a journalist, I was invested in staying smart and quick, mistress of my good brain and sardonic tongue. But almost overnight, I found that I was missing critical information -- the names of people and places, the titles of books and movies.

Worse, I had the attention span of a flea. I was having trouble keeping track of my calendar, and my sense of direction had disappeared. The change was so dramatic that sometimes I felt foreign to myself.

Over the course of a few years, as friends and relatives moved into their 40s and 50s, I realized that I was part of a large group of people who were struggling to keep up. I was determined to find a plausible explanation for what was happening to my brain and, by extension, to middle-aged minds in general.

As a first step, I began to study and categorize midlife mental lapses as if they were so many butterflies.

• There was Colliding-Planets Syndrome, which occurs when you fail to grasp, until too late, that you've scheduled a child's orthodontist appointment in the suburbs for the same hour as a business meeting in the city.

• Quick-Who-Is-She Dysfunction surfaces when you are face-to-face with someone whose name stubbornly refuses to come to mind.

• What-Am-I-Doing-Here Paranoia leaves you standing empty-handed in a doorway, trying to figure out what you've come for.

• The Damn-It-They-Were-Just-in-My-Hand Affliction leads to panicky moments spent looking for your favorite new sunglasses, when all the while they're on top of your head.

• And Wrong-Vessel Disorder results in placing the ice cream in the pantry rather than the freezer.

In the past decade, cognitive neuroscientists have learned that much of what we blame on fading memory in midlife can be more accurately attributed to failing attention. Physiological changes in the brain's frontal lobes make it harder to maintain attention in the face of distractions, explains Cheryl Grady, Ph.D., a neuroscientist and assistant director of the Rotman Research Institute in Toronto.

When the frontal lobes are in top form, they're adept at figuring out what's important for the job at hand and what's irrelevant blather; a sort of neural "bouncer" automatically keeps out unnecessary information. In middle age, that bouncer takes a lot of coffee breaks. Instead of focusing on the report that's due, you find yourself wondering what's for dinner. Even background noise -- the phone chatter of the co-worker in the next cubicle -- can impair your ability to concentrate on the task before you.

When the neural bouncer slacks off, the cognitive scratch pad called working memory (which allows us to manipulate and prioritize information, and remember the thread of an argument) is quickly overwhelmed. You know the feeling: You can't absorb one more shred of information, so you erect a sturdy wall, neatly deflecting your husband's announcement that he'll be working late -- an announcement you later swear he never made. "Metaphorically speaking," writes social theorist David Shenk in his book "Data Smog," "we plug up our ears, pinch our noses, cover our eyes ... and step into a bodysuit lined with protective padding."

As you age, you may also notice that information that once popped into your head in milliseconds now shows up in its own sweet time. Denise Park, Ph.D., a cognitive neuroscientist at the University of Illinois at Urbana-Champaign, has found that while processing speed begins to decline in your late 20s, typically you don't feel the effect until your 40s or 50s. And then you feel as though you're wading through mental Jell-O. It's tough to acknowledge that your brain is aging right along with your abs, but in both cases you can put up a fight.

Quick-Who-Is-She Dysfunction

One type of forgetfulness is so prevalent, not to mention demoralizing, that just about everyone over 40 complains about it. I refer to the very public cognitive failure known as blocking, or blanking, when names refuse to come to mind and words dart in and out of consciousness, hiding in dark closets just when you need them.

In his landmark book, "The Seven Sins of Memory," the eminent Harvard memory expert Daniel Schacter, Ph.D., notes that the concept of blocking exists in at least 45 languages. The Cheyenne used an expression, "Navonotootse'a," which translates as "I have lost it on my tongue." In Korean it is "Hyeu kkedu-te mam-dol-da," which in English means "sparkling at the end of my tongue."

In midlife, resolving the "tip of the tongue" dilemma grows increasingly challenging. In the split second between your query -- "What do you call that sleek, dark purple vegetable?" -- and the response -- "eggplant" -- your aging brain delivers quantities of unsolicited information.

Often, notes Schacter, "people can produce virtually everything they know about a person...nearly everything they know about a word except its label." The brain volunteers words that begin with the same letter, items that are the same color or shape, and, my favorite, words with the same number of syllables -- all of which gum up the works.

Unfortunately, blocking is most common in social situations, when anxiety and distraction combine to kidnap a chunk of your already challenged working memory. Roman aristocrats avoided the problem by always traveling with a nomenclator, an alert slave whose duty it was to supply his master with the names of acquaintances as they were encountered.

In the film "The Devil Wears Prada", magazine editor Miranda Priestly relies on her young assistant, Andy Sachs, to produce the names of party guests. Absent such a companion, Barbara Wallraff, senior editor and columnist for The Atlantic, sought suggestions from her readers on how to describe what transpires when you're introducing two people but have blocked their names. One reader suggested "whomnesia." Another proposed "mumbleduction."

With planning, many instances of Quick-Who-Is-She Dysfunction can be eradicated. Before you go to see the eighth-grade play, where you will sit among people you've known since your kids were in kindergarten, take 15 minutes to look over the school directory. You may avoid the embarrassment suffered by my friend Victor, an economist, when he introduced himself to a woman at Back to School Night who reminded him that the year before, at the same event, they'd spent a pleasant hour chatting about their shared alma mater.

Writing down a few key phrases on an index card before putting yourself in a cognitively challenging situation can ward off word loss. Before heading to your book group, take a moment to review the names of the characters and the plot of the fat novel you finished two weeks ago and barely remember. The other members will thank you. If words go missing anyway, grab for a synonym. Staying on the trail like a bloodhound only exacerbates the problem.

Colliding-Planets Syndrome

To your distress, you discover that you agreed to attend your friend Sarah's 50th birthday party on the same night you're supposed to be at a convention in Las Vegas. Now, how did that happen? If I had to guess, I'd say that you said yes to Sarah's birthday ("Of course, I wouldn't miss it!") when you were nowhere near your calendar. If you want to eliminate Colliding-Planets Syndrome, that calendar must be your new best friend.

Don't get cocky and put off entering a date, even if it's just for coffee the following day. Mark A. McDaniel, Ph.D., a professor of psychology at Washington University in St. Louis and an expert in human learning and memory, found that in the face of even a brief delay, older adults have much more difficulty than younger ones keeping in mind a task to be accomplished in the future.

Refuse to agree to anything, ever, without a calendar in front of you. And don't write down cryptic things like "Starbucks," because you'll draw a blank on which café you meant, and sit for a long time in the wrong one. Where you write things down matters: Multiple calendars -- home, work, school -- can only lead to trouble.

But what about things you must remember to do in the short term, like returning the nurse-practitioner's call in 15 minutes or putting money in the parking meter in a half hour? These are what Daniel Schacter calls time-based commitments, and putting them on your calendar isn't likely to help unless you habitually check it every five minutes.

On less than an hour's notice, my most responsible friend, Jane, agreed to pick up her neighbor's son at school when she collected her own brood. Knowing she had to make it to soccer and ballet in Los Angeles traffic, she was the first in the carpool line, where she efficiently loaded her kids and took off. The neighbor's child sat waiting on a bench until teachers phoned his mother, who had nothing nice to say when she got in touch with Jane.

In midlife we have trouble remembering to do things at specific times because we're at the mercy of a million environmental distractions. One of Denise Park's studies demonstrated that elderly subjects were more likely than middle-aged subjects to remember to take their medication on schedule, because in midlife the crush of competing demands often got in the way.

To remember to make that call to the nurse practitioner, Schacter told me, you're going to need an unmistakable cue, one that will be both available and informative. An alarm clock on the desk in front of you can do the job, but under no circumstances should you permit yourself to switch off the clock and finish just one more thing before you pick up the phone. And don't count on your PDA: You've heard those bleeps and blurps so often, you've learned to ignore them.

Damn-It-They-Were-Just-in-My-Hand Affliction

Even the most meticulously managed PDA won't work if you misplace it. And as luck would have it, the items we lose most often -- keys, glasses, wallets, cell phones, planners -- are the ones that are crucial to our survival.

This abject failure to keep track of our belongings may emerge from the brain's talent for forecasting the future. The neocortex, a long-term storage facility, constantly predicts how we'll behave in specific situations, explains Jeff Hawkins in his book "On Intelligence." Instead of reinventing the wheel every time we do something familiar, the brain chooses from a library of existing patterns, based on choices we've made before. A novel event -- a man with a gun -- gets the brain's full attention, but when we're merely lugging groceries into the house, we shift into autopilot. And autopilot is the mode in which we're likely to misplace things.

The problem can be remedied, but only with a preemptive strike. Awareness is essential: When the phone rings as you're entering the house loaded down with groceries, don't drop your keys on the counter, where they will be buried in the day's mail, making you frantically late for your dinner engagement. If you can't immediately hang the keys on the hook where they belong, keep them on your person until you can; one woman I know slips them into her bra, creating a silhouette so inelegant that she can't possibly forget where she put them.

Give up your habit of tucking important items into indiscriminate pockets of your purse or briefcase. Choose one secure zone -- front, zippered -- where you always keep your boarding pass or passport, and never alter it. You'll save yourself the discomfort of searching high and low under the stern surveillance of security personnel.

Wrong-Vessel Disorder

When you lose track of what you intended to say or do, you've had what cognitive psychologists call a prospective lapse. Wrong-Vessel Disorder is a manifestation of this problem: With the best intentions, you absentmindedly place your cell phone in your briefcase, which has many of the same attributes as your purse. Saturday morning, when you reach into your bag and come up empty, you're mystified. Because you're barely conscious when it strikes, it's hard to fend off Wrong-Vessel Disorder. You just have to laugh.

But prospective failures also show up as What-Am-I-Doing-Here Paranoia: Suddenly, as if someone depressed the power button on the remote, you go blank. The minigaps, where you march purposefully to the kitchen, only to stand there and scratch your head, are irritating; the yawning caverns can really shake your confidence. Fran, the marketing director of a local bank, was bright-eyed and ready to give her quarterly presentation before the board -- until somewhere in midsentence, three out of six points eluded her, an experience that made her realize that her days of winging it were over.

Mark McDaniel observes that younger adults make use of robust working memory, relying on a little voice that automatically whispers "get milk, get milk, get milk," all the way home. In midlife that voice is easily interrupted ("Oh, look, it's raining! Now, where did I put that umbrella?") -- at least until you're in the driveway. If you can send the voice back into the game, you'll avoid a lot of extra trips to the store. I've stuck Post-it notes on the steering wheel, which makes driving awkward, but at least I don't return home with the FedEx package still beside me on the front seat.

What-Am-I-Doing-Here Paranoia

When what you forget is not a grocery item but an idea, you've no alternative but to backtrack mentally. It's vaguely amusing to do this with a friend at lunch -- "What on earth were we talking about?" -- but in a professional situation it hurts. With a little digging, you can often extract a key idea that lingers in your working memory and, from there, reconstruct the context of the discussion. In such cases, it is helpful to have a stockpile of useful phrases, conversation fillers that buy you time. "Do you see what I mean?" works well, as does my friend Jeff's old standby, delivered with the greatest sincerity: "Now that's very interesting," even when it isn't.

When a colleague stood me up for breakfast, after exchanging no fewer than nine e-mails about where and when earlier in the week, I wasn't upset -- I was as curious as a botanist who has come upon a valuable specimen. How had it happened? Had planets collided yet again? In a classic demonstration of autopilot, he'd exited the commuter train, jumped on the subway, and gone straight to work, failing to stop at the café across the street from the station where we'd planned to meet. When I phoned his cell, it took him several seconds to realize his mistake, at which point he howled in dismay.

He didn't want to talk about it, but nevertheless I probed. "Wait," I said, "let's dissect it. How did it start?"

As was his habit, he had carefully printed out his schedule the previous night before leaving work, he explained. Then he packed up his briefcase and departed, leaving the piece of paper in the printer. From that moment on, our breakfast appointment never crossed his mind. "Is this normal?" he asked. It was normal, I assured him, in that it happened regularly to people in midlife. But that didn't mean he had to sit back and take it. It was time to make a stand.

Monday, October 20, 2008

Imaging the Unconscious

Functional magnetic resonance imaging could bring psychoanalysis into the 21st century.
By Emily Singer

More than one hundred years ago, Sigmund Freud proposed his pioneering theory that hidden desires in our subconscious drive much of human behavior. While those theories have fallen out of favor in recent decades, scientists are now revisiting some of them -- with new brain imaging tools. The hope is that having a direct window into the brain's hidden processes will shed new light on anxiety disorders, and perhaps help to assess how well behavioral therapies, such as psychoanalysis, target the intricacies of the unconscious mind.

"One of the reasons people departed from Freudian concepts was because they weren't very testable," says Ronald Cohen, professor of psychiatry at Brown University in Providence, RI. "These types of [imaging] experiments would potentially be a more direct way of testing ideas that rose out of traditional psychoanalytic theory."

One of Freud's theories held that after a traumatic event, people might unconsciously associate a normally benign stimulus, say, a friendly golden retriever, with a previously fearful event, such as getting bitten by a Rottweiler. This theory seems to hold true in the case of post-traumatic stress disorder (PTSD). Harmless sights and sounds, such as a bus traveling down a street, can trigger a panic attack in someone with PTSD who was once involved in a bus crash. Furthermore, the sufferer may not immediately be able to pinpoint the cause of his or her anxiety attack.

Now scientists are using brain imaging techniques to explore how the unconscious fear signal may be turned up in people with PTSD and other anxiety disorders. To study the brain processes underlying anxiety, researchers use functional magnetic resonance imaging (fMRI) to measure a person's brain activity while he or she looks at threatening signals, such as a picture of a fearful face. These frightening pictures will spark activity in the part of the brain known as the amygdala, which is part of the evolutionarily ancient brain involved in processing emotion and fear. To study the unconscious aspects of fear and anxiety, the researchers flash the ominous picture so quickly that subjects don't consciously notice it -- the brain reacts to the image, even though the person cannot determine whether or not they actually saw it.

Last year, Amit Etkin and collaborators at Columbia University showed that people who score high on anxiety tests have a stronger amygdala response to fearful faces when those images are presented below the level of conscious perception than people who score lower on the tests. Their findings suggest that the way people respond unconsciously to the world around them could also affect their daily anxiety levels.

Now the Columbia researchers want to determine if this lab observation can be used therapeutically. To do so, they plan to study 25 people with generalized anxiety disorder, first to determine whether this exaggerated amygdala response is present in people with the disorder, then to see if cognitive behavioral therapy -- one of the best-established forms of talk therapy -- can reduce the exaggerated unconscious response.

"We can use imaging as a way of evaluating the outcome of therapy," says Eric Kandel, a Nobel Prize-winning neuroscientist at Columbia University who's collaborating on the project. "Maybe we can take people who have a large [anxiety] signal and turn it down as the result of a therapeutic experience," he says.

People with PTSD show a similar exaggerated amygdala response to fearful faces. Jorge Armony, the Canada Research Chair in Affective Neuroscience at McGill University in Montreal, is studying both PTSD patients and people who have recently experienced a traumatic event and may develop PTSD. Armony and his team want to see if they can use the amygdala signal and other factors to predict who is vulnerable to the disorder and who will be resistant to therapy. "After 6 to 12 months, some people recover -- what's the difference between people who recover and people who don't?" Armony asks.

While fMRI measures of unconscious processes are useful for studying populations of people with an illness, they're not yet precise enough to diagnose an individual with a particular disorder, says Armony. "We can say that [statistically] a person with PTSD will have an exaggerated amygdala response, but that doesn't mean that everyone will have it."

Hans Breiter, a neuroscientist at Harvard Medical School, one of the first researchers to study amygdala activity with fMRI in the mid-1990s, agrees that a more extensive evaluation of the neurological changes in psychiatric disorders is necessary before the technique can have clinical applications. "This approach is promising and is the right first step, but scientists will need to study larger numbers of people with fMRI to get a better sense of the variability in brain functions that underlie anxiety and depression," he says. "They may have very different [brain activity patterns] and may have very different therapeutic needs." He predicts those larger-scale studies will happen within the next five years.

Breiter and other scientists are optimistic that fMRI can one day be used to evaluate the benefits of therapy, but they say it's unclear which brain signals, conscious or unconscious, will be the most effective measure.

"The question still remains, how important are these subconscious phenomena?" says Cohen at Brown. "From a cognitive behavioral perspective, the conscious aspects of depression and anxiety are more important."

Both Etkin at Columbia and Armony at McGill are also using fMRI to study conscious processes, such as attention, in people with anxiety disorders; and they plan to examine how these different factors may be important in different anxiety-related diseases, such as depression and eating disorders.

"There's information processing going on in the brain that's completely outside of awareness, which previously we could only investigate with psychoanalysis," says Tom Insel, director of the National Institute of Mental Health in Bethesda, MD. "Now you can track [those processes] with neuro-imaging -- a tool that may be much more compelling."

Man 'roused from coma' by a magnetic field


A volunteer models the TMS device, which is worn over the front of the head to stimulate the underlying brain tissue

Josh Villa was 26 and driving home after a drink with a friend on 28 August 2005 when his car mounted the kerb and flipped over. Villa was thrown through the windscreen, suffered massive head injuries and fell into a coma.

Almost a year later, there was little sign of improvement. "He would open his eyes, but he was not responsive to any external stimuli in his environment," says Theresa Pape of the US Department of Veterans Affairs in Chicago, who helped treat him.

Usually there is little more that can be done for people in this condition. Villa was to be sent home to Rockford, Illinois, where his mother, Laurie McAndrews, had volunteered to care for him.

But Pape had a different suggestion. She enrolled him in a six-week study in which an electromagnetic coil was held over the front of his head to stimulate the underlying brain tissue. Such transcranial magnetic stimulation (TMS) has been investigated as a way of treating migraine, stroke, Parkinson's disease and depression, with some promising results, but this is the first time it has been used as a potential therapy for someone in a coma-like state.

The rapidly changing magnetic fields that the coil creates can be used either to excite or inhibit brain cells - making it easier or harder for them to communicate with one another. In Villa's case, the coil was used to excite brain cells in the right dorsolateral prefrontal cortex. This area has strong connections to the brainstem, which sends out pulses to the rest of the brain that tell it to pay attention. "It's like an 'OK, I'm awake' pulse," says Pape.

At first, there was little change in Villa's condition, but after around 15 sessions something happened. "You started talking to him and he would turn his head and look at you," says McAndrews. "That was huge."

Villa started obeying one-step commands, such as following the movement of a thumb and speaking single words. "They were very slurred but they were there," says Pape, who presented her findings this month at an international meeting on brain stimulation at the University of Göttingen, Germany. "He'd say like 'erm', 'help', 'help me'."

After the 30 planned sessions the TMS was stopped. Without it, Villa became very tired and his condition declined a little, but he was still much better than before. Six weeks later he was given another 10 sessions, but there were no further improvements and he was sent home, where he remains today.

Villa is by no means cured. But he is easier to care for and can interact with visitors such as his girlfriend, who has stuck by him following the accident. "When you talk to him he will move his mouth to show he is listening," McAndrews says. "If I ask him, 'Do you love me?' he'll do two slow eye blinks, yes. Some people would say it's not much, but he's improving and that's the main thing."

John Whyte of the Moss Rehabilitation Research Institute in Philadelphia, Pennsylvania, cautions that as intriguing as Villa's case is, it alone does not show that TMS is a useful treatment. "Even after eight months, it is not uncommon for patients to transition from the vegetative to the minimally conscious state without any particular intervention," he points out. He says TMS merits further investigation, along with other experimental treatments such as drugs which have temporarily roused three men from a coma, and deep brain stimulation, an invasive technique that roused a man out of a minimally conscious state.

"This is the first and very interesting use of repetitive TMS in coma," says Steven Laureys of the Coma Research Group at the University of Liège in Belgium. Our understanding of disorders of consciousness is so limited that even a single study can provide new insights, he says.

Pape acknowledges that further studies are needed to demonstrate that TMS really is beneficial, though she is convinced that it helped Villa. He had only been given a 20 to 40 per cent chance of long-term recovery, and until he was given TMS his functioning had not improved since about four months after the accident. What's more, after the 15th TMS session, he improved incrementally with each session - further evidence that TMS was the cause.

Pape hopes to begin treating a second patient in a coma-like state later this year. This time she plans to adjust the number of pulses of TMS in each train, and to alter the gap between pulses to see if there is an optimum interval.

McAndrews is also in no doubt that her son's quality of life has improved as a result of TMS. "Before I felt like he was not responsive, that he was depressed almost. Now you move him around and he complains - he can show emotions on that level."

See "Editorial: improving the lot of coma patients"
A gentle current helps when words are hard to find

People with Alzheimer's disease got better at a word-recognition task after their brains were stimulated with an electric current.

Like transcranial magnetic stimulation or TMS (see main story), transcranial direct current stimulation (tDCS) aims to activate or inhibit areas of the brain by making it easier or harder for the brain cells to fire. While TMS involves holding a current-carrying coil over the subject's head, tDCS, which has previously shown promise in treating pain and depression (New Scientist, 5 April 2006, p 34), uses electrodes to send a current of 1 to 2 milliamps through the skull.

In Alzheimer's, the temporoparietal areas of the brain, which are involved in memory and recognition, are known to be less active than in healthy people. So Alberto Priori at the University of Milan, Italy, and his colleagues used tDCS to stimulate these areas. They asked 10 people with mild to moderate Alzheimer's to perform a word-recognition and a visual-attention task, before and after receiving tDCS or a sham treatment.

With tDCS, word recognition improved by 17 per cent, but there was no improvement in visual attention. Word recognition worsened when tDCS was used to inhibit neurons, and there was no change when the sham treatment was applied (Neurology, vol 71, p 493).

"Our findings are consistent with evidence that tDCS improves cognitive functions in healthy subjects and in patients with neurological disorders," Priori says. He is now running a larger study to confirm the results, and to find out how long the improvement lasts.

From issue 2678 of New Scientist magazine, 15 October 2008, page 8-9

Thursday, October 9, 2008

Natural-Born Liars

Why do we lie, and why are we so good at it? Because it works

By David Livingstone Smith
Deception runs like a red thread throughout all of human history. It sustains literature, from Homer's wily Odysseus to the biggest pop novels of today. Go to a movie, and odds are that the plot will revolve around deceit in some shape or form. Perhaps we find such stories so enthralling because lying pervades human life. Lying is a skill that wells up from deep within us, and we use it with abandon. As the great American observer Mark Twain wrote more than a century ago: "Everybody lies ... every day, every hour, awake, asleep, in his dreams, in his joy, in his mourning. If he keeps his tongue still his hands, his feet, his eyes, his attitude will convey deception." Deceit is fundamental to the human condition.

Research supports Twain's conviction. One good example was a study conducted in 2002 by psychologist Robert S. Feldman of the University of Massachusetts Amherst. Feldman secretly videotaped students who were asked to talk with a stranger. He later had the students analyze their tapes and tally the number of lies they had told. A whopping 60 percent admitted to lying at least once during 10 minutes of conversation, and the group averaged 2.9 untruths in that time period. The transgressions ranged from intentional exaggeration to flat-out fibs. Interestingly, men and women lied with equal frequency; however, Feldman found that women were more likely to lie to make the stranger feel good, whereas men lied most often to make themselves look better.

In another study a decade earlier by David Knox and Caroline Schacht, both now at East Carolina University, 92 percent of college students confessed that they had lied to a current or previous sexual partner, which left the husband-and-wife research team wondering whether the remaining 8 percent were lying. And whereas it has long been known that men are prone to lie about the number of their sexual conquests, recent research shows that women tend to underrepresent their degree of sexual experience. When asked to fill out questionnaires on personal sexual behavior and attitudes, women wired to a dummy polygraph machine reported having had twice as many lovers as those who were not wired--suggesting that the unwired respondents were understating their experience. It's all too ironic that the investigators had to deceive subjects to get them to tell the truth about their lies.

These references are just a few of the many examples of lying that pepper the scientific record. And yet research on deception is almost always focused on lying in the narrowest sense--literally saying things that aren't true. But our fetish extends far beyond verbal falsification. We lie by omission and through the subtleties of spin. We engage in myriad forms of nonverbal deception, too: we use makeup, hairpieces, cosmetic surgery, clothing and other forms of adornment to disguise our true appearance, and we apply artificial fragrances to misrepresent our body odors. We cry crocodile tears, fake orgasms and flash phony "have a nice day" smiles. Out-and-out verbal lies are just a small part of the vast tapestry of human deceit.

The obvious question raised by all of this accounting is: Why do we lie so readily? The answer: because it works. The Homo sapiens who are best able to lie have an edge over their counterparts in a relentless struggle for the reproductive success that drives the engine of evolution. As humans, we must fit into a close-knit social system to succeed, yet our primary aim is still to look out for ourselves above all others. Lying helps. And lying to ourselves--a talent built into our brains--helps us accept our fraudulent behavior.

Passport to Success

If this bald truth makes any one of us feel uncomfortable, we can take some solace in knowing we are not the only species to exploit the lie. Plants and animals communicate with one another by sounds, ritualistic displays, colors, airborne chemicals and other methods, and biologists once naively assumed that the sole function of these communication systems was to transmit accurate information. But the more we have learned, the more obvious it has become that nonhuman species put a lot of effort into sending inaccurate messages.

The mirror orchid, for example, displays beautiful blue blossoms that are dead ringers for female wasps. The flower also manufactures a chemical cocktail that simulates the pheromones released by females to attract mates. These visual and olfactory cues keep hapless male wasps on the flower long enough to ensure that a hefty load of pollen is clinging to their bodies by the time they fly off to try their luck with another orchid in disguise. Of course, the orchid does not "intend" to deceive the wasp. Its fakery is built into its physical design, because over the course of history plants that had this capability were more readily able to pass on their genes than those that did not. Other creatures deploy equally deceptive strategies. When approached by an erstwhile predator, the harmless hog-nosed snake flattens its head, spreads out a cobralike hood and, hissing menacingly, pretends to strike with maniacal aggression, all the while keeping its mouth discreetly closed.

These cases and others show that nature favors deception because it provides survival advantages. The tricks become increasingly sophisticated the closer we get to Homo sapiens on the evolutionary chain. Consider an incident between Mel and Paul:

Mel dug furiously with her bare hands to extract the large succulent corm from the rock-hard Ethiopian ground. It was the dry season and food was scarce. Corms are edible bulbs somewhat like onions and are a staple during these long, hard months. Little Paul sat nearby and surreptitiously observed Mel's labors. Paul's mother was out of sight; she had left him to play in the grass, but he knew she would remain within earshot in case he needed her. Just as Mel managed, with a final pull, to yank her prize out of the earth, Paul let out an ear-splitting cry that shattered the peace of the savannah. His mother rushed to him. Heart pounding and adrenaline pumping, she burst upon the scene and quickly sized up the situation: Mel had obviously harassed her darling child. Shrieking, she stormed after the bewildered Mel, who dropped the corm and fled. Paul's scheme was complete. After a furtive glance to make sure nobody was looking, he scurried over to the corm, picked up his prize and began to eat. The trick worked so well that he used it several more times before anyone wised up.

The actors in this real-life drama were not people. They were Chacma baboons, described in a 1987 article by primatologists Richard W. Byrne and Andrew Whiten of the University of St. Andrews in Scotland for New Scientist magazine and later recounted in Byrne's 1995 book The Thinking Ape (Oxford University Press). In 1983 Byrne and Whiten began noticing deceptive tactics among the mountain baboons in the Drakensberg, South Africa. Catarrhine primates, the group that includes the Old World monkeys, apes and ourselves, are all able to tactically dupe members of their own species. The deceptiveness is not built into their appearance, as with the mirror orchid, nor is it encapsulated in rigid behavioral routines like those of the hog-nosed snake. The primates' repertoires are calculated, flexible and exquisitely sensitive to shifting social contexts.

Byrne and Whiten catalogued many such observations, and these became the basis for their celebrated Machiavellian intelligence hypothesis, which states that the extraordinary explosion of intelligence in primate evolution was prompted by the need to master ever more sophisticated forms of social trickery and manipulation. Primates had to get smart to keep up with the snowballing development of social gamesmanship.

The Machiavellian intelligence hypothesis suggests that social complexity propelled our ancestors to become progressively more intelligent and increasingly adept at wheeling, dealing, bluffing and conniving. That means human beings are natural-born liars. And in line with other evolutionary trends, our talent for dissembling dwarfs that of our nearest relatives by several orders of magnitude.

The complex choreography of social gamesmanship remains central to our lives today. The best deceivers continue to reap advantages denied to their more honest or less competent peers. Lying helps us facilitate social interactions, manipulate others and make friends.

There is even a correlation between social popularity and deceptive skill. We falsify our résumés to get jobs, plagiarize essays to boost grade-point averages and pull the wool over the eyes of potential sexual partners to lure them into bed. Research shows that liars are often better able to get jobs and attract members of the opposite sex into relationships. Feldman has also demonstrated that the adolescents who are most popular in their schools are better at fooling their peers. Lying continues to work. Although it would be self-defeating to lie all the time (remember the fate of the boy who cried, "Wolf!"), lying often and well remains a passport to social, professional and economic success.

Fooling Ourselves
Ironically, the primary reason we are so good at lying to others is that we are good at lying to ourselves. There is a strange asymmetry in how we apportion dishonesty. Although we are often ready to accuse others of deceiving us, we are astonishingly oblivious to our own duplicity. Experiences of being a victim of deception are burned indelibly into our memories, but our own prevarications slip off our tongues so easily that we often do not notice them for what they are.

The strange phenomenon of self-deception has perplexed philosophers and psychologists for more than 2,000 years. On the face of it, the idea that a person can con himself or herself seems as nonsensical as cheating at solitaire or embezzling money from one's own bank account. But the paradoxical character of self-deception flows from the idea, formalized by the French polymath René Descartes in the 17th century, that human minds are transparent to their owners and that introspection yields an accurate understanding of our own mental life. As natural as this perspective is to most of us, it turns out to be deeply misguided.

If we hope to understand self-deception, we need to draw on a more scientifically sound conception of how the mind works. The brain comprises a number of functional systems. The system responsible for cognition--the thinking part of the brain--is somewhat distinct from the system that produces conscious experiences. The relation between the two systems can be thought of as similar to the relation between the processor and monitor of a personal computer. The work takes place in the processor; the monitor does nothing but display information the processor transfers to it. By the same token, the brain's cognitive systems do the thinking, whereas consciousness displays the information that it has received. Consciousness plays a less important role in cognition than previously expected.

This general picture is supported by a great deal of experimental evidence. Some of the most remarkable and widely discussed studies were conducted several decades ago by neuroscientist Benjamin Libet, now professor emeritus at the University of California, San Francisco. In one experiment, Libet placed subjects in front of a button and a rapidly moving clock and asked them to press the button whenever they wished and to note the time, as displayed on the clock, the moment they felt an impulse to press the button. Libet also attached electrodes over the motor cortex, which controls movement, in each of his subjects to monitor the electrical tension that mounts as the brain prepares to initiate an action. He found that our brains begin to prepare for action just over a third of a second before we consciously decide to act. In other words, despite appearances, it is not the conscious mind that decides to perform an action: the decision is made unconsciously. Although our consciousness likes to take the credit (so to speak), it is merely informed of unconscious decisions after the fact. This study and others like it suggest that we are systematically deluded about the role consciousness plays in our lives. Strange as it may seem, consciousness may not do anything except display the results of unconscious cognition.

This general model of the mind, supported by various experiments beyond Libet's, gives us exactly what we need to resolve the paradox of self-deception--at least in theory. We are able to deceive ourselves by invoking the equivalent of a cognitive filter between unconscious cognition and conscious awareness. The filter preempts information before it reaches consciousness, preventing selected thoughts from proliferating along the neural pathways to awareness.

Solving the Pinocchio Problem
But why would we filter information? Considered from a biological perspective, this notion presents a problem. The idea that we have an evolved tendency to deprive ourselves of information sounds wildly implausible, self-defeating and biologically disadvantageous. But once again we can find a clue from Mark Twain, who bequeathed to us an amazingly insightful explanation. "When a person cannot deceive himself," he wrote, "the chances are against his being able to deceive other people." Self-deception is advantageous because it helps us lie to others more convincingly. Concealing the truth from ourselves conceals it from others.

In the early 1970s biologist Robert L. Trivers, now at Rutgers University, put scientific flesh on Twain's insight. Trivers made the case that our flair for self-deception might be a solution to an adaptive problem that repeatedly faced ancestral humans when they attempted to deceive one another. Deception can be a risky business. In the tribal, hunter-gatherer bands that were presumably the standard social environment in which our hominid ancestors lived, being caught red-handed in an act of deception could result in social ostracism or banishment from the community, to become hyena bait. Because our ancestors were socially savvy, highly intelligent primates, there came a point when they became aware of these dangers and learned to be self-conscious liars.

This awareness created a brand-new problem. Uncomfortable, jittery liars are bad liars. Like Pinocchio, they give themselves away by involuntary, nonverbal behaviors. A good deal of experimental evidence indicates that humans are remarkably adept at making inferences about one another's mental states on the basis of even minimal exposure to nonverbal information. As Freud once commented, "No mortal can keep a secret. If his lips are silent, he chatters with his fingertips; betrayal oozes out of him at every pore." In an effort to quell our rising anxiety, we may automatically raise the pitch of our voice, blush, break out into the proverbial cold sweat, scratch our nose or make small movements with our feet as though barely squelching an impulse to flee.

Alternatively, we may attempt to rigidly control the tone of our voice and, in an effort to suppress telltale stray movements, raise suspicion by our stiff, wooden bearing. In any case, we sabotage our own efforts to deceive. Nowadays a used-car salesman can hide his shifty eyes behind dark sunglasses, but this cover was not available during the Pleistocene epoch. Some other solution was required.

Natural selection appears to have cracked the Pinocchio problem by endowing us with the ability to lie to ourselves. Fooling ourselves allows us to selfishly manipulate others around us while remaining conveniently innocent of our own shady agendas.

If this is right, self-deception took root in the human mind as a tool for social manipulation. As Trivers noted, biologists propose that the overriding function of self-deception is the more fluid deception of others. Self-deception helps us ensnare other people more effectively. It enables us to lie sincerely, to lie without knowing that we are lying. There is no longer any need to put on an act, to pretend that we are telling the truth. Indeed, a self-deceived person is actually telling the truth to the best of his or her knowledge, and believing one's own story makes it all the more persuasive.

Although Trivers's thesis is difficult to test, it has gained wide currency as the only biologically realistic explanation of self-deception as an adaptive feature of the human mind. The view also fits very well with a good deal of work on the evolutionary roots of social behavior that has been supported empirically.

Of course, self-deception is not always so absolute. We are sometimes aware that we are willing dupes in our own con game, stubbornly refusing to explicitly articulate to ourselves just what we are up to. We know that the stories we tell ourselves do not jibe with our behavior, or they fail to mesh with physical signs such as a thumping heart or sweaty palms that betray our emotional states. For example, the students described earlier, who admitted their lies when watching themselves on videotape, knew they were lying at times, and most likely they did not stop themselves because they were not disturbed by this behavior.

At other times, however, we are happily unaware that we are pulling the wool over our own eyes. A biological perspective helps us understand why the cognitive gears of self-deception engage so smoothly and silently. They cleverly and imperceptibly embroil us in performances that are so skillfully crafted that the act gives every indication of complete sincerity, even to the actors themselves.

Wednesday, October 1, 2008

Self-Managed Cognitive Behavior Therapy Over the Internet Shows Promise in PTSD

BY: Marlene Busko

Medscape Medical News 2007.

November 16, 2007 — Patients with posttraumatic stress disorder (PTSD) who received 8 weeks of Internet-delivered self-management cognitive behavior therapy had greater reductions in symptoms than patients in a comparison group who received Internet-delivered non–trauma-focused supportive counseling.

In this small pilot study, 25% of patients in the online self-management cognitive behavior therapy group no longer had a diagnosis of PTSD immediately after treatment or at 6 months; only 5% and 3% of patients in the comparison group, respectively, were free of the diagnosis at those time points.

These findings are published in the November issue of the American Journal of Psychiatry.

Not in the Cards

"If resources were unlimited, of course we would want everyone who was traumatized to be treated by an individual psychotherapist . . . in a face-to-face way, but that's just not in the cards," lead author Brett T. Litz, PhD, from VA Boston Healthcare System and Boston University School of Medicine, in Massachusetts, told Medscape Psychiatry. "We're encouraged that there are technologies that we might be able to use to get people the care they need when they are reluctant to get it or it's not available," he added.

The researchers conducted a randomized controlled proof-of-concept trial of therapist-assisted Internet-delivered self-management cognitive therapy vs Internet-delivered supportive non–trauma-focused counseling.

They recruited military service members from the Washington, DC, area who had PTSD as a result of the September 11, 2001 attack on the Pentagon or combat in Iraq or Afghanistan. Of 141 individuals who responded to study recruitment ads, 96 were ineligible, did not consent, or could not be contacted, leaving 45 patients who were randomly assigned to receive 1 of the 2 treatments for 8 weeks. A total of 12 participants dropped out before or during treatment; this dropout rate (30%) was similar to that seen in conventional cognitive behavior therapy and was unrelated to treatment group.

All participants received an initial face-to-face meeting with a therapist. During the 8 weeks of treatment, participants were asked to log on daily to 1 of 2 different Web sites, depending on their treatment group. Each Web site had educational information about PTSD and provided homework assignments.

The self-management cognitive behavior therapy Web site, which was called "Delivery of self-training and education for stressful situations" (DE-STRESS), delivered therapy that was highly trauma focused. "The approach was to get people to learn ways of managing and coping with 'trauma triggers,' " said Dr. Litz.

The Internet-delivered supportive counseling was not trauma focused but rather was centered on problems of daily living. "People would be prompted to think of ways they could tackle these situations differently," he said.


The DE-STRESS Web site was well received by the participants in the self-management cognitive behavior therapy group. Patients in this treatment group had a sharper decline in mean total PTSD symptom severity. One-third of patients who completed this program were considered to have high end-state functioning at 6 months, whereas none of the participants who completed the supportive counseling program were. Patients in both groups, however, were considered to have significant declines in PTSD symptoms and depression at 6 months.

Self-managed cognitive behavior therapy is a potential solution to meet the need for efficient, low-cost, and stigma-reducing interventions for traumatic stress, the authors conclude. They acknowledge that this was a small initial study and call for further research in self-management strategies.

"When you factor in disasters and veterans of war, the scale is pretty large in terms of need," Dr. Litz noted. "So we have no alternative in the mental healthcare field but to be creative and use technology such as the Internet, the telephone, or other telehealth approaches."

Extinguishing Conditioned Responses

The data from this study by Dr. Litz and colleagues are promising and provide "further evidence for the importance of the underlying mechanism of extinction in the treatment of PTSD," Ruth Lanius, MD, from the University of Western Ontario, in London, Ontario, writes in an accompanying editorial.

She notes that the rationale behind the self-management cognitive behavior therapy that was used is based on a fear-conditioning model, which is widely accepted in the PTSD literature. The study used graduated exposure to trauma triggers, with the goal of extinguishing the association between previously neutral stimuli and the conditioned response.

"However, insofar as exposure therapy failed to eliminate all PTSD symptoms, the inference is that there are probably additional mechanisms at work," she suggested.

The study authors and editorialist have disclosed no relevant financial relationships.

Very Early Mobilization Cuts Rates of Poststroke Depression by 50%

By: Caroline Cassels

Medscape Medical News 2008.

September 30, 2008 — Very early mobilization of stroke patients within 24 hours of symptom onset significantly reduces rates of depression in this population.

These latest phase 2 findings from A Very Early Rehabilitation Trial (AVERT) show that getting stroke patients up and out of bed within 24 hours of symptom onset cuts rates of severe depression by 50% at 7 days after the stroke.

"We were surprised at the strength of these results and the impact very early and frequent mobilization had on severe depressive symptoms — 42% for controls vs 21% for those in the intervention group is quite a marked difference," study investigator Toby Cumming, PhD, from the National Stroke Research Institute, in Victoria, Australia, told Medscape Psychiatry.

"We need to stress that these are preliminary findings from quite a small study. Nevertheless, the results are exciting and quite unexpected. Based on these results and our earlier findings, I think we'd encourage people to consider mobilizing stroke patients earlier, because it may have beneficial effects, and reducing depression may be 1 of them," he said.

The study is published in the September issue of the Journal of Rehabilitation Medicine.

Poststroke Depression Common

Previous research shows approximately one-third of all stroke patients experience depression. Further, said Dr. Cumming, poststroke depression has also been associated with poor outcomes, including increased mortality, poorer rate and extent of recovery, poorer quality of life, and reduced participation in rehabilitation programs.

In addition, studies examining detection and treatment of poststroke depression show that it is commonly underdiagnosed and undertreated.

"If we can find a way to reduce poststroke depression, we may be able to positively affect many of these other, very important stroke outcomes, which is a really exciting possibility," he said.

The current results build on earlier feasibility and safety results from AVERT, a randomized controlled trial designed to compare outcomes in stroke patients who undergo very early mobilization (VEM) with those who receive standard care (SC).

The study included 71 patients with confirmed stroke who were admitted to 1 of 2 large centers within 24 hours of symptom onset and randomized to receive SC or VEM. Both groups were comparable with respect to demographics, impairment, disability, and all other baseline measures.

Patients in the VEM group began mobilization as soon as it was practical, with the aim of first mobilization within 24 hours of symptom onset. A nurse and physiotherapist team delivered VEM for the first 14 days after admission or until the patient was discharged, whichever came first.

No Sustained Effect

The study's primary outcome was patient well-being measured using the Irritability, Depression, and Anxiety (IDA) scale. Subjects were assessed at days 7 and 14 and again at 3, 6, and 12 months after stroke.

At 7 days, investigators found depressive symptoms were significantly more common in the SC group than in VEM patients. While there was a trend toward anxiety reduction in the VEM group, this was not statistically significant.

According to the investigators, it is well known that physical activity reduces depression in healthy individuals, but it is not known whether this effect endures once exercise ceases.

The study showed that VEM reduced depression in the immediate poststroke phase. However, at 1 year there was no difference between the 2 groups, a finding that may suggest the need for ongoing physical activity intervention to keep depressive symptoms at bay in this patient population.

Further research, said Dr. Cumming, is needed to shed light on whether the potential mechanism is psychological, physiological, or some combination of the 2. He said he is also interested in examining the possibility that early mobilization may help improve cognition in stroke patients.

The study was supported by the National Heart Foundation of Australia, Affinity Health, Austin Health Medical Research Fund, and the National Health and Medical Research Council.