Friday, December 7, 2007
By Nikhil Swaminathan
Alterations in the genetic coding for a nerve cell receptor, which detects a chemical signal that is key to behavioral change, could point the way to designing therapies most effective for patients suffering from schizophrenia, drug addiction and other mental illnesses.
"I don't know if what we just published is a viable biomarker," says Wolfgang Sadee, chair of the Department of Pharmacology at The Ohio State University (O.S.U.) College of Medicine and the co-author of a report on the finding published this week in Proceedings of the National Academy of Sciences USA. "But, I think there's a good chance that this is a biomarker that we will at least test and we will know soon if there is something worthwhile."
A team of scientists from O.S.U. examined 68 samples of postmortem brain tissue from people without a history of mental illness, in search of the profile of messenger RNA (mRNA) transcribed from a particular gene. (mRNA is the intermediate blueprint between gene and protein.) Researchers were specifically hunting for the mRNA created from the two alleles (copies) of the gene DRD2, which codes for a receptor protein for the neurotransmitter dopamine. D2 dopamine receptor malfunction has been linked to drug addiction, schizophrenia and Parkinson's disease. The team focused its search on the striatum (a forebrain region implicated in planning and movement) and the prefrontal cortex, the brain's executive center.
In 15 of the brain samples, researchers found that one copy of DRD2 was producing at least 50 percent more mRNA than the other; in the remaining brains, both alleles produced equal amounts. They also identified SNPs (single nucleotide polymorphisms, or changes to a single nucleotide at one position in a gene's long chain). Two of these changes caused differences in the protein made by the gene; one of them appeared to cause the DRD2 transcript to be spliced together differently, resulting in a protein with a slightly longer than normal chain of amino acid building blocks.
The unexpected finding of a splice variant caused some excitement in the lab, Sadee says, because, according to the literature, "the short form is more inhibitory and the long form may be facilitating dopaminergic transmission. … When dopaminergic input comes in, [individuals with the SNPs on one gene copy] would have a chance of having more transmission" than those with two normal copies of the gene.
Sadee contacted Alessandro Bertolino at the University of Bari in Italy, who was doing research that involved monitoring the brain activity of 117 volunteers with functional magnetic resonance imaging (fMRI) during memory tests. Seventeen of the subjects in Bertolino's pool carried the SNPs on a single allele and showed increased activity in their striata and prefrontal cortices during the mental exercises, yet performed worse on the memory tests and had less attention control than the other study participants.
Sadee speculates that the brains of these subjects may be in "unnecessary hyperdrive. The dopamine stimulates more activity and that relates to more brain activity during a memory task," he says. "That is maybe not as good as memory function. … The brain has to work harder to master the same task, and that's induced by this polymorphism."
He says the study could improve current treatments for patients suffering from mental illnesses. The proper antipsychotic drugs may in the future be determined by genotyping patients to assure the most positive effect. Physicians now often have to try out different drugs to test their effectiveness, because this class of medications is highly varied and targets different brain receptors. Such findings as these could dramatically reduce the guesswork involved, thereby leading to the proper prescription from day one. Currently, Sadee says, antipsychotics are only effective 50 to 60 percent of the time and take five to six weeks to begin working.
"The influence of antipsychotics with D2 antagonist activity will differ between the two" forms of the receptor protein, he says. "One is facilitating and the other inhibiting, so the net effect of inhibiting the D2 receptor will change. So, we think that is one possible mechanism for differences in antipsychotic response."
Thursday, November 15, 2007
Two companies plan to market the first lie-detecting devices that use
magnetic resonance imaging (MRI) and say the new tests can spot liars
with 90% accuracy.
No Lie MRI plans to begin offering brain-based lie-detector tests in
Philadelphia in late July or August, says Joel Huizenga, founder of
the San Diego-based start-up. Cephos Corp. of Pepperell, Mass., will
offer a similar service later this year using MRI machines at the
Medical University of South Carolina in Charleston, says its
president, Steven Laken.
Both rely in part on recent research funded by the federal government
aimed at producing a foolproof method of detecting deception.
The new devices differ from polygraphs in key ways. The polygraph
detects stress brought on by telling a lie. The MRI-based devices
purport to measure the lie itself, and not changes in breathing and
pulse rate brought on by lying or by some other cause.
"We're at the beginning of a technology that's going to become more
and more mature," Huizenga says. "But right now, we can offer
(customers) a chance to show they are telling the truth with a
scientific basis and a high degree of accuracy. That's something they
haven't been able to get before."
Potential customers: law enforcement, accused persons, spouses under
suspicion and job applicants. Huizenga says a 1988 law that bars
private employers from giving polygraphs to potential employees
appears not to apply to MRI tests.
No Lie MRI plans to charge $30 a minute to use its device. Cephos has
not yet set a price.
The new products are being introduced as the polygraph is under fire.
In 2002, a National Academy of Sciences study concluded that polygraph
results are too unpredictable to be used for security screening at
national labs. Yet the Department of Defense, as well as the FBI, CIA
and National Security Agency, continue to administer thousands of
polygraph tests each year to job candidates and others seeking
security clearances. The Department of Defense administered about
12,000 tests in 2002, the most recent year for which it made data
public.
"They haven't found anything yet that they think can top (the
polygraph)," says Britton Chance, a University of Pennsylvania
researcher who has studied brain-based lie detection.
Some scientists and privacy advocates criticize the new lie detectors.
They note that the devices haven't yet proved themselves in
real-world tests and face prolonged scrutiny before they can be
admitted in court.
"They are going to be deployed to read people's thoughts," says Barry
Steinhardt, director of the American Civil Liberties Union's
technology and liberty project. "Little if any attention has been paid
to potential misuse and the devastating impact it would have on our
society."
The new tests work this way: A subject lies with his head in an MRI
machine and pushes a button to answer yes-no questions as they are
flashed on a screen. The machine measures brain activity, focusing on
areas that are believed to show extra exertion when a lie is
generated. Specially designed software grades the test. The two lie detector companies use similar test techniques, but different software.
Most American courts do not admit polygraph evidence because
differences in examiner skill make it hard to determine accuracy rates.
MRI machines are used by hospitals on a daily basis to diagnose tumors
and other disorders.
After the 9/11 attacks, the FBI, CIA, Department of Defense and other
agencies began funding research into how changes in brain activity
correlate with truth telling.
Daniel Langleben, a physician and professor at the University of
Pennsylvania, and Frank Andrew Kozel, a brain image researcher at the
Medical University of South Carolina, received funding for research
into MRI-based lie detection.
Langleben's research was used by No Lie MRI to help develop its lie
detector. Cephos' lie detector is based in part on Kozel's research.
David Heeger, professor of psychology and neural science at New York
University, opposes using MRI for lie detection until more research is
done. Among the many gaps in the research, he says, is whether a false
or imagined memory will show up as a true response or a lie.
MRI-based lie detection "says changes happen (in the brain of a liar)
but it doesn't say why," says Heeger. "There's a lot more work to be
done ... before we have a deep understanding of what goes on when
someone lies."
Gregg Bloche, a medical doctor and a law professor at Georgetown
University, says scientists, lawyers and ethicists should conduct a
"high profile discussion" of the new technology's potential uses and
pitfalls before it is made available to the public.
The start-up companies say the technology is ready now. Both say they
will focus on winning acceptance in court for tests taken by
customers. No Lie MRI already is working with a defendant in a
California criminal case, Huizenga says.
"We understand that there are further ethics conversations (needed)
when science pushes the envelope," says Cephos' Laken. "But we don't
see these (tests) being set up in dressing rooms and shopping malls.
That's not going to happen."
By Corinna Underwood
As any Baby Boomer will tell you, Americans have more information to cram into their memories than ever. Yet, as we age, our capacity for recall grows weaker.
But what if you could capture every waking moment of your entire life, store it on your computer and then recall digital snapshots of everything you've seen and heard with just a quick search?
Renowned computer scientist Gordon Bell, head of Microsoft's Media Presence Research Group and founder of the Computer History Museum in Silicon Valley, thinks he might be able to do just that.
He calls it a "surrogate memory," and what he considers an early version of it even has an official name — MyLifeBits.
"The goal is to live as much of life as possible versus spending time maintaining our memory system," Bell explains.
Perfect surrogate memory would be supplemental to, but ultimately as good as, your original memory.
It could let you listen to every conversation you had when you were 21 or find that photograph of the obscure date you had on summer vacation.
As Bell says, it would "supplement (and sometimes supplant) other information-processing systems, including people."
MyLifeBits isn't quite there yet, but Bell's nevertheless "gone paperless" for the past decade as part of the project, keeping a detailed, digitized diary that documents his life with photographs, letters and voice recordings.
So that he doesn't miss out on important daily events, Bell wears a SenseCam, developed by Microsoft Research, that takes pictures whenever it detects he may want a photograph.
The camera's infrared sensor picks up on body heat and takes snapshots of anyone else in the room, adjusting itself as available light changes.
Not only does MyLifeBits record your life's digital information, but the software, developed by Bell's researchers Jim Gemmell and Roger Lueder, also can help you retrieve it.
"MyLifeBits is a system aimed at capturing cyber-content in the course of daily life with the goal of being able to utilize it in various ways at work, in our personal life — e.g. finances, family, health and for our future memory," Bell says.
Simply enter a keyword such as "pet," for example, and the search engine will find all available information on your childhood puppy.
It also can run more intricate searches, allowing you to cross-reference all associations linked to certain people or places.
If you're having difficulty remembering where you were and who you were with on a certain day, MyLifeBits would remind you.
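The retrieval features described above amount to a keyword index over life events, with cross-referencing as a set intersection over that index. A minimal sketch of the idea follows; the item records and tags are invented for illustration, since the article does not describe MyLifeBits's actual data model:

```python
# Toy sketch of keyword search and cross-referencing over a life log.
# Real MyLifeBits stores rich media; here each item is just a tuple.

from collections import defaultdict

class LifeLog:
    def __init__(self):
        self._items = []                  # list of (date, description, tags)
        self._index = defaultdict(set)    # keyword -> set of item ids

    def add(self, date, description, tags):
        item_id = len(self._items)
        self._items.append((date, description, tags))
        # Index every word of the description and every explicit tag.
        for word in description.lower().split():
            self._index[word].add(item_id)
        for tag in tags:
            self._index[tag.lower()].add(item_id)

    def search(self, keyword):
        """All items associated with a single keyword."""
        return [self._items[i] for i in sorted(self._index[keyword.lower()])]

    def cross_reference(self, *keywords):
        """Items linked to every one of several people, places or topics."""
        ids = set.intersection(*(self._index[k.lower()] for k in keywords))
        return [self._items[i] for i in sorted(ids)]

log = LifeLog()
log.add("1969-06-12", "photo of my childhood puppy", ["pet", "family"])
log.add("1970-08-03", "vacation snapshot at the lake", ["travel", "family"])

print(log.search("pet"))                        # the puppy photo
print(log.cross_reference("family", "travel"))  # only the vacation item
```

A real system would also index timestamps and locations so that "where was I, and with whom, on a certain day" becomes a lookup rather than an act of recall.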
And just how much data is needed on a day-to-day basis?
"All the bits that we can that will likely have value for our memory in the near and long-term future, a few bits just for the hell of it," Bell says. "We end up with more bits because we need them for relationships."
Still, is recalling every single detail of an entire lifetime too much? How can anyone guess what's going to be important 20 years from now?
"It is impossible to know what will be required in the future," says Bell. "Furthermore, recording everything allows one item to be used to find another item that may have been created at the same time."
Bell says MyLifeBits could have another important benefit: It may actually improve your real memory.
According to Bell, being reminded of someone in a photograph or screensaver strengthens our recollections.
We constantly are reminded of other events when we delve into our past to find the snippets we are looking for. This reinforces a whole host of links to other memories we otherwise may have forgotten.
But since all this is digitally recorded, what if hackers find it? Couldn't MyLifeBits be a threat to privacy and a boon to identity thieves?
Bell doesn't seem overly concerned.
"MLB introduces no new problems that aren't present in modern computer systems," he said, "except that we present a larger cross-section that makes all the content potentially more valuable."
Additional passwords are being built into the most sensitive documents, he explains.
An even bigger hurdle for the project is cost-efficiency.
The Microsoft team predicts that by 2010, a 1-terabyte (1,000-gigabyte) hard drive will cost less than $300.
That could easily hold all the text documents, voice files and photographs of a person's complete life experience, but for video it would be enough for only about four hours per day for a single year.
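That video figure is easy to sanity-check. The short calculation below assumes a compressed bitrate of roughly 1.5 megabits per second (an illustrative value, not one from the article; higher-quality video would fill the drive far faster):

```python
# Back-of-envelope check: how much video fits on a 1-terabyte drive?

TERABYTE = 1_000_000_000_000          # bytes, decimal (as drive makers count)

video_bitrate_mbps = 1.5              # assumed compressed bitrate, megabits/s
bytes_per_hour = video_bitrate_mbps * 1_000_000 / 8 * 3600   # ~675 MB/hour

hours_per_terabyte = TERABYTE / bytes_per_hour
hours_per_day = 4
days_of_recording = hours_per_terabyte / hours_per_day

print(f"{hours_per_terabyte:.0f} hours of video per terabyte")
print(f"= {days_of_recording / 365:.1f} years at {hours_per_day} h/day")
```

At that assumed bitrate the drive holds roughly 1,500 hours of video, which works out to about one year at four hours per day, matching the estimate above.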
On a somewhat smaller level, Sunil Vemuri, co-founder and chief product officer of Hyderabad, India-based QTech, Inc., has been working to develop a "memory prosthesis" that can help people with common, day-to-day memory problems.
QTech's "reQall" service provides a toll-free number that allows clients to use any phone to record reminders of events, appointments or thoughts as and when needed.
It then saves and organizes the recordings and sends daily reminders as needed.
"ReQall is meant for anyone who forgets, for anyone with a day-to-day memory problem," Vemuri says. "The aim of reQall is to provide a long-term service that is available to everyone right now."
Vemuri sees great growth potential for reQall. He wants his team to refine the service to suit users' individual memory needs, whether that involves helping patients remember doctors' appointments, friends remember birthdays or even journalists remember specific quotes.
More ambitious is Vemuri's "What Was I Thinking," a project he worked on while a graduate student at MIT.
That project centered on software running on a Compaq iPaq personal digital assistant, similar to a Palm Pilot, which synced to PCs running Mac OS X, Windows or Linux. The software could record data and offered a number of search tools, driven by a range of built-in triggers, to help the user find forgotten memories.
"Many things can serve as good memory triggers: the smell or taste of homemade cooking, the smile on a child's face, a good joke, the roar of the crowd when your sports team scores, etc," Vemuri explained on his MIT Web page.
"In our case, the device records audio from conversations and happenings, analyzes and indexes the audio in an attempt to identify the best memory triggers, and provides a suite of retrieval tools to help the wearer access memories when a forgetting incident occurs."
The device's retrieval tools included an analysis of audio recordings to determine if conversations were heated, calm or humorous and a transcription of audio files to text files by means of a speech-recognition program.
In this way, the text files could be searched for specific words or speech patterns that can trigger those elusive memories.
In the future, some variation on these memory prostheses could change our lives on many levels, from settling a squabble over last week's football scores to helping an elderly patient remember whether she has taken her medication.
We rely on our hard drives for saving our music, photographs, e-mails and videos — so perhaps life-logging software and memory prosthetics are simply the next stage in the evolution of our relationship to the computer.
Wednesday, November 7, 2007
By Tina Benitez
Bipolar disorder runs in families. However, scientists still cannot trace it to one specific gene, since many other factors may also lead to the manic-depressive illness.
The disorder is 75 to 80 percent hereditary, according to Dr. Candida Fink, a child and adolescent psychiatrist and author of Bipolar Disorder for Dummies and The Ups and Downs of Raising a Bipolar Child. Family life and other emotional and psychological stresses should also be taken into account.
“It runs a pretty broad range,” said Fink. “Different stresses are likely contributors. Vulnerability and emotional and psychological stress is partly related, but it isn’t always one, and this is important to know. One can develop bipolar with stress or other things combined. We don’t know if it’s something biological or a mixture of developmental, hormonal imbalances.”
Fink said that some hereditary factors are strong indicators of whether someone will develop bipolar disorder. If one identical twin has the disorder, the other has about a 70 percent chance of developing it. If one parent is bipolar, a child has a 7 to 10 percent risk of developing bipolar disorder; if both parents are bipolar, the risk rises to about 20 percent because of the multiple genes involved.
“It’s a huge issue when getting family history and sitting with someone,” said Fink. “Does bipolar run in the family? Particularly, sometimes the further back you go, the family was not diagnosing it, or no one talked about it. It was kind of like, did you have an aunt Sally that never left the house? The most relevant indicator is the immediate family, which is typical of most mental illnesses, but there’s not just one thing that is going to give us the answer.”
Individuals who are bipolar will present signs of depression first, said Fink, but experts must look for red flags in the different types of depression when diagnosing it. Did it begin at a young age? Were there brief or discrete episodes in the past?
Dr. Francis McMahon, chief of the Genetics Unit of the Mood and Anxiety Disorders Program at the National Institute of Mental Health (NIMH) in Bethesda, Md., said it has long been known that bipolar disorder, or manic depression, is among the most heritable of all mental illnesses. There have been three kinds of studies, including family studies showing that bipolar disorder definitely runs in families. The other studies were conducted on twins, identical and fraternal.
“If you have a close family member, you have a 10-fold chance of getting bipolar disorder,” said McMahon. “If you do the math, about 80 percent of the risk of bipolar is hereditary, which leaves about a 20 percent chance for other factors, and we know even less about what those are.”
Who Gets It — and Why?
Fink wants to develop a handbook for parents of children who may be bipolar, because there are so many different mood disturbances throughout childhood. “In studies of kids, there may be depression, but you have to ask if they really do look like the classic bipolar disorder,” Fink said. “There’s also attention deficit disorder, or other illnesses brought on by huge trauma. They have to understand the neurobiology component.”
There’s even a division between men and women when it comes to developing bipolar disorder, because more women suffer from depression. Dramatic bipolar episodes may show up in the early 30s, with more manic episodes appearing later in life, in the 40s. “A lot of bipolar disorder is struggling with depression, mostly reported in childhood. Manic episodes show up later, and that’s when you’re hit over the head with bipolar,” said Fink.
Life experiences and drug abuse can also play a role in developing bipolar disorder, and any one of these can cause anxiety or depression, which may exacerbate bipolar episodes, according to McMahon. “Assuming that you are already susceptible, since it is in the family, there may be more of a chance of getting bipolar.”
McMahon added that if people are bipolar, they have to be aware of several things to control episodes including, but not limited, to:
— Missing nights of sleep
— Trans-meridian travel, which causes stress on the body
— Stress in the family
“Stress in general is not a good thing,” he said. “But people with bipolar in particular have to watch out for stresses that may come from jet lag as well as occupational stress.”
Work stress can lead to manic or depressive episodes, often in the case of someone getting a promotion or taking on more responsibilities at work, which can tip a susceptible person over the edge. Bipolar disorder also tends to emerge early in life; according to McMahon, 80 percent of people who are bipolar will show signs of the illness by age 24, though that doesn’t mean they will be diagnosed then.
Once a psychiatrist goes over the family history with them, there may have been some illness that occurred in adolescence. People can go months and years being undiagnosed and continue to get a number of episodes down the line. “Bipolar is episodic,” he said. “A lot of studies are about what triggers the episode, but there’s very little evidence of why people get it.”
One of the bigger challenges with medications used for treating bipolar disorder, according to McMahon, is that there are many drugs that treat or prevent the episodes, but nothing available cures the illness. He added that the big opportunities for treatment in the next decade lie in genetic studies.
One group of studies that has made strides in linking bipolar, major depression and other diseases to the genes is the Bipolar Disorder Phenome Database, a collaboration of institutional research from NIMH, Johns Hopkins University, the University of Colorado’s Health Sciences Center, and the University of Heidelberg in Germany. The database includes 20 years of studies on bipolar individuals and those with a related illness and their family history.
He said that one gene has been clearly identified as influencing bipolar disorder and other diseases, and that two other genes need to be reviewed more deeply.
McMahon said that researchers are seeing more and more linkage between specific genes and diseases, specifically bipolar disorder and depression, but it’s still not clear what happens to this genetic makeup, whether it is brought on hereditarily or from outside factors.
“It sounds like a big achievement, and it has had contradictory results, but it still feels like a big step forward,” he said. “We have a long way to go. We have to figure out how and what they do that goes wrong. If we talk six months from now, we may clearly have found specific genes that link bipolar and depression.”
Tuesday, November 6, 2007
Research about to be published in the journal Molecular Psychiatry, resulting from a collaboration between scientists in Germany, Portugal and the UK, suggests that stress contributes directly to the development of Alzheimer’s disease (AD).
According to the results now published, stress induces the production of amyloid beta (Ab) peptide – the molecule associated with the neural plaques characteristic of the disease – and also makes neurons more vulnerable to Ab toxicity. Administration of glucocorticoids (GC) – the production of which is the body’s first physiological response to stress – was shown to have the same effect, confirming the role of stress in AD. This last result is particularly important because GC are used to treat Alzheimer’s patients and, according to this research, they might be contributing to the disease instead of helping.
Alzheimer’s disease belongs to a group of illnesses called amyloidoses, which result from proteins failing to fold and work properly (a protein’s shape is directly related to its function), leading instead to their accumulation as toxic insoluble plaques (or amyloids). In Alzheimer’s, the misfolded protein is called Ab and is found as insoluble plaques in the diseased neurons of patients.
It is known that AD patients can have higher anxiety and GC levels than those found in normal individuals and, in rodent models of AD, it has been found that stress can exacerbate the disease. Furthermore, highly stressful conditions lead to cognitive impairments very similar to those found in AD patients. These observations led researchers C. Catania, N. Sousa, O. F. X. Almeida and colleagues in Germany, Portugal and the UK to wonder whether there could be a causal relationship between stress and AD.
To investigate this possible link, the researchers subjected middle-aged rats to different stressful situations and measured the levels of Ab peptide (and of another molecule called C99) in the hippocampus and prefrontal cortex areas of the rats’ brains.
In fact, the first signs of AD correlate not with the insoluble plaques of Ab protein found in the diseased brain but with the levels of soluble Ab peptide, and the hippocampus and the prefrontal cortex are the first brain areas affected in AD. Furthermore, the Ab peptide is formed from the breakdown of the amyloid precursor protein APP in a series of consecutive steps that produce a molecule called C99, which, when further cleaved, yields the Ab peptide (APP –… – C99 – Ab). And while the Ab peptide is well known to be neurotoxic, recent reports have indicated that C99 – besides being the precursor of Ab – has similar toxicity, with both molecules affecting neural function and cognitive behaviour. This led Catania, Sousa, Almeida and colleagues to use the brain levels of both Ab and C99 as a measure of potential neuro-damage and AD development in their experiments. Additionally, the researchers looked into the consequences of glucocorticoid administration in order to confirm the specific effects of stress in AD, since GC secretion is the first physiological response to stress.
The team of researchers found that stressful situations or injections of GC (which mimic stress) led to an increase of both C99 and, eventually, of Ab, in both the hippocampus and the prefrontal cortex of the rats’ brain. Furthermore, rats with a history of stress were more susceptible to the effects of stress or GC administration, showing bigger increases in C99 levels. It was also observed that administration of soluble Ab led to a similar increase in C99 in the rats’ brain, supporting results by others that have shown that Ab can induce its own production, but also suggesting that Ab, stress and GC induced the same biochemical responses.
Next, Catania and colleagues looked at the animals’ behaviour - as behavioural changes are the hallmark of AD - more specifically, at their learning and memory abilities, as these are the two first cognitive functions affected in the disease. They also looked into the rats’ anxiety levels since AD patients are known to be abnormally anxious.
For the analysis of spatial memory abilities, stressed and non-stressed rats were tested in a maze over 4 days, while their emotional state was assessed by looking into anxiety levels, locomotion patterns and exploratory behaviour.
Their first conclusion was that, like AD patients, rats put into stressful situations or receiving GC were much more anxious than controls. These rats also showed less exploratory interest than control animals. Spatial reference memory – which is involved in learning from repeated experiences, such as those in the maze – was similarly impaired by stress or by administration of GC or Ab. This last result again supports the conclusion that GC and Ab act in AD through the same biochemical mechanism. When stress and GC were applied together, spatial reference memory impairment increased, revealing a cumulative effect of the various factors.
In conclusion, Catania, Sousa, Almeida and colleagues show that stress can contribute to Alzheimer’s disease in two ways: by inducing the production of known neurotoxic molecules – C99 and Ab – that affect neural function and cognitive behaviour, and – where there is a previous history of stress – by making animals more susceptible to the C99-inducing effects of GC and Ab. These results – if further confirmed – have important implications for understanding the mechanisms behind Alzheimer’s disease and its predisposing factors and, consequently, for possible therapeutic approaches.
Catania, Sousa, Almeida and colleagues’ research elucidates the mechanism behind the direct effects of stress and GC on the brain, and may also help explain how stress-mediated diseases, such as depression, affect brain function. The research also alerts to the need to investigate further the use of GC in AD therapy, and underlines the importance, when treating AD patients, of assessing any previous history of stress or GC therapy.
From: Molecular Psychiatry, 2007, advance online publication
“The amyloidogenic potential and behavioral correlates of stress”
Thursday, November 1, 2007
February 25, 2007
Formerly approved drug imparts lasting learning and memory improvements to impaired mice
Researchers may have finally found a drug candidate for reducing the mental retardation caused by Down syndrome, which afflicts more than 350,000 people in the U.S. Researchers gave low doses of a human drug to mice bred to mimic the learning and memory problems in people with Down syndrome. After as little as two weeks, the impaired mice performed as well as normal ones in learning tests, and the improvement lasted for up to two months after treatment ended.
But there is a catch: the drug was taken off the market 25 years ago after being found to cause dangerous seizures in some people. And many compounds that boost learning in mice fail in human trials.
Nevertheless, "anyone studying Down's is going to have their socks blown off by this," says geneticist Roger Reeves, a Down syndrome specialist at the Johns Hopkins School of Medicine in Baltimore, who was not involved in the study. "There hasn't been anything out there that we really could take to patients or that we had a strong possibility of taking into the clinic."
Researchers tested the drug, pentylenetetrazole (PTZ), as well as two other compounds—picrotoxin and a ginkgo biloba extract called bilobalide—because they all interfere with tiny ion channels on brain cells (neurons). When activated, the channels, known as GABAA receptors, inhibit the cells, making it harder for them to form new synapses, or connections, with neighboring neurons.
The deficits of Down syndrome may occur because the brain contains too many such inhibitory signals, says Stanford University neurobiologist Craig Garner, whose group performed the experiments. "In order to learn, you have to have a period during which synapses can get stronger or weaker," he says. "This changing is what's not possible when you have too much inhibition."
So Garner, his student Fabian Fernandez, and their colleagues gave their mice either low doses of PTZ mixed with milk, or low-dose injections of picrotoxin or bilobalide, daily for two to four weeks to slightly raise the level of excitation in the brain. Immediately after treatment, the animals' scores on two memory tests—for recognizing objects they had seen before or remembering how they last entered a maze—were on par with normal mice; two months later, they still did much better than they normally would, the researchers report in a paper appearing online February 25 in Nature Neuroscience.
The treatment "is allowing the normal properties of neurons to work," Garner says. "This slowly over time leads to an improved circuit."
Reeves says there may be other ways to treat Down syndrome, but "you can see your way to clinical testing most easily from here," because researchers identified specific chemicals. "It's hugely promising," he says. "Maybe it will have a big effect, but we don't know that." The inhibition model is plausible, but still unproved in people, he notes, and until researchers better understand the mechanisms by which the compounds work, "I'm wary of rushing into the clinic."
Garner says clinical trials of PTZ could begin in the next year or two, and evaluating them might take five to 10 years. He notes that although PTZ is nearly 100 years old and was used to treat psychiatric disorders and later dementia, researchers never concluded it was effective. It also caused seizures (at doses 100-fold higher than those given to the mice), so the FDA revoked its approval in 1982.
In Down syndrome, chromosome 21 is present in three copies instead of two. Similarly, the mice used in the study have a duplicated piece of chromosome 16. As in Down syndrome, these animals have malformed facial bones and problems forming new memories.
Reeves notes that many researchers have long considered Down syndrome too complex to crack, but the study "serves as notice to the neuroscience community that there are a lot of interesting things to do here. This is not some vague, mega complex issue."
June 26, 2007
Finding could set the stage for ways to reverse damage in sufferers of the inherited fragile X syndrome
In a case of life imitating art, researchers at the Massachusetts Institute of Technology (M.I.T.) reported today that they had successfully reversed mental retardation in mice, just as scientists did in the classic 1966 novel Flowers for Algernon. In the book by Daniel Keyes, scientists use experimental surgery—first tested on a mouse named Algernon—to dramatically boost the intelligence of a mentally retarded janitor named Charlie Gordon. Now M.I.T. scientists report in Proceedings of the National Academy of Sciences USA that they ameliorated brain damage in mice caused by a genetic disorder known as fragile X syndrome by blocking an enzyme involved in cellular development.
Fragile X affects one in 4,000 boys and one in 6,000 girls. It is caused by a mutation in the fragile X mental retardation 1 gene (FMR1)—located on the X sex chromosome—that results in the loss of the fragile X mental retardation protein (FMRP). The resulting illness is characterized by hyperactivity, attention deficit, repetitive behavior, anxiety and cognitive difficulties ranging from learning disability to mental retardation.
Previous studies of fragile X show that nerve cells in the cortex (the outermost layer of the brain) of both patients and knockout mice have dendrites (rootlike projections that nerve cells use to receive electrical signals from peers) with a greater number of branches. These extra shoots are thin, stubby, immature and less functional than normal dendrites, causing poor transmission among neurons across synapses (the spaces between nerve cells).
When studying the formation of dendrites for a 2004 paper, Mansuo Hayashi, a research affiliate in M.I.T.'s Picower Institute for Learning and Memory, discovered that these structures could be strengthened and altered to transmit information more efficiently by inhibiting nerve cell production of the enzyme called p21-activated kinase (PAK). PAK regulates actin, another protein, which shapes parts of the cell (including the dendrites). When PAK is inhibited, more actin is manufactured and the dendrites are able to properly mature.
In the current study, Hayashi led a team that worked with "double mutant" mice. First, the researchers mutated FMR1 so that the mice would lack FMRP and thus show symptoms of fragile X. A second mutation, to the gene that codes for PAK, caused forebrain cells in the mice to stop producing the enzyme three weeks after birth. "When these two mutants were crossed together, [the mice] were normal," says senior study author Susumu Tonegawa, a professor of neuroscience and biology at Picower.
Not only did the dendrites of the fragile X–afflicted mice become more robust and efficient, but researchers report that several of the behavioral symptoms associated with the disorder also abated: Double mutant mice behaved more like normal mice than like those with fragile X.
William Greenough, a professor of psychology at the University of Illinois at Urbana-Champaign, says that the study demonstrates the long-term potential of gene therapy. "I am impressed by the kinds of different symptoms … that seem to be under [the] governance of this single signaling pathway," he says. "It seems to affect everything from neuronal morphology (or shape) to behavior that may or may not be coupled to that morphology."
Tonegawa says that the results may pave the way to a new molecular target for drugs. "If you take a fragile X patient and feed them or inject them with a compound that will inhibit their own endogenous PAK enzyme," he says, "it may reduce the fragile X symptoms." He notes that a significant finding of the study is that the effects of fragile X—which is typically diagnosed only after a child shows marked developmental delays, such as being late to crawl, walk or talk—could be reversed even after onset.
How the brain parses music—and pays attention
It's a classic cocktail party conundrum: How do our brains decide where we should train our attention when people are milling all about us chatting away—some to us, some to others?
In an attempt to find out, researchers at Stanford University and McGill University in Montreal scanned the brains of 18 subjects who were listening to classical music by 18th-century British composer William Boyce.
"You have to kind of segment these streams [of information] into chunks," says senior study author Vinod Menon, an associate professor of psychiatry and behavioral science at Stanford. The process of slicing the data, he continues, requires that you "identify something that's interesting and then you have to switch on the attentional network."
"Memory doesn't work like video recorder, it's more like DVD," in how it recalls events as discrete chapters, explains study co-author Daniel Levitin, a psychology professor at McGill.
But why music?
Simple, says Sridhar Devarajan, a Stanford neuroscience graduate student involved in the project. "Transitions between musical movements," he notes, "offer an ideal setting to study the dynamically changing landscape of activity in the brain during this segmentation process."
The team wove together different movements from several of Boyce's four- to five-minute symphonies and had volunteers listen to two nine-minute compositions through noise-canceling headphones while lying in an fMRI machine; each of the musical tapestries consisted of 10 orchestral movements. (Boyce's compositions were selected because he is a relatively obscure composer and subjects were less likely to be familiar with his work, canceling out brain activity that would be generated if they had recollections of his symphonies.)
The team conducted full brain scans that allowed them to focus on regions of the brain of particular interest, monitoring them from 10 seconds before a transition between movements to 10 seconds after. (A transition between movements is marked by a decline in the amplitude of sound followed by a brief period of silence that leads into a new section of music.)
According to Menon, during the transition periods the team observed not only activity in discrete brain regions but also co-active networks (two areas responding simultaneously) reacting to the musical shifts. Activity began in areas in the forward section of the prefrontal cortex as well as in parts of the temporal cortex just above the brain stem. Menon speculates that it is this network that is "detecting a salient change" in the information stream. Next, areas toward the rear of the prefrontal cortex and parts of the parietal cortex (the outermost layer of the parietal lobe at the upper rear of the brain) began to respond. These regions, Menon notes, are linked to attention and working memory.
"We feel that it could be a very general brain mechanism…part of a core control signal that is generated in response to any attention-demanding task," Menon says. "I think any task that involves detecting a salient event and attending to it will command a similar type of response."
Levitin adds, "Here we had a peak of activation…associated with nothingness," when there was no sound at all. "[Clearly], there are neural processes that are responsible for signaling the beginning and ending of events."
Next up for Menon's team: trying to determine the next level of processing after recognizing and responding to a change in an information stream. It also plans to apply whatever new information it discovers to learn more about "what structure actually means in music."
Stem cell transplants rescue memory problems in mice, but not in the way you might expect
A new study finds that neural stem cells may be able to save dying brain cells without transforming into new brain tissue, at least in rodents. Researchers from the University of California, Irvine, report that stem cells rejuvenated the learning and memory abilities of mice engineered to lose neurons in a way that simulated the aftermath of Alzheimer's disease, stroke and other brain injuries.
Researchers generally expect stem cells to transform into replacement tissue for damaged cells. In this case, however, the undifferentiated stem cells, harvested from 14-day-old mouse brains, did not simply replace neurons that had died off. Rather, the group speculates that the transplanted cells secreted neurotrophins, protective proteins that promote cell survival by keeping neurons from undergoing apoptosis (programmed cell death). Spared from that fate, the once ill-fated neurons strengthened their interconnections and kept functioning.
"The primary implication here is that stem cells can help rescue memory deficits that are due to cell loss," says Frank LaFerla, a professor of neurobiology and behavior at U.C. Irvine and the senior author on a new study published in The Journal of Neuroscience. If the therapeutic benefit was indeed solely due to a neurotrophic factor, the door could be opened to using that protein alone as a drug to restore learning ability.
LaFerla's team genetically engineered mice to lose cells in their hippocampus, a region in the forebrain important for short-term memory formation. These mice were about twice as likely as unaltered rodents to fail a test of their ability to discern whether an object in a cage had been moved since their previous visit.
But when the mutant mice were injected with about 200,000 stem cells directly into their hippocampi and retested up to three months later, the injured animals performed up to par with their normal counterparts.
LaFerla's team found that in healthy mice that were similarly injected, the stem cells (which were marked with a green fluorescent dye) had spread throughout the brain. In the brains of the diseased mice, however, nearly all the cells congregated in the hippocampi. "Somehow, in the damaged region, there is some kind of signal that's telling the stem cells to stay local," LaFerla explains.
Curiously, the researchers discovered that only about five percent of the stem cells injected into the brain-damaged mice matured into adult neurons. The surrounding neurons that were there all along, however, had sprouted a denser set of connections with other cells, presumably allowing for better transmission of information and recovery of function. "We think it's some neurotrophic factor being secreted by the [stem] cells," LaFerla says. If his group can identify it, he adds, they can answer the question: "Can that substance [alone] be provided to the brain and rescue the memory deficit?"
Eugene Redmond, a professor of psychiatry and surgery at Yale University School of Medicine, notes that the new work is "certainly well done. Their conclusion is similar to our study in Parkinsonian monkeys." He notes that in his study there was evidence of stem cells replacing lost neurons as well as other benefits conferred by the transplant.
Monday, October 29, 2007
Medical News Today
24 Oct 2007
A new US study suggests that people with more years of formal education experience the onset of dementia-related memory decline later, but that once the decline begins it proceeds more rapidly than in people with fewer years of education.
The study is published in the October 23rd issue of the journal Neurology and is the work of Dr Charles B Hall, of the Albert Einstein College of Medicine in Bronx, New York, and colleagues.
Hall said that:
"Higher levels of education delay the onset of dementia, but once it begins, the accelerated memory loss is more rapid in people with more education."
He and his colleagues found that a person with 16 years of formal education experienced a 50 per cent faster rate of memory decline than someone with only 4 years of education.
The researchers wanted to test the cognitive reserve hypothesis. This suggests that people with more years of education have a reserve of cognitive ability that masks the onset of memory decline, often without the person realising it (for instance, they use thinking skills to work out answers to memory tests). This could explain why low education is a well known risk factor for Alzheimer's Disease (AD), a condition characterised by accelerated memory decline.
Hall and colleagues followed 488 people in the Bronx Aging Study, of whom 117 developed dementia. The participants underwent detailed assessment of their cognitive skills at the start of the study and every year during the 6 years of follow-up, when they completed a memory test called the Buschke Selective Reminding Test. Their formal education ranged from fewer than 3 years of elementary school to completion of postgraduate education.
The researchers estimated the point at which the rate of cognitive decline began to accelerate (the change point as they called it), and the rates of decline before and after this point.
They found that every additional year of formal education delayed the change point by 0.21 years (2.5 months). After the change point, the rate of memory decline increased by 0.10 points for each year of additional formal education. This translated to a 4 per cent faster rate of decline for each additional year of formal education.
The researchers gave an example. The onset of accelerated memory decline for a college graduate with 16 years of formal education who is diagnosed with dementia at 85 years of age would have started four years earlier, at age 81. But a person with only 4 years of formal education, who is diagnosed at the same age of 85, would have started to experience a slower rate of decline six years before diagnosis, at age 79.
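The two reported coefficients line up with the 50 per cent figure quoted earlier. Over the 12-year gap between 16 and 4 years of schooling, a 0.21-year delay per year of education and a 4 per cent faster decline per year add up as in this minimal sketch (only the coefficients come from the study; the function names and the linear form are illustrative assumptions):

```python
def change_point_delay(extra_years_education):
    """Years by which the onset of accelerated decline is postponed."""
    return 0.21 * extra_years_education

def relative_decline_rate(extra_years_education):
    """Multiplier on the post-change-point rate of memory decline."""
    return 1 + 0.04 * extra_years_education

# Comparing 16 years of education with 4 years (a 12-year difference):
extra = 16 - 4
print(round(change_point_delay(extra), 2))    # 2.52 -- onset delayed about 2.5 years
print(round(relative_decline_rate(extra), 2)) # 1.48 -- roughly the "50 per cent faster" decline
```

Note that the worked example in the article (a 4-year versus 6-year pre-diagnosis window) reflects the interplay of both effects, not the delay coefficient alone.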
This seemed to reflect previous research findings that showed people with more years of education suffered memory loss more quickly once they were diagnosed with dementia, wrote the researchers.
The researchers concluded that:
"As predicted by the cognitive reserve hypothesis, higher education delays the onset of accelerated cognitive decline; once it begins it is more rapid in persons with more education."
Commenting on the findings, Hall said that:
"This rapid decline may be explained by how people with more education have a greater cognitive reserve, or the brain's ability to maintain function in spite of damage."
"So while they're often diagnosed with dementia at a later date, once the cognitive reserve is no longer able to compensate for the damage that's occurred, then the symptoms emerge," he added.
The researchers wrote that while their study was important because it was the first to test the cognitive reserve hypothesis in people with preclinical dementia, they pointed out that the people in the study were born at the turn of the 20th century, and their life experiences and education may not be representative of people in education today.
"Education delays accelerated decline on a memory test in persons who develop dementia."
Hall, C. B., Derby, C., LeValley, A., Katz, M. J., Verghese, J., Lipton, R. B.
Neurology 2007 69: 1657-1664
Sunday, October 28, 2007
A team of neuroscientists announced a scientific breakthrough last week in the use of brain scans to discover what's on someone's mind.
Researchers from the Max Planck Institute for Human Cognitive and Brain Sciences, along with scientists from London and Tokyo, asked subjects to secretly decide in advance whether to add or subtract two numbers they would later be shown. Using computer algorithms and functional magnetic resonance imaging, or fMRI, the scientists were able to determine with 70 percent accuracy what the participants' intentions were, even before they were shown the numbers.
The study used "multivariate pattern recognition" to identify oxygen flow in the brain that occurs in association with specific thoughts. The researchers trained a computer to recognize these flow patterns and to extrapolate from what it had learned to accurately read intentions.
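As a rough illustration of what "training a computer on flow patterns" involves, the sketch below classifies synthetic activation vectors with a nearest-centroid rule. Everything here (the voxel count, noise level and centroid method) is an assumption chosen for illustration; the study's actual multivariate algorithm is not described in this article.

```python
import numpy as np

rng = np.random.default_rng(0)
VOXELS = 20  # hypothetical number of measured brain locations

# Each intention ("add" vs. "subtract") gets a characteristic mean
# activation pattern; individual trials are that pattern plus noise.
add_proto = rng.normal(size=VOXELS)
sub_proto = rng.normal(size=VOXELS)

def trials(proto, n):
    return proto + rng.normal(scale=1.5, size=(n, VOXELS))

train = np.vstack([trials(add_proto, 30), trials(sub_proto, 30)])
labels = np.array([0] * 30 + [1] * 30)

# "Training" here is just averaging each class's trials into a centroid.
centroids = np.stack([train[labels == c].mean(axis=0) for c in (0, 1)])

def predict(x):
    # Assign the class whose centroid is nearest in Euclidean distance.
    return int(np.linalg.norm(x - centroids[1]) < np.linalg.norm(x - centroids[0]))

test = np.vstack([trials(add_proto, 50), trials(sub_proto, 50)])
truth = np.array([0] * 50 + [1] * 50)
accuracy = (np.array([predict(x) for x in test]) == truth).mean()
print(accuracy)  # well above the 0.5 guessing baseline on this toy data
```

Real fMRI decoders face far noisier data and must be fit per subject, which is part of why the reported 70 percent accuracy, not the method itself, is the news here.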
The finding raises issues about the application of such tools for screening suspected terrorists -- as well as for predicting future dangerousness more generally. Are we closer than ever to the crime-prediction technology of Minority Report?
As I've argued in this space before, the popular press tends to over-dramatize scientific advances in mind reading. fMRI results have to account for heart rate, respiration, motion and a number of other factors that might all cause variance in the signal. Also, individual brains differ, so scientists need to study a subject's patterns before they can train a computer to identify those patterns or make predictions.
While the details of this particular study are not yet published, the subjects' limited options of either adding or subtracting the numbers means the computer already had a 50/50 chance of guessing correctly even without fMRI readings. The researchers indisputably made physiological findings that are significant for future experiments, but we're still a long way from mind reading.
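The 50/50 baseline matters because how impressive 70 percent accuracy is depends on the number of trials, which isn't given here. A quick binomial check (the trial counts below are hypothetical):

```python
from math import comb

def p_at_least(k, n, p=0.5):
    """Chance of k or more correct out of n pure 50/50 guesses."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Probability that blind guessing reaches 70 percent accuracy:
for n in (10, 50, 100):
    print(n, round(p_at_least(int(0.7 * n), n), 4))
```

Over only 10 trials, guessing hits 70 percent fairly often; over 100 trials it almost never does, so the sample size determines whether the result is meaningful.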
Still, the more we learn about how the brain operates, the more predictable human beings seem to become. In the Dec. 19, 2006, issue of The Economist, an article questioned the scientific validity of the notion of free will: Individuals with particular congenital genetic characteristics are predisposed, if not predestined, to violence.
Studies have shown that genes and organic factors like frontal lobe impairments, low serotonin levels and dopamine receptors are highly correlated with criminal behavior. Studies of twins show that heredity is a major factor in criminal conduct. While no one gene may make you a criminal, a mixture of biological factors, exacerbated by environmental conditions, may well do so.
Looking at scientific advances like these, legal scholars are beginning to question the foundational principles of our criminal justice system.
For example, University of Florida law professor Christopher Slobogin, who is visiting at Stanford this year, has set forth a compelling case for putting prevention before retribution in criminal justice.
Two weeks ago, Slobogin gave a talk based on his book, Minding Justice. He pointed to the studies showing that our behavior is predetermined or strongly influenced by biology, and that if we can identify those biological factors, we can predict behavior. He argues that the justice system should provide treatment for potential wrongdoers based on predictions of dangerousness instead of settling for punishing them after the fact.
It's a tempting thought. If there is no such thing as free will, then a system that punishes transgressive behavior as a matter of moral condemnation does not make a lot of sense. It's compelling to contemplate a system that manages and reduces the risk of criminal behavior in the first place.
Yet, despite last week's announcement from the Max Planck Institute, neuroscience and bioscience are not at a point where we can reliably predict human behavior. To me, that's the most powerful objection to a preventative justice system -- if we aren't particularly good at predicting future behavior, we risk criminalizing the innocent.
We aren't particularly good at rehabilitation, either, so even if we were sufficiently accurate in identifying future offenders, we wouldn't really know what to do with them.
Nor is society ready to deal with the ethical and practical problems posed by a system that classifies and categorizes people based on oxygen flow, genetics and environmental factors that are correlated as much with poverty as with future criminality.
"Minority Report," a short story by Philip K. Dick that became the 2002 Steven Spielberg blockbuster, portrays a society where investigators can learn in advance that someone will commit a murder, even before the offender himself knows what he will do. Gattaca, a
1997 film, tells of a society that discriminates against genetically "inferior" individuals.
Science fiction has long grappled with the question of how a society that can predict future behavior should act. The stories suggest that people are more varied, more free, than the computer models allow. They also suggest that a system based on predictions is subject not only to innocent mistakes but also to malicious manipulation at a level far greater than our current punishment-oriented regime.
In time, neuroscience may produce reliable behavior predictions. But until then, we should take the lessons of science fiction to heart when deciding how to use new predictive techniques.
LONDON -- They are life's perennial questions: Pepsi or Coke, Mac or PC, cremation or burial?
Scientists in the United Kingdom believe they may be close to unraveling some of the brain processes that ultimately dictate the choices we make as consumers.
Using a revolutionary method of imaging the brain, researchers from the Open University (OU) and the London Business School say they have identified the brain region that becomes active as the shopper reaches to the supermarket shelf to make their final choice.
If that sounds a frivolous finding, the scientists involved insist their work has serious applications.
"How people use their brains to learn, record and store memory is a fundamental question in contemporary neuroscience," said Steven Rose, director of the brain and behavior research group and professor of biology at the Open University.
Beyond the local supermarket, Rose and his team believe the findings could go on to show the brain processes behind the conscious decisions people make when it comes to important life choices, such as selecting a partner or career. The research may also one day help manufacturers and marketers shape advertising and branding strategies for their products.
The scientists used a technique known as magnetoencephalography (MEG) -- the fastest of all scanner methods -- to measure the minute magnetic fields around the brain and identify regions that are active during the second or so it takes for a person to make their shopping choice.
Subjects were taken on an 18-minute virtual tour of a supermarket. During regular pauses in the tour, they were asked to make a choice between different brands and products on the shelves by pressing a button.
Researchers found that the brain was hugely active during the 2.5 seconds it took for the button press to be made.
"Within 80 milliseconds their visual cortex responds as they perceive the choice items," Rose said. "A little later, regions of the brain associated with memory and speech become active."
More interesting for the researchers, however, was what happened as consumers made their final choice.
"After about 800 milliseconds -- and this was the surprising thing -- if and only if they really prefer one of the choice items, a region called the right parietal cortex becomes active. This then is the region of the brain involved in making conscious decisions -- in this experiment about shopping choices, but maybe for more important life choices too," Rose said.
Other scientists gave a guarded welcome to the findings of the OU researchers -- Rose, Professor Stephen Swithenby, Dr. Sven Braeutigam and Dr. John Stins -- who will publish their findings in the next issue of the journal Neural Plasticity.
Michael Miller, professor and chair in the department of neuroscience and physiology at SUNY Upstate Medical University, said that MEG scans provided a unique insight into the real-time activity of brain regions in humans and animals. "They are very insightful. There is a growing literature about the role of the prefrontal cortex and other areas that are involved in volitional activities," he said.
Dr. Wise Young, director of the W.M. Keck Center for Collaborative Neuroscience and a professor at Rutgers University, said that the finding was "interesting" although not particularly surprising or groundbreaking.
"The parietal cortex has been conjectured as one of the sites of decision-making in the brain. Because that part of the brain is showing activity during decision-making does not necessarily mean that the decision is actually made in that part of the brain. Also, the authors examined only several sites on the brain and there may be parts of the brain that were activated but they did not record," he said.
Young said that for centuries scientists have struggled with the question of whether function is localized or distributed in the brain.
"Much data suggests that many functions are quite broadly distributed, the brain is quite plastic and quite remarkable recovery can occur after injury to large parts of the brain. On the other hand, it is also clear that some areas are more important for some functions than others," Young said.
Dr. Susan Bookheimer, director of the brain mapping center's behavioral testing and training laboratory at UCLA, was more skeptical. "When I first read this, I thought it was a joke. It is not an interesting finding," she said. "The investigators have added all this supermarket stuff to make it more appealing to a general audience, but you'd find the same thing if you just used pictures of dots, for example."
While Bookheimer agrees there are areas of the brain responsible for making choices, she believes the real story is much more complex than that presented by the OU study.
"We must integrate knowledge before we make a choice," she said. "Many, not all choices, involve an emotional component controlled by a different area of the brain. Others require integrating information from multiple choices. We have to generate a response, search available responses and then initiate a plan to demonstrate the choice. It is likely that only this last process was involved in the study here -– the process of directing one's reach toward the goal."
A person's optimism about the future seems to be controlled by a small front part of the mid-brain, according to a study that used brain imaging.
That area deep behind the eyes activates when people think good thoughts about what might happen in the future.
The more optimistic a person is, the brighter the area showed up in brain scans, the scientists reported in a small study published online Thursday in the journal Nature.
That same part of the brain, called the rostral anterior cingulate cortex (rACC), seems to malfunction in people suffering depression, said the study co-authors, Elizabeth Phelps of New York University and Tali Sharot of University College London.
Researchers gave 15 people functional magnetic resonance imaging scans while they thought about future possibilities.
When the participants thought about good events, both the rACC and the amygdala, which is involved in emotional responses including fear, were activated. But the correlation with optimism was strongest for the cingulate cortex.
The same study also found that people tended to think that happier events were closer in time and more vivid than the bad ones, even if they had no reason to believe it, Phelps said.
Psychologists have long known people have an "optimism bias," but the new study offers new details.
When researchers asked the subjects to think about 80 different future events that could be good, bad or neutral, they had a hard time getting people to think negatively, or even neutrally, about the future.
For example, when people were asked to ponder a future haircut, they imagined getting the best haircut of their lives, instead of just an ordinary trim, Phelps said.
The study makes sense and pulls together new and different parts of research on optimism and the brain, said Dan Schacter, a professor of psychology at Harvard University who wasn't part of the research.
Having our brains wired to optimism is generally a good thing because "if you were pessimistic about the future you would not be motivated to take a lot of action," Phelps said.
Among a sample of 184 young people being evaluated for psychiatric disorders and allergies, 105 (57 percent) had a history of allergic disorders, including asthma, hay fever, hives and eczema.
Psychiatric evaluations revealed that 124 (67 percent) had an internalizing disorder, either alone or in combination with an externalizing disorder, such as ADHD, oppositional defiant disorder and conduct disorder. The children in the sample were between 4 and 20 years old; their average age was 13.
Researchers found that youth with internalizing disorders were almost twice as likely to have a history of allergies as those with a diagnosis that wasn't classified as an internalizing or externalizing disorder. The psychiatric disorders in this comparison group included substance abuse, tic disorders, bed-wetting and attachment disorder.
Moreover, the association was found to be specific for "pure" internalizing disorders. That is, the likelihood of having a history of allergies was significant only among youths who had an internalizing disorder and no other psychiatric conditions.
"These findings add to the growing body of evidence supporting an association between anxiety, depressive, and allergic disorders," write Dr. Mauricio Infante and colleagues from University of
Wisconsin, Madison in the Journal of Clinical Psychiatry.
The findings also suggest that these psychiatric and medical disorders "may share risk factors and underlying pathways that contribute to the development of both types of disorders."
The Wisconsin team notes that studies are needed to identify the reasons for these associations so that effective treatment and prevention strategies that target both disorders can be developed.
SOURCE: Journal of Clinical Psychiatry, September 2007.
Monday, October 22, 2007
September 21, 2005
Pressure from insurance companies and competition from drug therapies are prompting analysts to get patients off the couch more quickly
By Jamie Talan
Wendy spent five years in psychoanalysis, delving so deeply into her mind that she could no longer see the connection between her adult problems and her teenage episodes of "cutting" her wrists. After she and her analyst had their final session, during which he welcomed her to move on with her life, Wendy was not completely happy, but she was happier than she ever had been. And that, psychologists say, is
Psychoanalysis probes the unconscious mind to unlock the mysteries that drive conscious emotions and behavior. The discipline is built on pillars set by Sigmund Freud a century ago. It is characterized by frequent sessions that can take place over many years, wherein patients are encouraged to freely associate whatever comes to mind as the analyst sits quietly and listens.
Today the practice is changing. The transformation is in part the
result of a better understanding of what works during self-analysis.
But increasingly, psychotherapy is changing just to survive, held
hostage to limits on insurance coverage determined by managed care
companies and facing replacement by psychoactive drugs that in the
long run are far cheaper than a patient's weekly visit to the
therapist's office. In this incarnation, it suddenly matters less that
symptoms may disappear without patients figuring out the underlying cause.
To keep psychoanalysis alive, contemporary therapists are revamping
Freud's theories. They have discarded some traditional beliefs and
have loosened requirements so patients can succeed in fewer sessions.
Many analysts are even talking to their patients and sharing their own
thoughts and feelings, a practice that Freud said would complicate the transference.
Some experts chafe at the changes, however. They say that short-term
therapy can be successful for some problems such as phobias but does
not work for personality disorders, chronic depression and other
substantial mental illnesses. They claim that managed care companies
make decisions based on cost, not on any science that shows what works
best for a specific condition. Insurance companies argue that patients
can do just as well on medication as they can with talk therapy and
that for talk, "short term" is enough.
Extended analysis certainly is under siege. Today patients having
long-term psychotherapy--more than 20 sessions--account for only 15
percent of those who seek treatment, according to a study in the
American Journal of Psychiatry. Psychoanalysts contend that it takes
longer to work out issues that have been shaped by a lifetime of
emotion and experience, yet they know they must compete in a
magic-pill era in which people may be content to have their symptoms
disappear without much thought to why they emerged in the first place.
"A better understanding of the self is needed for a better recovery,"
asserts Gail Saltz, a Manhattan analyst and author of Becoming Real
(Riverhead Trade, 2005), a book about the benefits of analysis. She
says that there are still people who lie on the couch four times a
week, but many analysts have accepted a once-a-week regimen. And
although studies have shown that certain patients progress better when
therapy is frequent, Saltz believes once a week can still be
successful. Psychologists have at least agreed that even long-term
analysis should be completed within four years.
Patients may be content to have symptoms disappear without much
thought to why they ever emerged.
Regardless of frequency, Saltz says, the goal is to help patients
"better tolerate the ups and downs of life" or, as Freud put it, "get
beyond everyday human misery." Freud developed his ideas before
scientists knew much about the brain's workings, however, and today
some of his once popular theories about human development are seen as outdated.
High on the list is that infants have complicated sexual desires.
Peter D. Kramer, a Massachusetts psychiatrist who popularized the new
generation of antidepressants in his best-selling book Listening to
Prozac (Penguin, 1993), says that "there is no evidence that infants
have sexual desires." Kramer notes that although Freud believed that
adult complaints of childhood sexual abuse stemmed from such childhood
fantasies, the evidence today is plain that sexual abuse of children
is common, affecting up to 20 percent of girls and 10 percent of boys.
Freud also had little to offer the therapist in understanding trauma,
which experts now know can cause lifelong problems. Trauma therapy is
a relatively new field, built on work with war veterans.
Post-traumatic stress disorder is a hot topic in psychotherapy today,
one that was poorly addressed before, Kramer notes, because it was not
possible to have effective treatment when the theoretical
underpinnings were shaky.
Friend, Not Father
Readdressing the basic tenets of psychoanalysis has led to perhaps the
most radical change of all: modern psychologists are actually talking
to their patients. Freud's original "transference" theory demanded
that an analyst remain quiet and aloof so as to serve as a "screen"
onto which the patient could project her emotions. But therapists are
now sharing more of themselves. "How can I remain opaque when my
clients can go online and learn that I love Greek music?" asks
psychoanalyst Spyros D. Orfanos, clinic director in psychoanalysis at
New York University.
Orfanos says that today's analyst is not an authoritative father
figure but a partner in figuring out "the powerful emotional forces
that drive behavior." He thinks that having a dialog with a patient
is the best way to work toward change. Many analysts also now agree
that empathy is key to the relationship, and empathy requires
engagement, not just listening.
Psychoanalysis is also changing in the face of steady competition from
other forms of help, such as cognitive behavioral therapy, in which
patients try to change certain troubling behaviors, and goal-oriented
therapy, which lays out ways to attain, say, certain kinds of
relationships. These practices may or may not touch on the patient's
past. And to hold its own, psychoanalysis is shedding its image as a
privileged treatment for the wealthy; so-called training centers that
provide low-cost appointments are popping up everywhere.
Scientists are also attempting to study the biology of the analysis
process itself. At New York-Presbyterian Hospital/Weill Cornell
Medical Center, psychiatrists Otto F. Kernberg and David A.
Silbersweig are recording brain scans of patients before and after
analysis. Such studies may help end the debate over the effectiveness
of lengthy treatment, notes Kramer, who recently published Against
Depression (Viking Adult, 2005), an assessment of mood disorders. "We
don't know what works or what doesn't work."
Orfanos is dubious about scanning, maintaining that analysis is a
humanistic endeavor that does not necessarily fit into a biology-based
medical model. "It's about understanding how your mind works," he
says, "so that you can have more choices in your life."
October 05, 2007
A pair of brain regions work together to assess the threat of punishment and override our selfish tendencies
Whether you subscribe to the Ten Commandments, the Golden Rule or some instinctive moral code, society functions largely because most of its denizens adhere to a set of norms that allow them to live together in relative tranquility.
But, why is it that we put a vast amount of social resources into keeping stealing, murdering and other unfair (not to mention violent and illegal) acts to a minimum? Seems it all comes down to the fact that most of us don't cotton to being punished by our peers.
"The reason why punishment for norm violations is important is that it disciplines the potential norm violators," says Ernst Fehr, an economist at the University of Zurich and the senior author of a paper on the issue published this week in Neuron.
In the new study, Fehr and colleagues uncovered activity in two areas of the brain underlying the neural mechanism involved in conforming to society's values. They further determined that subjects with Machiavellian personalities—a strong sense of self-interest, opportunism and manipulation—have heightened activity in one of these regions, which the authors believe is related to assessing the threat of punishment.
During the study, which also involved scientists at the University of Ulm in Germany, 23 male students were instructed to play a version of the "ultimatum game" while their brains were scanned via functional magnetic resonance imaging (fMRI). Each participant was given a sum of money (100 monetary units) to split however he chose with an anonymous partner. In some cases the recipient simply had to accept any offer made. Other times, after an offer was made, the recipient had the option to penalize the giver by taking some or all of his money, if the latter had not shared generously.
The subjects' brains were only scanned when they played the giver role. Before each trial, both players were told whether the recipient would be allowed to exact a punishment if he felt he got too slim a slice of the pie. Two areas of the cortex (the brain's primary processing unit) were particularly active during the trials when punishment was an option: the lateral orbitofrontal cortex, a region below the temples of the head that had, in previous research, been implicated in processing a threat stimulus, and a section just behind it called the dorsolateral prefrontal cortex.
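The giver's incentive structure described above can be sketched in a few lines of code. This is a toy model, not the study's actual payoff scheme: the punishment threshold, penalty sizes and the simple "punish stingy offers" rule are all illustrative assumptions.

```python
# Hypothetical sketch of the modified ultimatum game's incentives
# (the threshold and penalty values are illustrative, not from the study).
ENDOWMENT = 100  # monetary units given to the proposer

def proposer_payoff(offer, punishment_allowed, punishment=0):
    """Proposer keeps the endowment minus the offer, minus any punishment."""
    kept = ENDOWMENT - offer
    if punishment_allowed:
        kept -= punishment
    return max(kept, 0)

def recipient_punishes(offer, threshold=40):
    """Toy rule: the recipient penalizes offers below some threshold."""
    return offer < threshold

# Without the punishment option, a selfish proposer keeps almost everything.
selfish = proposer_payoff(offer=5, punishment_allowed=False)

# With punishment, a stingy offer can be wiped out by retribution...
punished = proposer_payoff(offer=5, punishment_allowed=True,
                           punishment=95 if recipient_punishes(5) else 0)

# ...so a strategically "Machiavellian" proposer offers just enough to avoid it.
strategic = proposer_payoff(offer=40, punishment_allowed=True,
                            punishment=60 if recipient_punishes(40) else 0)

print(selfish, punished, strategic)  # 95 0 60
```

Under this toy rule, the selfish strategy is optimal only when punishment is off the table, which mirrors the behavioral shift the researchers observed in their most Machiavellian subjects.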
"The lateral orbitofrontal cortex [activity] represents the punishment threat here," says Fehr, citing previous research that fingered it in threat assessment. "More specifically, how bad does the brain interpret this punishment threat?"
Alternatively, he says, "[the dorsolateral prefrontal cortex] is an area that is involved in cognitive control and overriding prepotent impulses. Here, we have a design where the prepotent impulse is not to share the money—at least to the extent that player B wants it shared."
Interestingly, the research team also had their subjects fill out a questionnaire to determine their degree of Machiavellian behavior. Those who proved to be the most ruthless of the bunch offered little to nothing when there was no threat of punishment, but within the punishment paradigm, they were generous enough to stave off retribution.
"These are socially intelligent, selfish people," Fehr says about the more calculating subjects. "They escape the punishments that are inherent in social interactions, because they seem to have a fine sense of when punishment is in the air."
Jorge Moll, principal investigator of the cognitive and behavioral neuroscience unit at the Rede Labs-D'Or Hospitals in Rio de Janeiro, says the most interesting findings were that individual scores on Machiavellianism predicted "how much a given subject will change his behavior depending on the presence of punishment," and "that the level of activity within the lateral orbitofrontal cortex is strongly related to Machiavellian personality style."
Researchers say the results could have wide-reaching implications, potentially paving the way to understand—and perhaps one day reverse—the neurobiology behind psychopathic and sociopathic personalities. They intend to repeat the study with patients suffering from antisocial anxiety and personality disorders to determine if their behavior can be explained by a lack of impulse control or a poor assessment of punishment.
Fehr argues the results could also impact the criminal justice system since the dorsolateral prefrontal cortex does not fully develop until after a person is around 20 years old.
"This area seems to be critically important in overriding self-interest," he says. Thus, "you just can't treat an immature adolescent the same way as a mature adult—that's at least my view of doing justice." It's unclear whether judges and juries see it that way, however.
By Amir S., Irvine, CA
Donald Wilson, a professor of zoology at the University of Oklahoma and co-author of the 2006 book Learning to Smell: Olfactory Perception from Neurobiology to Behavior, sniffs around for an answer.
In 2004 the Nobel Prize in Physiology or Medicine went to Linda B. Buck and Richard Axel for their research showing that there is a huge family of genes that encode proteins called olfactory receptors. Their findings, published in 1991, opened many doors toward understanding the function of the olfactory system.
One important observation was that individual olfactory sensory neurons typically express just one of those genes. Thus, signals coming from a given neuron provide information about odors that activate the specific receptor protein expressed by that cell. A single receptor protein, however, appears to bind (or recognize) many different odors. Thus, rather than having neurons that respond selectively to coffee or vanilla or Bordeaux, most individual cells (via their receptors) respond to submolecular features of the volatile chemicals coming from those objects. For example, an olfactory sensory receptor neuron may respond to a hydrocarbon chain of a particular length or a specific functional group like an alcohol or aldehyde.
This means that any given sensory neuron will respond to many different odors as long as they share a common feature. The brain (specifically, the olfactory bulb and olfactory cortex) then looks at the combination of sensory neurons activated at any given time and interprets that pattern in the context of previous patterns that have been experienced and other kinds of available information. The interpreted pattern is what you perceive as smell.
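The combinatorial logic described above can be illustrated with a small sketch. The receptor names and feature sets below are invented for the example; the point is only that each receptor responds to a submolecular feature, so an odor is encoded by the combination of receptors it activates rather than by any single "coffee" or "wine" neuron.

```python
# Illustrative sketch of combinatorial odor coding
# (receptor names and feature assignments are made up for the example).
receptor_features = {
    "R1": {"long_chain"},
    "R2": {"alcohol"},
    "R3": {"aldehyde"},
    "R4": {"alcohol", "long_chain"},
}

def activation_pattern(odor_features):
    """Return the set of receptors whose preferred features appear in the odor."""
    return {r for r, feats in receptor_features.items()
            if feats & odor_features}

# Two odors yield two distinct receptor combinations, even though each
# individual receptor responds to many different odors.
vanilla_like = activation_pattern({"aldehyde"})            # {'R3'}
wine_like = activation_pattern({"alcohol", "long_chain"})  # {'R1', 'R2', 'R4'}
print(sorted(vanilla_like), sorted(wine_like))
```

The brain's job, in this caricature, is to recognize the whole pattern, which is why the code stays stable even as individual sensory neurons die and are replaced.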
Olfactory sensory neurons, which sit in the mucus in the back of the nose and relay data into the brain via axons (fingerlike projections that transmit information out from the cell body), do not live forever. In fact, they are one of the increasingly large number of neuron types that are known to die and be replaced throughout life.
Fortunately, they do not all die at the same time. There are many thousands of olfactory sensory neurons expressing any particular olfactory receptor. When a small subset dies, the pattern of activity that the olfactory processing regions in the brain receives for a specific smell doesn't change very much. In fact, when an olfactory sensory neuron expressing a particular receptor gene dies and a new neuron expressing that same gene matures, the new neuron's axons plug in to the same group of olfactory bulb neurons that its predecessor did. This results in remarkable pattern stability over years, despite continual rewiring.
Let's imagine that on a recent holiday you tried a new wine. The odor (or bouquet, to oenophiles) of that wine is composed of dozens of different volatile chemicals, and each chemical has several submolecular features. Therefore, the wine activates a complex pattern of olfactory sensory neurons by engaging each of their olfactory receptor proteins, which recognize these different features. This pattern is processed and remembered by neural circuits in the olfactory bulb and olfactory cortex.
Several weeks later when you return home, you find the wine in your local market. Despite having replaced at least a subset of the olfactory sensory neurons that first interacted with that wine's unique odor, you are still able to recognize its aroma when you pour a glass, because the overall pattern of activity within the olfactory system remains relatively constant.
It could not have been easy being Elliott Roosevelt. If the alcohol wasn't getting him, the morphine was. If it wasn't the morphine, it was the struggle with depression. Then, of course, there were the constant comparisons with big brother Teddy.
In 1883, the year Elliott began battling melancholy, Teddy had already published his first book and been elected to the New York State assembly. By 1891—about the time Elliott, still unable to establish a career, had to be institutionalized to deal with his addictions—Teddy was U.S. Civil Service Commissioner and the author of eight books. Three years later, Elliott, 34, died of alcoholism. Seven years after that, Teddy, 42, became President.
Elliott Roosevelt was not the only younger sibling of an eventual President to cause his family heartaches—or at least headaches. There was Donald Nixon and the loans he wangled from billionaire Howard Hughes. There was Billy Carter and his advocacy on behalf of the pariah state Libya. There was Roger Clinton and his year in jail on a cocaine conviction. And there is Neil Bush, younger sib of both a President and a Governor, implicated in the savings-and-loan scandals of the 1980s and recently gossiped about after the release of a 2002 letter in which he lamented to his estranged wife, "I've lost patience for being compared to my brothers."
Welcome to a very big club, Bro. It can't be easy being a runt in a litter that includes a President. But it couldn't have been easy being Billy Ripken either, an unexceptional major league infielder craning his neck for notice while the press swarmed around Hall of Famer and elder brother Cal. It can't be easy being Eli Manning, struggling to prove himself as an NFL quarterback while big brother Peyton polishes a Super Bowl trophy and his superman stats. And you may have never heard of Tisa Farrow, an actress of no particular note beyond her work in the 1979 horror film Zombie, but odds are you've heard of her sister Mia.
Of all the things that shape who we are, few seem more arbitrary than the sequence in which we and our siblings pop out of the womb. Maybe it's your genes that make you a gifted athlete, your training that makes you an accomplished actress, an accident of brain chemistry that makes you a drunk instead of a President. But in family after family, case study after case study, the simple roll of the birth-date dice has an odd and arbitrary power all its own.
The importance of birth order has been known—or at least suspected—for years. But increasingly, there's hard evidence of its impact. In June, for example, a group of Norwegian researchers released a study showing that firstborns are generally smarter than any siblings who come along later, enjoying on average a three-point IQ advantage over the next eldest—probably a result of the intellectual boost that comes from mentoring younger siblings and helping them in day-to-day tasks. The second child, in turn, is a point ahead of the third. While three points might not seem like much, the effect can be enormous. Just 2.3 IQ points can correlate to a 15-point difference in SAT scores, which makes an even bigger difference when you're an Ivy League applicant with a 690 verbal score going head to head against someone with a 705. "In many families," says psychologist Frank Sulloway, a visiting scholar at the University of California, Berkeley, and the man who has for decades been seen as the U.S.'s leading authority on birth order, "the firstborn is going to get into Harvard and the second-born isn't."
The differences don't stop there. Studies in the Philippines show that later-born siblings tend to be shorter and weigh less than earlier-borns. (Think the slight advantage the 6-ft. 5-in. [196 cm] Peyton Manning has over the 6-ft. 4-in. [193 cm] Eli doesn't help when he's trying to throw over the outstretched arms of a leaping lineman?) Younger siblings are less likely to be vaccinated than older ones, with last-borns getting immunized sometimes at only half the rate of firstborns. Eldest siblings are also disproportionately represented in high-paying professions. Younger siblings, by contrast, are looser cannons, less educated and less strapping, perhaps, but statistically likelier to live the exhilarating life of an artist or a comedian, an adventurer, entrepreneur, GI or firefighter. And middle children? Well, they can be a puzzle—even to researchers.
For families, none of this comes as a surprise. There are few extended clans that can't point to the firstborn, with the heir-apparent bearing, who makes the best grades, keeps the other kids in line and, when Mom and Dad grow old, winds up as caretaker and executor too. There are few that can't point to the lost-in-the-thickets middle-born or the wild-child last-born.
Indeed, to hear families tell it, the birth-order effect may only be getting stronger. In the past, girls were usually knocked out of the running for the job and college perks their place in the family should have accorded them. In most other ways, however, there was little to distinguish a first-, second- or third-born sister from a first-, second- or third-born brother. Now, with college and careers more equally available, the remaining differences have largely melted away.
"There are stereotypes out there about birth order, and very often those stereotypes are spot-on," says Delroy Paulhus, a professor of psychology at the University of British Columbia in Vancouver. "I think this is one of those cases in which people just figured things out on their own."
But have they? Stack up enough anecdotal maybes, and they start to look like a scientific definitely. Things that appear definite, however, have a funny way of surprising you, and birth order may conceal all manner of hidden dimensions—within individuals, within families, within the scientific studies. "People read birth-order books the way they read horoscopes," warns Toni Falbo, professor of educational psychology at the University of Texas. "'I'm a middle-born, so that explains everything in my life'—it's just not like that." Still, such skepticism does not prevent more and more researchers from being drawn to the field, and as they are, their findings, and the debate over them, continue to grow.
Humans aren't alone
If you think it's hard to manage the birth-order issues in your family, be thankful you're not an egret or an orange blossom. Egrets are not the intellectual heavyweights of the animal kingdom—or even the bird world—but nature makes them remarkably cunning when it comes to planning their families. Like most other birds, egrets lay multiple eggs, but rather than brooding them all the same way so that the chicks emerge on more or less the same day, the mother begins incubating her first and second eggs before laying the remaining ones in her clutch. That causes the babies to appear on successive days, which gives the first-arriving chick the earliest crack at the food and a 24-hour head start on growth. The second-hatched may not have too difficult a time catching up, but the third may struggle. The fourth and beyond will have the hardest go, getting pushed aside or even pecked to death if food, water and shelter become scarce. All that makes for a nasty nursery, but that's precisely the way the mother wants it. "The parents overproduce a bit," says Douglas Mock, professor of zoology at the University of Oklahoma, "maybe making one more baby than they can normally afford to raise and then letting it take the fall if the resource budget is limited."
Orange trees are even tougher on their young. A typical orange tree carries about 100,000 pollinated blossoms, each of which is a potential orange, complete with the seeds that are potential trees. But in the course of a season, only about 500 oranges are actually produced. The tree determines which ones make the cut, shedding the blossoms that are not receiving enough light or that otherwise don't seem viable. It is, for a tree, a sort of selective termination on a vast scale. "You've got 99% of the babies being thrown out by the parent," says Mock. "The tree just drops all the losers."
Even mammals, warm-blooded in metabolism and—we like to think—temperament, can play a similarly pitiless game. Runts of litters are routinely ignored, pushed out or consigned to the worst nursing spots somewhere near Mom's aft end, where the milk flow is the poorest and the outlook for survival the bleakest. The rest of the brood is left to fight it out for the best, most milk-rich positions.
Humans, more sentimental than birds, trees or litter bearers, don't like to see themselves as coming from the same child-rearing traditions, but we face many of the same pressures. As recently as 100 years ago, children in the U.S. had only about a 50% chance of surviving into adulthood, and in less developed parts of the world, the odds remain daunting. It can be a sensible strategy to have multiple offspring to continue your line in case some are claimed by disease or injury.
While the eldest in an overpopulated brood has it relatively easy—getting 100% of the food the parents have available—things get stretched thinner when a second-born comes along. Later-borns put even more pressure on resources. Over time, everyone might be getting the same rations, but the firstborn still enjoys a caloric head start that might never be overcome.
Food is not the only resource. There's time and attention too and the emotional nourishment they provide. It's not for nothing that family scrapbooks are usually stuffed with pictures and report cards of the firstborn and successively fewer of the later-borns—and the later-borns notice it. Educational opportunities can be unevenly shared too, particularly in families that can afford the tuition bills of only one child. Catherine Salmon, an assistant professor of psychology at the University of Redlands in Redlands, Calif., laments that even today she finds it hard to collect enough subjects for birth-order studies from the student body alone, since the campus population is typically overweighted with eldest sibs. "Families invest a lot in the firstborn," she says.
All of this favoritism can become self-reinforcing. As parental pampering produces a fitter, smarter, more confident firstborn, Mom and Dad are likely to invest even more in that child, placing their bets on an offspring who—in survival terms at least—is looking increasingly like a sure thing. "From a parental perspective," says Salmon, "you want offspring who are going to survive and reproduce."
Firstborns do more than survive; they thrive. In a recent survey of corporate heads conducted by Vistage, an international organization of CEOs, poll takers reported that 43% of the people who occupy the big chair in boardrooms are firstborns, 33% are middle-borns and 23% are last-borns. Eldest siblings are disproportionately represented among surgeons and M.B.A.s too, according to Stanford University psychologist Robert Zajonc. And a recent study found a statistically significant overload of firstborns in what is—or at least ought to be—the country's most august club: the U.S. Congress. "We know that birth order determines occupational prestige to a large extent," says Zajonc. "There is some expectation that firstborns are somehow better qualified for certain occupations."
Little sibs, big role
For eldest siblings, this is a pretty sweet deal. There is not much incentive for them to change a family system that provides them so many goodies, and typically they don't try to. Younger siblings see things differently and struggle early on to shake up the existing order. They clearly don't have size on their side, as their physically larger siblings keep them in line with what researchers call a high-power strategy. "If you're bigger than your siblings, you punch 'em," Sulloway says.
But there are low-power strategies too, and one of the most effective ones is humor. It's awfully hard to resist the charms of someone who can make you laugh, and families abound with stories of last-borns who are the clowns of the brood, able to get their way simply by being funny or outrageous. Birth-order scholars often observe that some of history's great satirists—Voltaire, Jonathan Swift, Mark Twain—were among the youngest members of large families, a pattern that continues today. Faux bloviator Stephen Colbert—who yields to no one in his ability to get a laugh—often points out that he's the last of 11 children.
Such examples might be little more than anecdotal, but personality tests show that while firstborns score especially well on the dimension of temperament known as conscientiousness—a sense of general responsibility and follow-through—later-borns score higher on what's known as agreeableness, or the simple ability to get along in the world. "Kids recognize a good low-power strategy," says Sulloway. "It's the way any sensible organism sizes up the niches that are available."
Even more impressive is how early younger siblings develop what's known as the theory of mind. Very small children have a hard time distinguishing the things they know from the things they assume other people know. A toddler who watches an adult hide a toy will expect that anyone who walks into the room afterward will also know where to find it, reckoning that all knowledge is universal knowledge. It usually takes a child until age 3 to learn that that's not so. For children who have at least one elder sibling, however, the realization typically comes earlier. "When you're less powerful, it's advantageous to be able to anticipate what's going on in someone else's mind," says Sulloway.
Later-borns, however, don't try merely to please other people; they also try to provoke them. Richard Zweigenhaft, a professor of psychology at Guilford College in Greensboro, N.C., who revealed the overrepresentation of firstborns in Congress, conducted a similar study of picketers at labor demonstrations. On the occasions that the events grew unruly enough to lead to arrests, he would interview the people the police rounded up. Again and again, he found, the majority were later- or last-borns. "It was a statistically significant pattern," says Zweigenhaft. "A disproportionate number of them were choosing to be arrested."
Later-borns are similarly willing to take risks with their physical safety. All sibs are equally likely to be involved in sports, but younger ones are likelier to choose the kinds that could cause injury. "They don't go out for tennis," Sulloway says. "They go out for rugby, ice hockey." Even when siblings play the same sport, they play it differently. Sulloway is currently collaborating on a study of 300 brothers who were major league ballplayers. Though the work is not complete, he is so far finding that the elder brothers excel at skills that involve less physical danger. Younger siblings are the ones who put themselves in harm's way—crouching down in catcher's gear to block an incoming runner, say. "It doesn't just hold up in this study but a dozen studies," Sulloway says.
It's not clear whether such behavior extends to career choice, but Sandra Black, an associate professor of economics at UCLA, is intrigued by findings that firstborns tend to earn more than later-borns, with income dropping about 1% for every step down the birth-order ladder. Most researchers assume this is due to the educational advantages eldest siblings get, but Black thinks there may be more to it. "I'd be interested in whether it's because the second child is taking the riskier jobs," she says.
Black's forthcoming studies will be designed to answer that question, but research by Ben Dattner, a business consultant and professor of industrial and organizational psychology at New York University, is showing that even when later-borns take conservative jobs in the corporate world, they approach their work in a high-wire way. Firstborn CEOs, for example, do best when they're making incremental improvements in their companies: shedding underperforming products, maximizing profits from existing lines and generally making sure the trains run on time. Later-born CEOs are more inclined to blow up the trains and lay new track. "Later-borns are better at transformational change," says Dattner. "They pursue riskier, more innovative, more creative approaches."
If eldest sibs are the dogged achievers and youngest sibs are the gamblers and visionaries, where does this leave those in between? That it's so hard to define what middle-borns become is largely due to the fact that it's so hard to define who they are growing up. The youngest in the family, but only until someone else comes along, they are both teacher and student, babysitter and babysat, too young for the privileges of the firstborn but too old for the latitude given the last. Middle children are expected to step up to the plate when the eldest child goes off to school or in some other way drops out of the picture—and generally serve when called. The Norwegian intelligence study showed that when firstborns die, the IQ of second-borns actually rises a bit, a sign that they're performing the hard mentoring work that goes along with the new job.
Stuck for life in a center seat, middle children get shortchanged even on family resources. Unlike the firstborn, who spends at least some time as the only-child eldest, and the last-born, who hangs around long enough to become the only-child youngest, middlings are never alone and thus never get 100% of the parents' investment of time and money. "There is a U-shaped distribution in which the oldest and youngest get the most," says Sulloway. That may take an emotional toll. Sulloway cites other studies in which the self-esteem of first-, middle- and last-borns is plotted on a graph and follows the same curvilinear trajectory.
The phenomenon known as de-identification may also work against a middle-born. Siblings who hope to stand out in a family often do so by observing what the elder child does and then doing the opposite. If the firstborn gets good grades and takes a job after school, the second-born may go the slacker route. The third-born may then de-de-identify, opting for industriousness, even if in the more unconventional ways of the last-born. A Chinese study in the 1990s showed just this kind of zigzag pattern, with the first child generally scoring high as a "good son or daughter," the second scoring low, the third scoring high again and so on. In a three-child family, the very act of trying to be unique may instead leave the middling lost, a pattern that may continue into adulthood.
The holes in the theories
The birth-order effect, for all its seeming robustness, is not indestructible. There's a lot that can throw it out of balance—particularly family dysfunction. In a 2005 study, investigators at the University of Birmingham in Britain examined the case histories of 400 abused children and the 795 siblings of those so-called index kids. In general, they found that when only one child in the family was abused, the scapegoat was usually the eldest. When a younger child was abused, some or all of the other kids usually were as well. Mistreatment of any of the children usually breaks the bond the parents have with the firstborn, turning that child from parental ally to protector of the brood. At the same time, the eldest may pick up some of the younger kids' agreeableness skills—the better to deal with irrational parents—while the youngest learn some of the firstborn's self-sufficiency. Abusiveness is going to "totally disrupt the birth-order effects we would expect," says Sulloway.
The sheer number of siblings in a family can also trump birth order. The 1% income difference that Black detected from child to child tends to flatten out as you move down the age line, with a smaller earnings gap between a third and fourth child than between a second and third. The IQ-boosting power of tutoring, meanwhile, may actually have less influence in small families, with parents of just two or three kids doing most of the teaching, than in the six- or eight-child family, in which the eldest sibs have to pitch in more. Since the Norwegian IQ study rests on the tutoring effect, those findings may be open to question. "The good birth-order studies will control for family size," says Bo Cleveland, associate professor of human development and family studies at Penn State University. "Sometimes that makes the birth-order effect go away; sometimes it doesn't."
The most vocal detractors of birth-order research question less the findings of the science than the methods. To achieve any kind of statistical significance, investigators must assemble large samples of families and look for patterns among them. But families are very different things—distinguished by size, income, hometown, education, religion, ethnicity and more. Throw enough random factors like those into the mix, and the results you get may be nothing more than interesting junk.
The alternative is what investigators call the in-family studies, a much more pointillist process, requiring an exhaustive look at a single family, comparing every child with every other child and then repeating the process again and again with hundreds of other families. Eventually, you may find threads that link them all. "I would throw out all the between-family studies," says Cleveland. "The proof is in the in-family design."
Ultimately, of course, the birth-order debate will never be entirely settled. Family studies and the statistics they yield are cold and precise things, parsing human behavior down to decimal points and margins of error. But families are a good deal sloppier than that, a mishmash of competing needs and moods and clashing emotions, better understood by the people in the thick of them than by anyone standing outside. Yet millenniums of families would swear by the power of birth order to shape the adults we eventually become. Science may yet overturn the whole theory, but for now, the smart money says otherwise.
— Reported by Dan Cray/Los Angeles