Thursday, November 15, 2007
Two companies plan to market the first lie-detecting devices that use
magnetic resonance imaging (MRI) and say the new tests can spot liars
with 90% accuracy.
No Lie MRI plans to begin offering brain-based lie-detector tests in
Philadelphia in late July or August, says Joel Huizenga, founder of
the San Diego-based start-up. Cephos Corp. of Pepperell, Mass., will
offer a similar service later this year using MRI machines at the
Medical University of South Carolina in Charleston, says its
president, Steven Laken.
Both rely in part on recent research funded by the federal government
aimed at producing a foolproof method of detecting deception.
The new devices differ from polygraphs in key ways. The polygraph
detects stress brought on by telling a lie. The MRI-based devices
purport to measure the lie itself, and not changes in breathing and
pulse rate brought on by lying or by some other cause.
"We're at the beginning of a technology that's going to become more
and more mature," Huizenga says. "But right now, we can offer
(customers) a chance to show they are telling the truth with a
scientific basis and a high degree of accuracy. That's something they
haven't been able to get before."
Potential customers: law enforcement, accused persons, spouses under
suspicion and job applicants. Huizenga says a 1988 law that bars
private employers from giving polygraphs to potential employees
appears not to apply to MRI tests.
No Lie MRI plans to charge $30 a minute to use its device. Cephos has
not yet set a price.
The new products are being introduced as the polygraph is under fire.
In 2002, a National Academy of Sciences study concluded that polygraph
results are too unpredictable to be used for security screening at
national labs. Yet the Department of Defense, as well as the FBI, CIA
and National Security Agency, continue to administer thousands of
polygraph tests each year to job candidates and others seeking
security clearances.
The Department of Defense administered about 12,000 tests in 2002, the
most recent year for which it made data public.
"They haven't found anything yet that they think can top (the
polygraph)," says Britton Chance, a University of Pennsylvania
researcher who has studied brain-based lie detection.
Some scientists and privacy advocates criticize the new lie detectors,
noting that the devices haven't yet proved themselves in real-world
tests and face prolonged scrutiny before they will be admitted in court.
"They are going to be deployed to read people's thoughts," says Barry
Steinhardt, director of the American Civil Liberties Union's
technology and liberty project. "Little if any attention has been paid
to potential misuse and the devastating impact it would have on our ..."
The new tests work this way: A subject lies with his head in an MRI
machine and pushes a button to answer yes-no questions as they are
flashed on a screen. The machine measures brain activity, focusing on
areas that are believed to show extra exertion when a lie is
generated. Specially designed software grades the test. The two
lie-detector companies use similar test techniques but different software.
Most American courts do not admit polygraph evidence because
differences in examiner skill make it hard to determine accuracy rates.
MRI machines are used by hospitals on a daily basis to diagnose tumors
and other disorders.
After the 9/11 attacks, the FBI, CIA, Department of Defense and other
agencies began funding research into how changes in brain activity
correlate with truth telling.
Daniel Langleben, a physician and professor at the University of
Pennsylvania, and Frank Andrew Kozel, a brain image researcher at the
Medical University of South Carolina, received funding for research
into MRI-based lie detection.
Langleben's research was used by No Lie MRI to help develop its lie
detector. Cephos' lie detector is based in part on Kozel's research.
David Heeger, professor of psychology and neural science at New York
University, opposes using MRI for lie detection until more research is
done. Among the many gaps in the research, he says, is whether a false
or imagined memory will show up as a true response or a lie.
MRI-based lie detection "says changes happen (in the brain of a liar)
but it doesn't say why," says Heeger. "There's a lot more work to be
done ... before we have a deep understanding of what goes on when
someone lies."
Gregg Bloche, a medical doctor and a law professor at Georgetown
University, says scientists, lawyers and ethicists should conduct a
"high profile discussion" of the new technology's potential uses and
pitfalls before it is made available to the public.
The start-up companies say the technology is ready now. Both say they
will focus on winning acceptance in court for tests taken by
customers. No Lie MRI already is working with a defendant in a
California criminal case, Huizenga says.
"We understand that there are further ethics conversations (needed)
when science pushes the envelope," says Cephos' Laken. "But we don't
see these (tests) being set up in dressing rooms and shopping malls.
That's not going to happen."
By Corinna Underwood
As any Baby Boomer will tell you, Americans have more information to cram into their memories than ever. Yet, as we age, our capacity for recall grows weaker.
But what if you could capture every waking moment of your entire life, store it on your computer and then recall digital snapshots of everything you've seen and heard with just a quick search?
Renowned computer scientist Gordon Bell, head of Microsoft's Media Presence Research Group and founder of the Computer History Museum in Silicon Valley, thinks he might be able to do just that.
He calls it a "surrogate memory," and what he considers an early version of it even has an official name — MyLifeBits.
"The goal is to live as much of life as possible versus spending time maintaining our memory system," Bell explains.
Perfect surrogate memory would be supplemental to, but ultimately as good as, your original memory.
It could let you listen to every conversation you had when you were 21 or find that photograph of the obscure date you had on summer vacation.
As Bell says, it would "supplement (and sometimes supplant) other information-processing systems, including people."
MyLifeBits isn't quite there yet, but Bell's nevertheless "gone paperless" for the past decade as part of the project, keeping a detailed, digitized diary that documents his life with photographs, letters and voice recordings.
So that he doesn't miss out on important daily events, Bell wears a SenseCam, developed by Microsoft Research, that takes pictures whenever it detects he may want a photograph.
The camera's infrared sensor picks up on body heat and takes snapshots of anyone else in the room, adjusting itself as available light changes.
Not only does MyLifeBits record your life's digital information, but the software, developed by Bell's researchers Jim Gemmell and Roger Lueder, also can help you retrieve it.
"MyLifeBits is a system aimed at capturing cyber-content in the course of daily life with the goal of being able to utilize it in various ways at work, in our personal life — e.g. finances, family, health and for our future memory," Bell says.
Simply enter a keyword such as "pet," for example, and the search engine will find all available information on your childhood puppy.
It also can run more intricate searches, allowing you to cross-reference all associations linked to certain people or places.
If you're having difficulty remembering where you were and who you were with on a certain day, MyLifeBits would remind you.
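A minimal sketch of the kind of keyword and cross-reference lookup described above. The record structure and function names here are hypothetical illustrations, not MyLifeBits' actual software:

```python
from datetime import date

# Hypothetical life-log records: each carries a date, tags, and the people present.
records = [
    {"date": date(1975, 6, 1), "kind": "photo", "tags": ["pet", "puppy"], "people": ["Mom"]},
    {"date": date(1975, 7, 4), "kind": "photo", "tags": ["picnic"], "people": ["Mom", "Dad"]},
    {"date": date(2007, 11, 7), "kind": "voice", "tags": ["meeting"], "people": ["Jim"]},
]

def keyword_search(items, keyword):
    """Return every record tagged with the given keyword."""
    return [r for r in items if keyword in r["tags"]]

def cross_reference(items, person):
    """Return (date, kind) pairs for every record involving a person."""
    return [(r["date"], r["kind"]) for r in items if person in r["people"]]

print(keyword_search(records, "pet"))   # finds the childhood-puppy photo
print(cross_reference(records, "Mom"))  # everything Mom appears in, with dates
```

Searching "pet" pulls up the puppy photo, and cross-referencing a person returns every dated record she appears in, which is the "where was I and who was I with" query the article describes.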
And just how much data is needed on a day-to-day basis?
"All the bits that we can that will likely have value for our memory in the near and long-term future, a few bits just for the hell of it," Bell says. "We end up with more bits because we need them for relationships."
Still, is recalling every single detail of an entire lifetime too much? How can anyone guess what's going to be important 20 years from now?
"It is impossible to know what will be required in the future," says Bell. "Furthermore, recording everything allows one item to be used to find another item that may have been created at the same time."
Bell says MyLifeBits could have another important benefit: It may actually improve your real memory.
According to Bell, being reminded of someone in a photograph or screensaver strengthens our recollections.
We constantly are reminded of other events when we delve into our past to find snippets for which we are looking. This reinforces a whole host of links to other memories we otherwise may have forgotten.
But since all this is digitally recorded, what if hackers find it? Couldn't MyLifeBits be a threat to privacy and a boon to identity thieves?
Bell doesn't seem overly concerned.
"MLB introduces no new problems that aren't present in modern computer systems," he said, "except that we present a larger cross-section that makes all the content potentially more valuable."
Additional passwords are being built into the most sensitive documents, he explains.
An even bigger hurdle for the project is cost-efficiency.
The Microsoft team predicts that by 2010, a 1-terabyte (1,000-gigabyte) hard drive will cost less than $300.
That could easily hold all text documents, voice files and photographs of a person's complete life experience — but if it came to video, it would be only enough for four hours per day for an entire year.
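The storage arithmetic above can be checked directly. The video bitrate below is inferred from the article's own figures (1 terabyte holding four hours of video per day for one year); it is a back-of-the-envelope assumption, not a quoted specification:

```python
# 1 terabyte as quoted: 1,000 gigabytes (decimal units).
capacity_bytes = 1_000 * 10**9

# The article's video budget: 4 hours per day for one year.
video_hours = 4 * 365  # 1,460 hours

bytes_per_hour = capacity_bytes / video_hours
implied_mbps = bytes_per_hour * 8 / 3600 / 1e6  # megabits per second

print(f"{bytes_per_hour / 1e9:.2f} GB per hour of video")
print(f"implied bitrate ~{implied_mbps:.1f} Mbit/s")
```

That works out to roughly 0.7 GB per hour, an implied bitrate of about 1.5 Mbit/s, which is in line with heavily compressed video of the era.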
On a somewhat smaller level, Sunil Vemuri, co-founder and chief product officer of Hyderabad, India-based QTech, Inc., has been working to develop a "memory prosthesis" that can help people with common, day-to-day memory problems.
QTech's "reQall" service provides a toll-free number that allows clients to use any phone to record reminders of events, appointments or thoughts as and when needed.
It then saves and organizes the recordings and sends daily reminders as needed.
"ReQall is meant for anyone who forgets, for anyone with a day-to-day memory problem," Vemuri says. "The aim of reQall is to provide a long-term service that is available to everyone right now."
Vemuri sees great growth potential for reQall. He wants his team to refine the service to suit users' individual memory needs, whether that involves helping patients remember doctors' appointments, friends remember birthdays or even journalists remember specific quotes.
More ambitious is Vemuri's "What Was I Thinking," a project he worked on while a graduate student at MIT.
The project centered on software running on a Compaq iPaq personal digital assistant, similar to a Palm Pilot, which synced to PCs running Mac OS X, Windows or Linux. The software could record data and use a number of search tools, built around a range of built-in triggers, to help the user find forgotten memories.
"Many things can serve as good memory triggers: the smell or taste of homemade cooking, the smile on a child's face, a good joke, the roar of the crowd when your sports team scores, etc," Vemuri explained on his MIT Web page.
"In our case, the device records audio from conversations and happenings, analyzes and indexes the audio in an attempt to identify the best memory triggers, and provides a suite of retrieval tools to help the wearer access memories when a forgetting incident occurs."
The device's retrieval tools included an analysis of audio recordings to determine if conversations were heated, calm or humorous and a transcription of audio files to text files by means of a speech-recognition program.
In this way, the text files could be searched for specific words or speech patterns that can trigger those elusive memories.
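A minimal sketch of that last step, searching transcribed text for trigger words. The transcripts and helper function here are illustrative stand-ins, not the actual MIT software:

```python
def find_triggers(transcripts, trigger_words):
    """Return which recordings mention any trigger word, ignoring case."""
    hits = {}
    for name, text in transcripts.items():
        words = set(text.lower().split())
        matched = sorted(w for w in trigger_words if w.lower() in words)
        if matched:
            hits[name] = matched
    return hits

# Hypothetical transcripts produced by a speech-recognition pass.
transcripts = {
    "monday_lunch.wav": "we joked about the football scores all lunch",
    "tuesday_standup.wav": "the deadline slipped again and voices were raised",
}

print(find_triggers(transcripts, ["football", "deadline"]))
# e.g. {'monday_lunch.wav': ['football'], 'tuesday_standup.wav': ['deadline']}
```

Even this naive word match shows the idea: once audio becomes text, finding the recording where the football scores came up is an ordinary search problem.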
In the future, some variation on these memory prostheses could change our lives on many levels, from settling a squabble over last week's football scores to helping an elderly patient remember whether she has taken her medication.
We rely on our hard drives for saving our music, photographs, e-mails and videos — so perhaps life-logging software and memory prosthetics are simply the next stage in the evolution of our relationship to the computer.
Wednesday, November 7, 2007
By Tina Benitez
Bipolar disorder runs in families. However, scientists still cannot pinpoint it to one specific gene, since many other factors may also lead to the manic-depressive illness.
The disorder is 75 to 80 percent hereditary, according to Dr. Candida Fink, a child and adolescent psychiatrist and author of Bipolar Disorder for Dummies and The Ups and Downs of Raising a Bipolar Child. Stress, family life, and other emotional and psychological pressures should also be taken into account.
“It runs a pretty broad range,” said Fink. “Different stresses are likely contributors. Vulnerability and emotional and psychological stress is partly related, but it isn’t always one, and this is important to know. One can develop bipolar with stress or other things combined. We don’t know if it’s something biological or a mixture of developmental, hormonal imbalances.”
Fink said that some hereditary factors are strong indicators of whether someone will develop bipolar disorder. If one identical twin has the disorder, the other has about a 70 percent chance of developing it. If one parent is bipolar, the child runs a 7 to 10 percent risk; if both parents are bipolar, the child’s chance rises to about 20 percent, because multiple genes are involved.
“It’s a huge issue when getting family history and sitting with someone,” said Fink. “Does bipolar run in the family? Particularly, sometimes the further back you go, the family was not diagnosing it, or no one talked about it. It was kind of like, did you have an aunt Sally that never left the house? The most relevant indicator is the immediate family, which is typical of most mental illnesses, but there’s not just one thing that is going to give us the answer.”
Individuals who are bipolar will usually present signs of depression first, said Fink, but experts must look for red flags among the different types of depression when diagnosing it. Did it begin at a young age? Were there brief or discrete episodes in the patient’s past?
Dr. Francis McMahon, chief of the Genetics Unit of the Mood and Anxiety Disorders Program at the National Institute of Mental Health (NIMH) in Bethesda, Md., said it has long been known that bipolar disorder, also called manic depression, is among the most heritable of all illnesses. Three kinds of studies support this, including family studies showing that bipolar disorder definitely runs in families; the other studies were conducted on twins, identical and fraternal.
“If you have a close family member, you have a 10-fold chance of getting bipolar disorder,” said McMahon. “If you do the math, there is about an 80 percent difference of risk of bipolar, which leaves about a 20 percent chance for other factors, and we know even less about what those are.”
Who Gets It — and Why?
Fink wants to develop a handbook for parents of children who may be bipolar, because there are so many different mood disturbances throughout childhood. “In studies of kids, there may be depression, but you have to ask if they really do look like the classic bipolar disorder,” Fink said. “There’s also attention deficit disorder, or other illnesses brought on by huge trauma. They have to understand the neurobiology component.”
There’s even a divide between men and women when it comes to developing bipolar disorder, because more women suffer from depression. Dramatic bipolar episodes may show up in the early 30s, with more manic episodes appearing later, in the 40s. “A lot of bipolar disorder is struggling with depression, mostly reported in childhood. Manic episodes show up later, and that’s when you’re hit over the head with bipolar,” said Fink.
Life experiences and drug abuse can also play a role in developing bipolar disorder, and any one of these can cause anxiety or depression, which may exacerbate bipolar episodes, according to McMahon. “Assuming that you are already susceptible, since it is in the family, there may be more of a chance of getting bipolar.”
McMahon added that people who are bipolar have to be aware of several things to control episodes, including, but not limited to:
— Missing nights of sleep
— Trans-meridian travel, which causes stress on the body
— Whether they have stress in the family
“Stress in general is not a good thing,” he said. “But in and of itself, some people with bipolar have to watch out for stresses that may come from jet lag as well as occupational stress.”
Work stress can lead to manic or depressive episodes, often when someone gets a promotion or takes on more responsibility at work, which can tip a susceptible person over. Bipolar disorder also tends to strike early in life; according to McMahon, 80 percent of people who are bipolar will show signs of the illness by the time they are 24, though that doesn’t mean they will be diagnosed.
Once a psychiatrist goes over the family history with them, it may emerge that some illness occurred in adolescence. People can go months or years undiagnosed and continue to have episodes down the line. “Bipolar is episodic,” he said. “A lot of studies are about what triggers the episode, but there’s very little evidence of why people get it.”
One of the bigger challenges with medications for bipolar disorder, according to McMahon, is that many drugs treat or prevent the episodes, but nothing available cures the illness. He added that the big challenges for treatment in the next decade lie in genetic studies.
One group of studies that has made strides in linking bipolar, major depression and other diseases to the genes is the Bipolar Disorder Phenome Database, a collaboration of institutional research from NIMH, Johns Hopkins University, the University of Colorado’s Health Sciences Center, and the University of Heidelberg in Germany. The database includes 20 years of studies on bipolar individuals and those with a related illness and their family history.
He said that one gene has been clearly identified as influencing bipolar disorder and other diseases, along with two other genes that need to be reviewed more deeply.
McMahon said that researchers are seeing more and more linkage between specific genes and diseases, specifically bipolar disorder and depression, but it’s still not clear what happens to this genetic makeup, whether it is brought on hereditarily or from outside factors.
“It sounds like a big achievement, and it has had contradictory results, but it still feels like a big step forward,” he said. “We have a long way forward. We have to figure out how and what they do that goes wrong. If we talk six months from now, we may clearly have found specific genes that link between bipolar and depression.”
Tuesday, November 6, 2007
Research about to be published in the journal Molecular Psychiatry, resulting from a collaboration between scientists in Germany, Portugal and the UK, suggests that stress contributes directly to the development of Alzheimer’s disease (AD).
According to the results now published, stress induces the production of amyloid-beta (Aβ) peptide – the molecule associated with the neural plaques characteristic of the disease – and also makes neurons more vulnerable to Aβ toxicity. Administration of glucocorticoids (GC) – whose production is the first physiological response to stress – was shown to have the same effect, confirming the role of stress in AD. This last result is particularly important because GC are used to treat Alzheimer’s patients; according to this research, instead of helping, they may be contributing to the disease.
Alzheimer’s disease is part of a group of illnesses called amyloidoses, which result from proteins failing to fold and work properly (a protein’s shape is directly related to its function), leading instead to their accumulation as toxic insoluble plaques (or amyloids). In Alzheimer’s, the misfolded protein is called Aβ and is found as insoluble plaques in the diseased neurons of patients.
It is known that AD patients can have higher anxiety and GC levels than normal individuals and, in rodent models of AD, it has been found that stress can exacerbate the disease. Furthermore, highly stressful conditions lead to cognitive impairments very similar to those found in AD patients. These observations led researchers C. Catania, N. Sousa, O. F. X. Almeida and colleagues in Germany, Portugal and the UK to wonder whether there could be a causal relationship between stress and AD.
In order to investigate this possible link, the researchers subjected middle-aged rats to different stressful situations, measuring levels of the Aβ peptide (and of another molecule called C99) in the hippocampus and prefrontal cortex areas of the rats’ brains.
In fact, the first signs of AD correlate not with the insoluble plaques of Aβ protein found in the diseased brain but with the levels of soluble Aβ peptide, while the hippocampus and the prefrontal cortex are the first brain areas affected in AD. Furthermore, the Aβ peptide is formed from the breakdown of the amyloid precursor protein (APP) in a series of consecutive steps that produce a molecule called C99, which, when further cleaved, creates the Aβ peptide (APP → … → C99 → Aβ). And while the Aβ peptide is well known to be neurotoxic, recent reports have indicated that C99 – besides being the precursor of Aβ – has a similar toxicity, with both molecules affecting neural function and cognitive behaviour. This led Catania, Sousa, Almeida and colleagues to use the brain levels of both Aβ and C99 as a measure of potential neural damage and AD development in their experiments. Additionally, the researchers looked into the consequences of glucocorticoid administration in order to confirm the specific effects of stress in AD, since GC secretion is the first physiological response to stress.
The team of researchers found that stressful situations or injections of GC (which mimic stress) led to an increase of C99 and, eventually, of Aβ in both the hippocampus and the prefrontal cortex of the rats’ brains. Furthermore, rats with a history of stress were more susceptible to the effects of stress or GC administration, showing bigger increases in C99 levels. It was also observed that administration of soluble Aβ led to a similar increase in C99 in the rats’ brains, supporting earlier work showing that Aβ can induce its own production, and also suggesting that Aβ, stress and GC induce the same biochemical responses.
Next, Catania and colleagues looked at the animals’ behaviour – as behavioural changes are the hallmark of AD – more specifically, at their learning and memory abilities, the first two cognitive functions affected in the disease. They also looked at the rats’ anxiety levels, since AD patients are known to be abnormally anxious.
For the analysis of spatial memory abilities, stressed and non-stressed rats were tested in a maze over four days, while their emotional state was assessed by looking at anxiety levels, locomotion patterns and exploratory behaviour.
Their first conclusion was that, like AD patients, rats put into stressful situations or receiving GC were much more anxious than controls. These rats also showed less exploratory interest than control animals. Spatial reference memory – which is involved in learning from repeated experiences, such as those in the maze – was similarly impaired by stress or by administration of GC or Aβ. This last result again supports the conclusion that GC and Aβ act on AD through the same biochemical mechanism. When stress and GC were applied together, the spatial-reference-memory impairment increased, revealing a cumulative effect of the various factors.
In conclusion, Catania, Sousa, Almeida and colleagues show that stress can contribute to Alzheimer’s disease in two ways: by inducing the production of known neurotoxic molecules – C99 and Aβ – that affect neural function and cognitive behaviour, and – where there is a previous history of stress – by making animals more susceptible to the C99-inducing effects of GC and Aβ. These results – if further confirmed – have important implications for understanding the mechanisms behind Alzheimer’s disease and its predisposing factors and, consequently, for possible therapeutic approaches.
Catania, Sousa, Almeida and colleagues’ research elucidates the mechanism behind the direct effects of stress and GC on the brain, and may also help explain how stress-mediated diseases, such as depression, affect brain function. The research also alerts to the need to investigate further the use of GC in AD therapy and underlines the importance, when treating AD patients, of assessing any previous history of stress or GC therapy.
From: Molecular Psychiatry, 2007, advance online publication
“The amyloidogenic potential and behavioral correlates of stress”
Thursday, November 1, 2007
February 25, 2007
Formerly approved drug imparts lasting learning and memory improvements to impaired mice
Researchers may have finally found a drug candidate for reducing the mental retardation caused by Down syndrome, which afflicts more than 350,000 people in the U.S. Researchers gave low doses of a human drug to mice bred to mimic the learning and memory problems in people with Down syndrome. After as little as two weeks, the impaired mice performed as well as normal ones in learning tests, and the improvement lasted for up to two months after treatment ended.
But there is a catch: the drug was taken off the market 25 years ago after being found to cause dangerous seizures in some people. And many compounds that boost learning in mice fail in human trials.
Nevertheless, "anyone studying Down's is going to have their socks blown off by this," says geneticist Roger Reeves, a Down syndrome specialist at the Johns Hopkins School of Medicine in Baltimore, who was not involved in the study. "There hasn't been anything out there that we really could take to patients or that we had a strong possibility of taking into the clinic."
Researchers tested the drug, pentylenetetrazole (PTZ), as well as two other compounds—picrotoxin and a ginkgo biloba extract called bilobalide—because they all interfere with tiny ion channels on brain cells (neurons). When activated, the channels, known as GABA-A receptors, inhibit the cells, making it harder for them to form new synapses, or connections, with neighboring neurons.
The deficits of Down syndrome may occur because the brain contains too many such inhibitory signals, says Stanford University neurobiologist Craig Garner, whose group performed the experiments. "In order to learn, you have to have a period during which synapses can get stronger or weaker," he says. "This changing is what's not possible when you have too much inhibition."
So Garner, his student Fabian Fernandez, and their colleagues gave their mice either low doses of PTZ mixed with milk, or low-dose injections of picrotoxin or bilobalide, daily for two to four weeks to slightly raise the level of excitation in the brain. Immediately after treatment, the animals' scores on two memory tests—for recognizing objects they had seen before or remembering how they last entered a maze—were on par with normal mice; two months later, they still did much better than they normally would, the researchers report in a paper appearing online February 25 in Nature Neuroscience.
The treatment "is allowing the normal properties of neurons to work," Garner says. "This slowly over time leads to an improved circuit."
Reeves says there may be other ways to treat Down syndrome, but "you can see your way to clinical testing most easily from here," because researchers identified specific chemicals. "It's hugely promising," he says. "Maybe it will have a big effect, but we don't know that." The inhibition model is plausible, but still unproved in people, he notes, and until researchers better understand the mechanisms by which the compounds work, "I'm wary of rushing into the clinic."
Garner says clinical trials of PTZ could begin in the next year or two, and evaluating them might take five to 10 years. He notes that although PTZ is nearly 100 years old and was used to treat psychiatric disorders and later dementia, researchers never concluded it was effective. It also caused seizures (at doses 100-fold higher than those given to the mice), so the FDA revoked its approval in 1982.
In Down syndrome, chromosome 21 is present in three copies instead of two. Similarly, the mice used in the study have a duplicated piece of chromosome 16. As in Down syndrome, these animals have malformed facial bones and problems forming new memories.
Reeves notes that many researchers have long considered Down syndrome too complex to crack, but the study "serves as notice to the neuroscience community that there are a lot of interesting things to do here. This is not some vague, mega complex issue."
June 26, 2007
Finding could set the stage for ways to reverse damage in sufferers of the inherited fragile X syndrome
In a case of life imitating art, researchers at the Massachusetts Institute of Technology (M.I.T.) reported today that they had successfully reversed mental retardation in mice, just as scientists did in the classic 1966 novel Flowers for Algernon. In the book by Daniel Keyes, scientists use experimental surgery—first tested on a mouse named Algernon—to dramatically boost the intelligence of a mentally retarded janitor named Charlie Gordon. Now M.I.T. scientists report in Proceedings of the National Academy of Sciences USA that they ameliorated brain damage in mice caused by a genetic disorder known as fragile X syndrome by blocking an enzyme involved in cellular development.
Fragile X affects one in 4,000 boys and one in 6,000 girls. It is caused by a mutation in the fragile X mental retardation 1 gene (FMR1)—located on the X sex chromosome—that results in the loss of the fragile X mental retardation protein (FMRP). The resulting illness is characterized by hyperactivity, attention deficit, repetitive behavior, anxiety and cognitive difficulties ranging from learning disability to mental retardation.
Previous studies of fragile X show that nerve cells in the cortex (the outermost layer of the brain) of both patients and knockout mice have dendrites (rootlike projections that nerve cells use to receive electrical signals from peers) with a greater number of branches. These extra shoots are thin, stubby, immature and less functional than normal dendrites, causing poor transmission among neurons across synapses (the spaces between nerve cells).
When studying the formation of dendrites for a 2004 paper, Mansuo Hayashi, a research affiliate in M.I.T.'s Picower Institute for Learning and Memory, discovered that these structures could be strengthened and altered to transmit information more efficiently by inhibiting nerve cell production of the enzyme called p21-activated kinase (PAK). PAK regulates actin, another protein, which shapes parts of the cell (including the dendrites). When PAK is inhibited, more actin is manufactured and the dendrites are able to properly mature.
In the current study, Hayashi led a team that worked with "double mutant" mice. First, the researchers mutated FMR1 so that the mice would lack the FMRP protein and thus show symptoms of fragile X. A second mutation, to the gene that codes for PAK, caused cells in the mice's forebrains to stop producing the enzyme three weeks after birth. "When these two mutants were crossed together, [the mice] were normal," says senior study author Susumu Tonegawa, a professor of neuroscience and biology at Picower.
Not only did the dendrites of the fragile X–afflicted mice become more robust and efficient, but researchers report that several of the behavioral symptoms associated with the disorder also abated: Double mutant mice behaved more like normal mice than like those with fragile X.
William Greenough, a professor of psychology at the University of Illinois at Urbana-Champaign, says that the study demonstrates the long-term potential of gene therapy. "I am impressed by the kinds of different symptoms … that seem to be under [the] governance of this single signaling pathway," he says. "It seems to affect everything from neuronal morphology (or shape) to behavior that may or may not be coupled to that morphology."
Tonegawa says that the results may pave the way to a new molecular target for drugs. "If you take a fragile X patient and feed them or inject them with a compound that will inhibit their own endogenous PAK enzyme," he says, "it may reduce the fragile X symptoms." He notes that a significant finding of the study is that the effects of fragile X—which is typically diagnosed only after a child shows marked developmental delays, such as being late to crawl, walk or talk—could be reversed even after onset.
How the brain parses music—and pays attention
It's a classic cocktail party conundrum: How do our brains decide where we should train our attention when people are milling all about us chatting away—some to us, some to others?
In an attempt to find out, researchers at Stanford University and McGill University in Montreal scanned the brains of 18 subjects who were listening to classical music by 18th-century British composer William Boyce.
"You have to kind of segment these streams [of information] into chunks," says senior study author Vinod Menon, an associate professor of psychiatry and behavioral science at Stanford. The process of slicing the data, he continues, requires that you "identify something that's interesting and then you have to switch on the attentional network."
"Memory doesn't work like [a] video recorder, it's more like [a] DVD," in how it recalls events as discrete chapters, explains study co-author Daniel Levitin, a psychology professor at McGill.
But why music?
Simple, says Sridhar Devarajan, a Stanford neuroscience graduate student involved in the project. "Transitions between musical movements," he notes, "offer an ideal setting to study the dynamically changing landscape of activity in the brain during this segmentation process."
The team wove together different movements from several of Boyce's four- to five-minute symphonies and had volunteers listen to two nine-minute compositions through noise-canceling headphones while lying in an fMRI machine; each of the musical tapestries consisted of 10 orchestral movements. (Boyce's compositions were selected because he is a relatively obscure composer and subjects were less likely to be familiar with his work, canceling out brain activity that would be generated if they had recollections of his symphonies.)
The team conducted full brain scans that allowed them to focus on regions of the brain of particular interest, monitoring them from 10 seconds before a transition between movements to 10 seconds after. (A transition between movements is marked by a decline in the amplitude of sound followed by a brief period of silence that leads into a new section of music.)
According to Menon, during the transition periods the team not only observed activity in discrete brain regions, but also noticed co-active networks (two areas responding simultaneously) reacting to the musical shifts. Activity began in areas at the front of the prefrontal cortex as well as in parts of the temporal cortex just above the brain stem. Menon speculates that it is this network that is "detecting a salient change" in the information stream. Next, areas toward the rear of the prefrontal cortex and parts of the parietal cortex (the outermost layer of the parietal lobe at the upper rear of the brain) began to respond. These regions, Menon notes, are linked to attention and working memory.
"We feel that it could be a very general brain mechanism…part of a core control signal that is generated in response to any attention-demanding task," Menon says. "I think any task that involves detecting a salient event and attending to it will command a similar type of response."
Levitin adds, "Here we had a peak of activation…associated with nothingness," when there was no sound at all. "[Clearly], there are neural processes that are responsible for signaling the beginning and ending of events."
Next up for Menon's team: trying to determine the next level of processing after the brain recognizes and responds to a change in an information stream. The researchers also plan to apply whatever they discover to learn more about "what structure actually means in music."
Stem cell transplants rescue memory problems in mice, but not in the way you might expect
A new study finds that neural stem cells may be able to save dying brain cells without transforming into new brain tissue, at least in rodents. Researchers from the University of California, Irvine, report that stem cells rejuvenated the learning and memory abilities of mice engineered to lose neurons in a way that simulated the aftermath of Alzheimer's disease, stroke and other brain injuries.
Researchers typically expect stem cells to differentiate into tissue that replaces damaged cells. In this case, however, the undifferentiated stem cells, harvested from 14-day-old mouse brains, did not simply replace neurons that had died off. Instead, the group speculates that the transplanted cells secreted protective neurotrophins, proteins that promote cell survival by preventing neurons from undergoing apoptosis (programmed cell death). The once ill-fated neurons then strengthened their interconnections and kept functioning.
"The primary implication here is that stem cells can help rescue memory deficits that are due to cell loss," says Frank LaFerla, a professor of neurobiology and behavior at U.C. Irvine and the senior author of the new study published in The Journal of Neuroscience. If the therapeutic benefit were indeed solely due to a neurotrophic factor, the door could be opened to using that protein alone as a drug to restore learning ability.
LaFerla's team genetically engineered mice to lose cells in their hippocampus, a region in the forebrain important for short-term memory formation. These mice were about twice as likely as unaltered rodents to fail a test of their ability to discern whether an object in a cage had been moved since their previous visit.
But when the mutant mice were injected with about 200,000 stem cells directly into their hippocampi and retested up to three months later, the injured animals performed up to par with their normal counterparts.
LaFerla's team found that in healthy mice that were similarly injected, the stem cells (which were marked with a green fluorescent dye) had spread throughout the brain. In the brains of the diseased mice, however, nearly all the cells congregated in the hippocampi. "Somehow, in the damaged region, there is some kind of signal that's telling the stem cells to stay local," LaFerla explains.
Curiously, the researchers discovered that only about five percent of the stem cells injected into the brain-addled mice matured into adult neurons. The surrounding neurons that were there all along, however, had sprouted a denser set of connections with other cells, presumably allowing for better transmission of information and recovery of function. "We think it's some neurotrophic factor being secreted by the [stem] cells," LaFerla says. If his group can identify it, he adds, they can answer the question: "Can that substance [alone] be provided to the brain and rescue the memory deficit?"
Eugene Redmond, a professor of psychiatry and surgery at Yale University School of Medicine, notes that the new work is "certainly well done. Their conclusion is similar to our study in Parkinsonian monkeys." He adds that in his study there was evidence of stem cells replacing lost neurons as well as other benefits conferred by the transplant.