Friday, October 12, 2007

Mild Cognitive Impairment

Britton Ashley Arey, MD, MBA
Introduction
As the United States population ages, mild cognitive impairment (MCI) is a clinical condition that is receiving increasing attention due to growing interest in the early diagnosis, treatment, and prevention of dementia. MCI is a heterogeneous entity, representing decrements in cognitive functioning in various domains which do not meet the diagnostic threshold for dementia.

There are three types of MCI that have been proposed by Portet and colleagues:1
(1) Amnestic MCI, which may progress to Alzheimer's disease
(2) MCI characterized by a slight impairment of multiple cognitive domains, which may progress to Alzheimer's disease or vascular dementia, or may represent a normal cognitive decline with age
(3) MCI with isolated impairment of a cognitive domain other than memory ("single-domain non-memory MCI"), which may progress to non-Alzheimer-type dementia.

The best characterized form of MCI is amnestic MCI, which is defined as the presence of impaired memory function for age and education without dementia, as evidenced by preserved general cognitive function and intact activities of daily living.2 For the purposes of this review, the designation MCI will refer to amnestic MCI unless otherwise indicated.

The criteria proposed by Petersen and colleagues2 for amnestic MCI are as follows:
(1) not demented,
(2) memory complaint,
(3) preserved general cognitive function,
(4) intact activities of daily living, and
(5) impaired memory for age and education.

Properly diagnosing MCI identifies a subset of patients at increased risk for dementia.2 These patients may then be carefully monitored, and may be candidates for interventions that can improve functioning and quality of life. Several guidelines and consensus statements exist regarding MCI. These include the report of the Current Concepts in Mild Cognitive Impairment conference,2 and the American Academy of Neurology's (AAN) practice parameters for mild cognitive impairment3 and diagnosis of dementia.4 Together, these guidelines answer a number of questions about the current state of knowledge regarding MCI, and provide expert opinion on its diagnosis and impact.


Objectives of the American Academy of Neurology's Practice Parameters
To summarize the current state of knowledge in MCI diagnosis, epidemiology, consequences, and management, the AAN's practice parameters address a number of questions related to the diagnosis of dementing illness in general and MCI in particular. These questions provide a framework for reviewing the current data on MCI. Those most relevant to a discussion of MCI include the following:
  • Does the presence of mild cognitive impairment predict the development of dementia?
  • Does screening at-risk subjects with a specific instrument in a specific setting accurately lead to the diagnosis of dementia?
  • What comorbidities should be screened for in elderly patients undergoing an initial assessment for dementia?
  • What are important directions for future research?

QUESTION 1
Does the presence of mild cognitive impairment predict the development of dementia?

MCI is best understood as part of a continuum between normal aging and dementia. Controversy exists over the definition of diagnostic criteria that accurately differentiate between normal aging, MCI, and dementia; however, evidence supports the validity of defining MCI as a transitional state between normal age-related cognitive changes and dementia. The rate of progression from MCI to Alzheimer's disease is between 6 and 25% per year, compared with rates in the general population ranging from 0.2% in the 65 to 69 year age range to 3.9% in the 85 to 89 year range.3 Additional data suggest that among individuals diagnosed with MCI, 60 to 80% will progress to dementia within six years.2 Thus, patients with MCI are at high risk for developing a dementing illness such as Alzheimer's disease, and these patients should be monitored regularly for cognitive and functional declines consistent with dementia.3

A variety of evidence supports the view of MCI as part of a continuum of dementing conditions, and may ultimately help to refine our understanding of the progression from MCI to Alzheimer's disease. For instance, MCI is characterized by increased hippocampal atrophy compared to controls, and the degree of hippocampal atrophy can predict the rate of conversion from MCI to Alzheimer's disease.2 Hippocampal atrophy appears to occur at the same rate in individuals who progress from normal cognition to MCI as in individuals who go from MCI to Alzheimer's disease, further suggesting a transition along a disease spectrum.2

Neuropathologic evidence also points to the validity of MCI as a transitional state between normal aging and Alzheimer's disease. Several studies have indicated that subjects who died while meeting clinical criteria for MCI had pathologic changes that appeared to be transitional between the changes expected in normal aging and those of mild Alzheimer's disease.5 These changes include early deposition of neurofibrillary tangles and beta-amyloid plaques, as well as changes in total tau protein, hyperphosphorylated tau protein, and amyloid beta42 protein that may help distinguish MCI and early or symptomatic Alzheimer's disease from normal aging, depression, and Parkinson's disease.1 Additionally, there is evidence that MCI may present with imaging abnormalities pointing to early neuropathologic changes of Alzheimer's disease, including volumetric changes in the entorhinal and hippocampal-amygdala regions, as well as functional changes in the temporoparietal cortex and posterior cingulate areas.1 However, despite these neuropathological similarities to Alzheimer's disease and the potential future utility of imaging, MCI remains at this time a purely clinical diagnosis.

Several features have been identified which may predict the rate of progression from MCI to Alzheimer's disease. These include apolipoprotein E4 (ApoE4) carrier status, cognitive features such as inability to benefit from semantic cues, degree of hippocampal atrophy, PET markers such as [11C]flumazenil binding and [18F]fluorodeoxyglucose (FDG) uptake, and CSF amyloid beta (Aβ) and tau levels. However, the clinical utility of these measures has yet to be established.5

QUESTION 2
Does screening at-risk subjects with a specific instrument in a specific setting accurately lead to the diagnosis of dementia?

Diagnosing MCI is complicated by disagreement on the definition of normal aging and the degree of cognitive change that is nonpathologic.2 Nonetheless, a number of screening tools and more extensive diagnostic batteries exist to help clinicians evaluate and characterize cognitive function in individuals in whom cognitive impairment is suspected due to a subjective complaint of memory impairment. These tools are of variable diagnostic utility, and are especially helpful in cases of MCI and mild dementia in which functional and behavioral impairment may be minimal. In such cases, symptoms may not be obvious, and objective screening tools can pick up more subtle symptoms.

General cognitive screening instruments are among the most useful tools in detecting MCI and mild dementia in clinical settings. Of these, the Mini-Mental State Examination and the Memory Impairment Screen are supported by the strongest evidence, and there is also evidence supporting the utility of the Kokmen Short Test of Mental Status and the 7-Minute Screen. These instruments are most useful in populations that are at elevated risk for cognitive impairment, either due to age or to subjective complaints of memory dysfunction.3 Impairments on these instruments warrant further assessment and monitoring, as described below.

Attempts have also been made to develop screening tools that can be administered in a briefer time frame. However, these assessments, such as the Clock Drawing Test and the Time and Change Test, focus on limited aspects of cognitive function and are therefore less reliable.3

Neuropsychological batteries, and especially neuropsychological instruments emphasizing memory function, are also useful in objectively differentiating dementia from mild cognitive impairment and normal aging. These instruments, often administered by neuropsychologists, can provide a more in-depth and quantitative assessment of cognitive functioning than can more general screening instruments. As with general screening instruments, these tests are most useful when administered to a population at risk for cognitive impairment.3

Neuropsychological tests alone are not sufficient to diagnose MCI or Alzheimer's disease. The results of these tests can be influenced by multiple factors, including education, age, cultural background, and other illnesses. Moreover, patients with different types of dementia may have overlap in neuropsychological profiles. Therefore, test results must always be supplemented by clinical judgment.2

Evidence for instruments that rely on interviewing collateral sources such as relatives and caretakers, including the Blessed Dementia Rating Scale, the Clinical Dementia Rating, and the Informant Questionnaire on Cognitive Decline in the Elderly, is inconclusive. Nonetheless, corroborating information about a patient's cognitive and functional status can be clinically useful in identifying candidates for additional screening.2

While neuroimaging studies cannot differentiate MCI from dementia or normal aging, they can help to evaluate for structural or vascular causes of cognitive impairment. Therefore a baseline noncontrast CT or MRI is indicated in the initial workup of patients with cognitive complaints.4

QUESTION 3
What comorbidities should be screened for in elderly patients undergoing an initial assessment for dementia?

Several conditions which occur commonly in the elderly are of particular interest in patients with memory complaints, as they may either cause cognitive difficulties secondarily, or may worsen preexisting dementing illness. Therefore, patients who present with memory complaints require screening for common comorbidities.

Depression. The relationship between cognitive impairment and depression is bidirectional. While depression can produce symptoms that mimic cognitive impairment, true cognitive impairment is also a risk factor for depression. Patients with depression and coexisting cognitive complaints have a high likelihood of dementia on longitudinal follow-up.4 Moreover, in one study approximately 12% of patients with dementia had comorbid depression.4 Because depression can either be a cause or a result of cognitive difficulties, routine screening for depression is indicated in patients with evidence of MCI.

Vitamin B12 deficiency. While B12 deficiency is common in the elderly and is associated with slight decrements in cognitive performance, there is no evidence that nondemented individuals with B12 deficiency are at higher risk of later developing dementia. Evidence for improvement of cognitive function with treatment of B12 deficiency is equivocal. Moreover, the prevalence of dementia caused by B12 deficiency appears to be very low.4

Hypothyroidism. Hypothyroidism, also common in the elderly, may cause cognitive deficits in nondemented individuals. While elevated TSH levels carried an increased risk for dementia in one study, the prevalence of dementia attributed solely to hypothyroidism was very low.4

Syphilis. There have been no reported cases of tertiary syphilis in North America in the last 20 years, and therefore syphilis screening in patients with cognitive impairment is not indicated unless the patient is at high risk, has a prior history of syphilis, or lives in one of the few areas in the United States with high rates of syphilis.4

Summary of comorbidities. Depression, B12 deficiency, and hypothyroidism are common in the elderly in general and in patients with cognitive difficulties in particular. Screening for and treatment of these disorders is indicated in patients with cognitive complaints, even though treatment may not completely reverse cognitive impairment. Syphilis screening is not indicated except in patients from high risk groups.4

QUESTION 4
What are important directions for future research?

Further studies are needed in order to clarify the distinctions between normal aging, MCI, and early dementia.4 The relationship between MCI and the various categories of dementia, such as Alzheimer's disease, vascular dementia, frontotemporal dementia, and Lewy body dementia are incompletely understood.2 Increased understanding of the pathophysiology of MCI may help to establish this relationship. Further assessment is also needed to determine whether treating MCI can prevent or delay the progression of dementia.2 Biomarkers are needed that can predict the development of dementia in asymptomatic patients or those with MCI in order to further guide early detection and treatment.4

Implications for Management
Awareness of the diagnostic implications described above can help to guide the care of patients with MCI. At present, there are no Food and Drug Administration-approved treatments for MCI. Several clinical trials have examined the efficacy of pharmacologic treatment in slowing the rate of progression from MCI to Alzheimer's disease. One randomized controlled trial found that treatment of MCI with a cholinesterase inhibitor reduced rates of progression to Alzheimer's disease at 12-month but not at 3-year follow-up. Trials of other agents have not shown a statistically significant treatment benefit.5

While there is no established benefit to pharmacologic treatment of patients with MCI, accurate diagnosis can guide the counseling and monitoring of these patients. Patients should be counseled that they are at risk for developing dementia, but that it is also possible that their condition will not progress. Patients with MCI may also benefit from lifestyle changes such as increasing exercise, intellectual activities, and social connectedness, and modifying their diet to optimize cardiovascular health.5 Moreover, there is evidence that spouses of patients with MCI exhibit elevated caregiver burden and symptoms of anxiety and depression, and therefore may benefit from supportive services.6 Finally, patients with MCI should be monitored clinically for signs of functional impairment consistent with dementia, so that treatment options for Alzheimer's disease may be considered as needed.

Conclusions
An increasing body of evidence suggests that MCI is a useful clinical construct, representing an intermediate state between normal aging and dementia. Patients diagnosed with MCI have an increased risk for progression to Alzheimer's disease. Use of clinical screening instruments and neuropsychological testing can improve recognition of MCI in patients with subjective memory complaints. These patients should also be screened for common comorbidities which may complicate the diagnosis. Patients with MCI should be counseled about the condition's implications, and healthy lifestyle modifications should be encouraged. Patients with MCI should be monitored closely for signs of progression to dementia. Further research is needed to further refine the diagnosis of MCI, better predict the risk of progression to dementia, and evaluate the utility of treatment to delay progression and improve quality of life.

References
  1. Portet F, Ousset PJ, Visser PJ, et al, Mild cognitive impairment (MCI) in medical practice: a critical review of the concept and new diagnostic procedure. Report of the MCI Working Group of the European Consortium on Alzheimer's Disease (EADC). Journal of Neurology, Neurosurgery, and Psychiatry, 2006; 77(6): 714-718.
  2. Petersen RC et al, Current Concepts in Mild Cognitive Impairment. Arch Neurol, 2001; 58: 1985-1992.
  3. Petersen RC et al, Practice parameter: Early detection of dementia: Mild cognitive impairment (an evidence-based review): Report of the Quality Standards Subcommittee of the American Academy of Neurology. Neurology, 2001; 56(9):1133-42.
  4. Knopman DS et al, Practice parameter: Diagnosis of dementia (an evidence-based review): Report of the Quality Standards Subcommittee of the American Academy of Neurology. Neurology, 2001; 56:1143-1153.
  5. Petersen RC, Mild Cognitive Impairment: Current Research and Clinical Implications. Seminars in Neurology, 2007; 27: 22-31.
  6. Small BJ et al, Early identification of cognitive deficits: Preclinical Alzheimer's disease and mild cognitive impairment. Geriatrics, 2007; 62: 19-23.

An Introduction to Restless Legs Syndrome

Richard P. Allen, PhD

Neurology & Neurosurgery.

Medscape 2005

Introduction

Although it was first described in the medical literature over 300 years ago, restless legs syndrome (RLS), a very common and potentially very disabling condition, was largely overlooked by the medical community until the last decade. After Willis'[1] description in Latin in 1672, it was rediscovered by Karl Ekbom,[2] who in 1945 not only documented its common occurrence but also coined the name "restless legs" to describe the condition. Despite this pioneering work, RLS was largely ignored for the next half-century. Several factors have contributed to the recent attention given to RLS: the medical community increasingly recognizes the importance of maintaining optimal health and quality of life; the average life span has dramatically increased; and sleep medicine has come into its own as a subspecialty. Although rarely if ever fatal, RLS can cause dramatic distress, greatly disturb quality of life, and profoundly disrupt sleep. The condition is more common in those over the age of 50. Finally, excellent treatments have recently been developed that can provide sometimes dramatic relief from the suffering that comes with RLS. Thus, today RLS is recognized as a common, medically significant disorder that, once diagnosed, can often be effectively treated.

Diagnosis

Although primarily a sensory abnormality, RLS nearly always produces a motor sign. The diagnosis of RLS depends almost entirely on the clinical history characterizing the sensory abnormality. Diagnosis requires meeting all 4 of the essential criteria established by the International RLS Study Group and the National Institutes of Health (NIH) RLS workshop.[3] The patient must experience an urge to move the legs (an akathisia focused on the legs). This urge is commonly accompanied by an unpleasant sensation (a paresthesia) usually described as a feeling of something active deep within the legs. Many patients will describe only the focal akathisia without any associated paresthesia. Because this is an urge to move the legs, it is a conscious sensory experience and not an unconscious habit, such as repetitive foot tapping, which can occur without the patient's being aware of the movement. The RLS focal akathisia is generally not present continuously throughout the day, but rather occurs episodically. Once present, it persists for several minutes, if not hours, unless the patient takes some effective countermeasure to reduce the symptoms. Although the symptoms are usually relieved by movement, they often return as soon as the movement is stopped. Patients with more severe symptoms will struggle to find an hour or so of rest at night when they can lie or sit down without being driven to move about or get up and walk.

The remaining 3 diagnostic criteria involve the circumstances in which the condition occurs. First, the symptoms must occur when the patient is at rest, either sitting or lying down. Usually, the longer the duration of rest and the more complete the lack of alerting stimulation, the more likely the symptoms will start. For a diagnosis of RLS, the symptoms must not start while the patient is walking. Second, the symptoms must be at least partly relieved by bodily movement, usually movement of the affected leg. They may also be relieved by other physical activities, such as rubbing or stretching the legs. Alerting mental activities, such as an argument or a mentally engaging task, may likewise provide some relief. The condition is generally at its worst in the borderland between sleep and waking. Most alerting stimulation will reduce the symptoms, including very hot or cold baths. Movement, particularly walking, and emotionally engaging social interactions are, of course, among the most alerting events and the most effective means of reducing symptoms. The specific features of the rest that induces the symptoms, or of the alerting activities that reduce them, vary from patient to patient, but generally speaking, quiescence induces and motor activity reduces the RLS symptoms. Third, the diagnostic criteria specify that the RLS symptoms must be worse in the evening and night than at other times of the day. Precisely when symptoms worsen in the evening or night varies considerably depending on the activities of the patient, but the patient should describe being able to stay at rest longer with fewer symptoms in the morning than in the evening. This circadian variation in symptoms often leaves the patient completely symptom-free during a daytime office examination, yet dreading the evening or night in anticipation of the torture of the sleep-depriving sensations.
The sidebar provides the 4 basic diagnostic criteria and some common descriptions of the sensory experience, which as noted, may be only the akathisia without any focal paresthesia.

The differential diagnosis of RLS includes leg cramps and positional discomfort. Both can lead to confusion in the diagnosis and need to be specifically ruled out for the diagnosis of RLS to be definitive. Positional discomfort symptoms occur only in one very specific body position and are relieved simply by changing positions without other movement. In contrast, RLS symptoms should occur in any position of rest during the evening or night, and relief should require some alerting activity, such as moving about or walking.

This primarily sensory disorder produces a motor sign characterized by periodic leg movements (PLMs) that occur approximately every 5-90 seconds when the patient is asleep or lying down resting. These PLMs in sleep, although not specific to RLS, occur in at least 80% of RLS patients and correlate with clinical ratings of RLS severity.[4,5] If the patient does not have PLMs, the differential diagnosis should be reviewed to ensure that the patient does not have another condition producing symptoms like those of RLS. When present, PLMs support the diagnosis, and the degree of PLMs provides a good objective measure of RLS severity and of likely response to treatment.

Risk Factors

Besides being a primary disorder, RLS also occurs secondary to other conditions, particularly iron-deficiency anemia, end-stage renal disease, and pregnancy. In these secondary cases of RLS, when the primary condition resolves, the RLS very often resolves, too. Several other conditions are associated with an increased risk of RLS, including rheumatoid arthritis and gastric surgery. Neuropathy has been associated with increased severity and onset of RLS, but has not in itself clearly been identified as a risk factor for developing RLS.

Clinical Profile

Although primary RLS can start at any age, 2 broad phenotypes have been identified according to age of onset. Early-onset RLS, which is defined as RLS that occurs before age 45, is characterized by insidiously advancing symptoms slowly progressing over several years, if not most of the patient's life.[6] First-degree relatives of patients with this type of RLS have a 5-fold higher risk for RLS than the general population.[7] Once RLS starts, it may have periods of remission and in some cases no recurrence, but in the great majority of cases, it is a persisting lifelong disorder.[3]

The prevalence of RLS has recently been evaluated in a few population-based studies using the full diagnostic criteria. A German study based on direct diagnostic interviews reported a population prevalence of 10.6% (7.6% for men and 13.4% for women).[8] Two Swedish population studies using questionnaires similarly showed a higher prevalence for women (13.9%) than men (6.1%).[9,10] A carefully conducted multinational population study in 5 European countries and the United States with a validated questionnaire reported a slightly lower overall prevalence of 7.2%, again with a lower prevalence in men (5.4%) than in women (9.0%). That study also identified the prevalence of more severe RLS, defined as at least moderately distressing symptoms occurring 3 or more times a week. The prevalence of this more severe RLS was 2.7% (men 1.7%, women 3.7%).[11] Prevalence appears to be lower in Turkey (3.2%)[12] and lower still in the urban Chinese population of Singapore (0.6%).[13]

Etiology

Because all of the secondary causes of RLS noted above involve problems with iron balance, it has been postulated that brain iron insufficiency may be a major cause of RLS. Studies of cerebrospinal fluid have demonstrated abnormally reduced ferritin and increased transferrin in RLS patients despite normal levels of these iron measures in serum.[14,15] MRI studies of regional brain iron have shown decreased iron content in the substantia nigra, and this decrease correlates with RLS severity.[16] Finally, autopsy studies have found reduced iron, increased transferrin, and reduced H-ferritin, but not L-ferritin, in the neuromelanin cells of the substantia nigra.[17,18] Thus, it seems likely that one primary cause of RLS is brain iron insufficiency.[19]

Treatment

The serendipitous finding by Akpinar[20] that a relatively small dose of levodopa can provide dramatic and nearly complete relief of even very severe RLS symptoms has led to the general assumption that RLS also involves a dopaminergic dysfunction.[21] This notion has been supported by numerous other studies of pharmacologic response. All dopamine agonists evaluated to date have reduced RLS symptoms and conversely the dopamine antagonists tested have exacerbated symptoms. Moreover, dopamine pathology has been found to occur with iron deficiency in animals,[22] and thus it seems likely that the brain iron deficiency seen in some RLS sufferers produces a dopamine abnormality causing the RLS symptoms.[23,24] This putative iron-dopamine pathology for RLS is unlikely to be the only cause of RLS, but at present it is the best documented cause.

RLS has been found to produce chronic sleep loss more severe than in almost any other condition,[4] and it impairs quality of life as much as or more than other chronic diseases, such as depression, congestive heart failure, and hypertension.[25] Fortunately, current treatments, mostly dopamine agonists, provide reasonably good improvement in sleep[26] and in quality of life.[27] One dopamine agonist, ropinirole, has recently been approved by the US Food and Drug Administration (FDA) for the treatment of moderate-to-severe RLS. Unfortunately, some patients develop a worsening or augmentation of their underlying RLS while on dopaminergic treatment, which limits the use of these medications for this minority of patients. Other drugs, particularly opiates and some anticonvulsants, such as gabapentin, also provide reasonable treatment for RLS and can be used instead of or in conjunction with a dopamine agonist.

Conclusion

RLS, a distressing sensorimotor disorder with some known biological causes, can be diagnosed and usually effectively treated in primary care, with referral to a sleep specialist or neurologist when needed for difficult cases. Further understanding of the causes of RLS may eventually lead to even better treatments or even to the prevention of the disorder.

References

  1. Willis T. De Animae Brutorum. London, United Kingdom: Wells and Scott; 1672.
  2. Ekbom KA. Restless legs: a clinical study. Acta Med Scand (Supp). 1945;158:1-122.
  3. Allen RP, Picchietti D, Hening WA, Trenkwalder C, Walters AS, Montplaisi J. Restless legs syndrome: diagnostic criteria, special considerations, and epidemiology. A report from the restless legs syndrome diagnosis and epidemiology workshop at the National Institutes of Health. Sleep Med. 2003;4:101-119. Abstract
  4. Allen RP, Earley CJ. Validation of the Johns Hopkins Restless Legs Severity Scale (JHRLSS). Sleep Med. 2001;2:239-242. Abstract
  5. Garcia-Borreguero D, Larrosa O, de la Llave Y, Jose Granizo J, Allen R. Correlation between rating scales and sleep laboratory measurements in restless legs syndrome. Sleep Med. 2004;5:561-565. Abstract
  6. Allen RP, Earley CJ. Defining the phenotype of the restless legs syndrome (RLS) using age-of-symptom-onset. Sleep Med. 2000;1:11-19. Abstract
  7. Allen RP, La Buda MC, Becker P, Earley CJ. Family history study of the restless legs syndrome. Sleep Med. 2002;3(suppl):S3-S7. Abstract
  8. Berger K, Luedemann J, Trenkwalder C, John U, Kessler C. Sex and the risk of restless legs syndrome in the general population. Arch Intern Med. 2004;164:196-202. Abstract
  9. Ulfberg J, Nystrom B, Carter N, Edling C. Restless legs syndrome among working-aged women. Eur Neurol. 2001;46:17-19. Abstract
  10. Ulfberg J, Nystrom B, Carter N, Edling C. Prevalence of restless legs syndrome among men aged 18 to 64 years: an association with somatic disease and neuropsychiatric symptoms. Mov Disord. 2001;16:1159-1163. Abstract
  11. Allen R, Walters A, Montplaisir J, et al. Restless legs syndrome prevalence and impact: REST population-based study. Arch Intern Med. In press.
  12. Sevim S, Dogu O, Camdeviren H, et al. Unexpectedly low prevalence and unusual characteristics of RLS in Mersin, Turkey. Neurology. 2003;61:1562-1569. Abstract
  13. Tan EK, Seah A, See SJ, Lim E, Wong MC, Koh KK. Restless legs syndrome in an Asian population: a study in Singapore. Mov Disord. 2001;16:577-579. Abstract
  14. Earley CJ, Hyland K, Allen RP. CSF dopamine, serotonin, and biopterin metabolites in patients with restless legs syndrome. Mov Disord. 2001;16:144-149. Abstract
  15. Mizuno S, Mihara T, Miyaoka T, Inagaki T, Horiguchi J. CSF iron, ferritin and transferrin levels in restless legs syndrome. J Sleep Res. 2005;14:43-47. Abstract
  16. Allen RP, Barker PB, Wehrl F, Song HK, Earley CJ. MRI measurement of brain iron in patients with restless legs syndrome. Neurology. 2001;56:263-265. Abstract
  17. Connor JR, Boyer PJ, Menzies SL, Dellinger B, Allen RP, Earley CJ. Neuropathological examination suggests impaired brain iron acquisition in restless legs syndrome. Neurology. 2003;61:304-309. Abstract
  18. Connor JR, Wang XS, Patton SM, et al. Decreased transferrin receptor expression by neuromelanin cells in restless legs syndrome. Neurology. 2004;62:1563-1567. Abstract
  19. Allen RP, Earley CJ. Restless legs syndrome: a review of clinical and pathophysiologic features. J Clin Neurophysiol. 2001;18:128-147. Abstract
  20. Akpinar S. Treatment of restless legs syndrome with levodopa plus benserazide [letter]. Arch Neurol. 1982;39:739.
  21. Montplaisir J, Godbout R, Poirier G, Bedard MA. Restless legs syndrome and periodic movements in sleep: physiopathology and treatment with L-dopa. Clin Neuropharmacol. 1986;9:456-463. Abstract
  22. Beard J, Erikson KM, Jones BC. Neonatal iron deficiency results in irreversible changes in dopamine function in rats. J Nutr. 2003;133:1174-1179. Abstract
  23. Allen R. Dopamine and iron in the pathophysiology of restless legs syndrome (RLS). Sleep Med. 2004;5:385-391. Abstract
  24. Earley CJ, Allen RP, Beard JL, Connor JR. Insight into the pathophysiology of restless legs syndrome. J Neurosci Res. 2000;62:623-628. Abstract
  25. Abetz L, Allen R, Follet A, et al. Evaluating the quality of life of patients with restless legs syndrome. Clin Ther. 2004;26:925-935. Abstract
  26. Allen R, Becker PM, Bogan R, et al. Ropinirole decreases periodic leg movements and improves sleep parameters in patients with restless legs syndrome. Sleep. 2004;27:907-914. Abstract
  27. Trenkwalder C, Garcia-Borreguero D, Montagna P, et al. Ropinirole in the treatment of restless legs syndrome: results from the TREAT RLS 1 study, a 12 week, randomised, placebo controlled study in 10 European countries. J Neurol Neurosurg Psychiatry. 2004;75:92-97. Abstract

Sidebar

The diagnosis of RLS requires meeting all 4 of the following criteria[2]:

  • An urge to move the legs occurs, usually accompanied or caused by uncomfortable and unpleasant sensations in the legs;
  • The urge to move or unpleasant sensations begin or worsen during periods of rest or inactivity, such as lying or sitting;
  • The urge to move or unpleasant sensations are partially or totally relieved by movement, such as walking or stretching, at least as long as the activity continues; and
  • The urge to move or unpleasant sensations are worse in the evening or night than during the day, or only occur in the evening or night.
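
Because the diagnosis demands that all four criteria be met at once, the decision rule is a simple conjunction. A purely illustrative sketch (the function name and criterion labels are our own shorthand, not a validated instrument):

```python
# Illustrative only: RLS diagnosis requires ALL four criteria,
# so the decision rule is a conjunction over the set of findings.
CRITERIA = [
    "urge to move the legs, usually with unpleasant leg sensations",
    "symptoms begin or worsen during rest or inactivity",
    "symptoms partially or totally relieved by movement",
    "symptoms worse in the evening or night than during the day",
]

def meets_rls_criteria(findings: set) -> bool:
    """Return True only if every one of the four criteria is present."""
    return all(criterion in findings for criterion in CRITERIA)

print(meets_rls_criteria(set(CRITERIA)))      # all four present -> True
print(meets_rls_criteria(set(CRITERIA[:3])))  # only three present -> False
```

A patient missing even one criterion does not meet the diagnosis; there is no partial credit in the rule itself.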

The following are examples of the common subjective reports of the sensations in the legs given by RLS patients:

  • "Creepy, crawly";
  • "Worms crawling in veins";
  • "Pepsi-Cola in the veins";
  • "Nervous feet";
  • "Itchy bones";
  • "Crazy legs or heebie-jeebies";
  • "Elvis legs";
  • "Toothache feeling -- can't leave it alone";
  • "Electric-like shocks";
  • "Excited nerves"; and
  • " Just need to move ."

Patients complain of pain in as many as 35% of clinical cases.

Funding Information

Supported by an independent educational grant from GlaxoSmithKline.

Richard P. Allen, PhD , Assistant Professor of Neurology, The Johns Hopkins University, Baltimore, Maryland; Co-Director, Johns Hopkins Sleep Disorders Center, Baltimore, Maryland

Disclosure: Richard P. Allen, PhD, has disclosed that he has received grants for clinical research from GlaxoSmithKline, and has served as an advisor or consultant for GlaxoSmithKline, Boehringer Ingelheim, Sepracor, Orion Pharmaceuticals, and Schwarz Pharma. Dr. Allen has also disclosed that he has received travel support from GlaxoSmithKline, Boehringer Ingelheim, Sepracor, Orion Pharmaceuticals, and Schwarz Pharma.

Learning, Your Memory, and Cholesterol

July, 2005
By Chris Masterjohn

One of the many important roles cholesterol plays in the body is in our nervous system, enabling learning and memory to take place. In fact, one of the reasons that sleep is beneficial to our learning and memory is that it enables our brain to make more cholesterol!

While the war on cholesterol is waged full-speed ahead, and many web sites are now touting low-fat, low-cholesterol diets as "brain-healthy" due to unfortunate misinterpretations about the role of a cholesterol byproduct in Alzheimer's disease, science is continually showing that cholesterol is one of the most important parts of our brains.

Sleep, Memory, Learning... and Cholesterol

Evidence to date strongly supports the concept that sleep plays an important role in increasing performance of newly learned activities, consolidating memories, and increasing brain plasticity-- the ability to form new connections between neurons, called synapses, as well as to break them.

These benefits of sleep are not simply the absence of stress from sleep deprivation; sleep plays an independent, critical role in the actual process of learning and memory formation.1

But why?

Exciting research published in the pages of Neuron in 2004 identified about 100 genes that increase their activity during sleep.2 The researchers found about as many that increase their activity during waking, and others whose activity varied with circadian rhythm, independent of sleep or wakefulness.

While many important cellular and molecular events happen during sleep, and we are only scratching the surface in our understanding of them, one thing this study showed is that cholesterol synthesis increases during sleep-- which, given the research described below, is almost certainly part of the reason sleep is beneficial to mental functioning.

Cholesterol is abundant in the tissue of the brain and nervous system. Myelin, which covers nerve axons to help conduct the electrical impulses that make movement, sensation, thinking, learning, and remembering possible, is over one fifth cholesterol by weight.3

Even though the brain only makes up 2% of the body's weight, it contains 25% of its cholesterol.4

One of the groups of genes that the above study found to be upregulated during sleep were genes important for the synthesis and maintenance of myelin, including myelin structural proteins and genes relating to the synthesis and transport of cholesterol.

But the benefits of cholesterol extend beyond both sleep and myelin. In fact, in 2001, cholesterol was found to be the most important factor in the formation of synapses, the basis of our learning and memory.

Memories and Learning are Directly Dependent on Cholesterol

In the late 1990s and early 2000s, research was pointing to an unknown compound made by glial cells that was responsible for the ability of neurons to form synapses, or connections between each other.

Thoughts, memories, learning, and all mental functions depend on the formation of synapses, so the ability to form them directly impacts mental functioning and health.

In the absence of this-- as yet unknown-- "glial factor," neurons formed few synapses, and the synapses they formed were inefficient and poorly functioning. In the presence of glial cells, which secrete the unknown factor, neurons formed many, highly efficient synapses.

So what is this "glial factor"?

Research in 2001, by Mauch, et al., published in volume 294 of Science magazine, determined that the unknown glial factor is cholesterol, which is released by the glial cells in a carrier called "apolipoprotein E."5

Initially, the researchers thought that apolipoprotein E (apoE) itself might be the glial factor. But it turned out that when neurons were treated with apoE alone, the beneficial effects on synapse formation were not observed.

The researchers then reasoned that, since apoE fit the bill in some ways, but did not have the desired effect, some of the lipids it carried may have been the elusive glial factor.

As it turned out, treating the neurons with a 10 mcg/mL solution of cholesterol increased synapse formation by 12 times! Other lipids, carried by apoE, such as phosphatidylcholine and sphingomyelin, did not have a significant effect, and were even toxic to the neurons at very high doses.

On the other hand, when low-cholesterol glial secretions were produced by using the cholesterol-lowering drug, mevastatin, the effect of the glial secretion on synapse formation was strongly diminished. When cholesterol was added back to the low-cholesterol secretion, the positive effect on synapse formation was fully restored.

The authors identified cholesterol as a limiting factor of synapse formation. In other words, the need for cholesterol in the brain is large enough relative to the supply of cholesterol that the availability of cholesterol can directly limit the ability to form synapses.

Neurites Lose Their Way Without Cholesterol

"Neurites" refer to the extensions from the cell of a neuron that connect with other neurons or muscles. The type of neurite that sends impulses away from the cell is an axon, and the type of neurite that receives impulses is a dendrite.

Connections between neurons, called synapses, are constantly being formed and broken in our brains, where dendrites and axons meet.

But they can't just grow randomly. Neurites grow in response to a stimulus given by a signaling protein in the membranes of neurons.

These signaling proteins rely on lipid rafts, which are beds of cholesterol and phospholipids made from long-chain saturated fatty acids that secure some proteins.

A 2004 study found that disrupting lipid rafts by extracting some of the cholesterol from the membrane of a neuron completely destroyed the ability of neurons to find the signaling proteins attracting them!6

Statins Could Kill Your Memory -- Eggs Could Cure It

We now know that the formation of synapses, or connections between neurons, is directly dependent on the availability of cholesterol.

The formation of these synapses is what gives us the ability to remember and learn. The benefits of sleep for memory formation and learning are in part a result of increased cholesterol synthesis during sleep.

The implications are important and powerful. In our society's quest to lower cholesterol at all costs and without second thought, could some of the methods we use, such as taking cholesterol-lowering drugs, or eating low-fat, low-cholesterol diets, be limiting the availability of cholesterol to our nervous system?

The authors of the 2001 Science study described above concluded that the "results imply that genetic or age-related defects in the synthesis, transport, or uptake of cholesterol in the CNS may directly impair the development and plasticity of the synaptic circuitry."

But the authors left out one thing: in addition to genetic or age-related defects, there is currently a booming industry founded upon the deliberate inhibition of cholesterol synthesis using pharmaceutical drugs. Some statins cross the blood-brain barrier (BBB); others may affect cholesterol levels in the brain without crossing the BBB; and even without crossing the BBB, such drugs could likewise damage the peripheral nervous system.

In fact, amnesia and cognitive dysfunction are reported as side effects in some statin users. Duane Graveline, MD, a former NASA scientist and astronaut, describes his own bout of statin-induced memory loss, in which his wife caught him wandering aimlessly in his yard while he failed to recognize her.

In an article on the side-effects of cholesterol-lowering drugs, the Geriatric Times cites two randomized trials, several case reports, and one large case series pointing to memory loss as a result of statin drugs.

Conversely, dietary cholesterol can help reverse the effects of declining memory with age.

Mary Enig, PhD, cites a study in her book, Know Your Fats, that found that the cholesterol in eggs can help improve memory in the elderly.7

Remember...

One of the basic parts of ourselves that defines us as humans is our mind. We imagine, think, study, learn, remember, come up with new ideas, remember old faces, and form friendships and familial relationships that are based, in large part, on our memories of those people.

Cholesterol is a central building block of the connections within our brain that hold these memories and learning processes together. Remember that... thanks to cholesterol, you can!

Endnotes

1. Walker, et al., "Sleep-Dependent Learning and Memory Consolidation" (Review), Neuron, Vol. 44, 121-133, September 30, 2004.

2. Cirelli, et al., "Extensive and Divergent Effects of Sleep and Wakefulness on Brain Gene Expression," Neuron, Vol. 41, 35-43, January 8, 2004.

3. Alberts, et al., Molecular Biology of the Cell: Fourth Edition, New York: Garland Science, 2002, p589F.

4. Dietschy JM, Turley SD, "Cholesterol metabolism in the brain," Curr Opin Lipidol, 2001, 12: 105-112.

5. Mauch, et al., "CNS Synaptogenesis Promoted by Glia-Derived Cholesterol," Science, Vol. 294, 1354-1357, 2001.

6. Guirland, et al., "Lipid Rafts Mediate Chemotropic Guidance of Nerve Growth Cones," Neuron, Vol. 42, 51-62, April 8, 2004.

7. Enig, Mary G., PhD, Know Your Fats: The Complete Primer for Understanding the Nutrition of Fats, Oils, and Cholesterol, Silver Spring: Bethesda Press, 2000, p. 57.

Tuesday, October 9, 2007

Crime and Punishment: Why Do We Conform to Society?

Scientific American
October 05, 2007

By Nikhil Swaminathan

A pair of brain regions work together to assess the threat of punishment and override our selfish tendencies

Whether you subscribe to the Ten Commandments, the Golden Rule or some instinctive moral code, society functions largely because most of its denizens adhere to a set of norms that allow them to live together in relative tranquility.

But, why is it that we put a vast amount of social resources into keeping stealing, murdering and other unfair (not to mention violent and illegal) acts to a minimum? Seems it all comes down to the fact that most of us don't cotton to being punished by our peers.

"The reason why punishment for norm violations is important is that it disciplines the potential norm violators," says Ernst Fehr, an economist at the University of Zurich and the senior author of a paper on the issue published this week in Neuron.

In the new study, Fehr and colleagues uncovered activity in two areas of the brain underlying the neural mechanism involved in conforming to society's values. They further determined that subjects with Machiavellian personalities—a strong sense of self-interest, opportunism and manipulation—have heightened activity in one of these regions, which the authors believe is related to assessing the threat of punishment.

During the study, which also involved scientists at the University of Ulm in Germany, 23 male students were instructed to play a version of the "ultimatum game" while their brains were scanned via functional magnetic resonance imaging (fMRI). Each participant was given a sum of money (100 monetary units) to split however he chose with an anonymous partner. In some cases the recipient simply had to accept any offer made. Other times, after an offer was made, the recipient had the option to penalize the giver by taking some or all of his money if the latter had not shared generously.
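
The payoff structure of those trials can be sketched in a few lines of purely illustrative Python; the function name, the punishment amount, and the exact deduction rule are our own simplifications, not the study's actual parameters:

```python
# Illustrative sketch (not the study's code): one trial of the ultimatum
# game variant described above, where the recipient may penalize a
# stingy giver by taking away some or all of the giver's share.
ENDOWMENT = 100  # monetary units handed to the giver at the start

def play_trial(offer: int, punishment_allowed: bool, punish_amount: int = 0):
    """Return (giver_payoff, recipient_payoff) for a single trial."""
    giver = ENDOWMENT - offer      # giver keeps whatever he does not offer
    recipient = offer
    if punishment_allowed and punish_amount > 0:
        giver = max(giver - punish_amount, 0)  # punishment cannot go negative
    return giver, recipient

# No punishment option: a selfish giver keeps nearly everything.
print(play_trial(offer=10, punishment_allowed=False))               # (90, 10)
# With punishment: an ungenerous offer can be wiped out entirely.
print(play_trial(offer=10, punishment_allowed=True, punish_amount=90))  # (0, 10)
```

The asymmetry between the two calls is exactly what made the Machiavellian subjects interesting: they offered little in the first condition and just enough to avoid punishment in the second.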

The subjects' brains were only scanned when they played the giver role. Before each trial, both players were told whether the recipient would be allowed to exact a punishment if he felt he got too slim a slice of the pie. Two areas of the cortex (the brain's primary processing unit) were particularly active during the trials when punishment was an option: the lateral orbitofrontal cortex, a region below the temples of the head that had, in previous research, been implicated in processing a threat stimulus, and a section just behind it called the dorsolateral prefrontal cortex.

"The lateral orbitofrontal cortex [activity] represents the punishment threat here," says Fehr, citing previous research that fingered it in threat assessment. "More specifically, how bad does the brain interpret this punishment threat?"

Alternatively, he says, "[the dorsolateral prefrontal cortex] is an area that is involved in cognitive control and overriding prepotent impulses. Here, we have a design where the prepotent impulse is not to share the money—at least to the extent that player B wants it shared."

Interestingly, the research team also had their subjects fill out a questionnaire to determine their degree of Machiavellian behavior. Those who proved to be the most ruthless of the bunch offered little to nothing when there was no threat of punishment, but within the punishment paradigm, they were generous enough to stave off retribution.

"These are socially intelligent, selfish people," Fehr says about the more calculating subjects. "They escape the punishments that are inherent in social interactions, because they seem to have a fine sense of when punishment is in the air."

Jorge Moll, principal investigator of the cognitive and behavioral neuroscience unit at the Rede Labs-D'Or Hospitals in Rio de Janeiro, says the most interesting findings were that individual scores on Machiavellianism predicted "how much a given subject will change his behavior depending on the presence of punishment," and "that the level of activity within the lateral orbitofrontal cortex is strongly related to Machiavellian personality style."


Researchers say the results could have wide-reaching implications, potentially paving the way to understand—and perhaps one day reverse—the neurobiology behind psychopathic and sociopathic personalities. They intend to repeat the study with patients suffering from antisocial anxiety and personality disorders to determine if their behavior can be explained by a lack of impulse control or a poor assessment of punishment.

Fehr argues the results could also impact the criminal justice system since the dorsolateral prefrontal cortex does not fully develop until after a person is around 20 years old.

"This area seems to be critically important in overriding self-interest," he says. Thus, "you just can't treat an immature adolescent the same way as a mature adult—that's at least my view of doing justice." It's unclear whether judges and juries see it that way, however.

Monday, October 8, 2007

Instant Expert: The Human Brain

11:58 04 September 2006
NewScientist.com news service
Helen Philips

The Human Brain - With one hundred billion nerve cells, the complexity is mind-boggling.

The brain is the most complex organ in the human body. It produces our every thought, action, memory, feeling and experience of the world. This jelly-like mass of tissue, weighing in at around 1.4 kilograms, contains a staggering one hundred billion nerve cells, or neurons.

The complexity of the connectivity between these cells is mind-boggling. Each neuron can make contact with thousands or even tens of thousands of others, via tiny structures called synapses. Our brains form a million new connections for every second of our lives. The pattern and strength of the connections is constantly changing and no two brains are alike.

It is in these changing connections that memories are stored, habits learned and personalities shaped, by reinforcing certain patterns of brain activity, and losing others.

Grey matter

While people often speak of their "grey matter", the brain also contains white matter. The grey matter is the cell bodies of the neurons, while the white matter is the branching network of thread-like tendrils - called dendrites and axons - that spread out from the cell bodies to connect to other neurons.

But the brain also has another, even more numerous type of cell, called glial cells. These outnumber neurons ten times over. Once thought to be support cells, they are now known to amplify neural signals and to be as important as neurons in mental calculations. There are many different types of neuron, only one of which is unique to humans and the other great apes: the so-called spindle cells.

Brain structure is shaped partly by genes, but largely by experience. Only relatively recently was it discovered that new brain cells are born throughout our lives - a process called neurogenesis. The brain has bursts of growth and then periods of consolidation, when excess connections are pruned. The most notable bursts come in the first two or three years of life, during puberty, and in a final burst in young adulthood.

How a brain ages likewise depends on genes and lifestyle. Exercising the brain and giving it the right diet can be just as important as it is for the rest of the body.

Chemical messengers

The neurons in our brains communicate in a variety of ways. Signals pass between them by the release and capture of neurotransmitter and neuromodulator chemicals, such as glutamate, dopamine, acetylcholine, noradrenalin, serotonin and endorphins.

Some neurochemicals work in the synapse, passing specific messages from release sites to collection sites, called receptors. Others also spread their influence more widely, like a radio signal, making whole brain regions more or less sensitive.

These neurochemicals are so important that deficiencies in them are linked to certain diseases. For example, a loss of dopamine in the basal ganglia, which control movements, leads to Parkinson’s disease. It can also increase susceptibility to addiction because it mediates our sensations of reward and pleasure.

Similarly, a deficiency in serotonin, used by regions involved in emotion, can be linked to depression or mood disorders, and the loss of acetylcholine in the cerebral cortex is characteristic of Alzheimer’s disease.

Brain scanning

Within individual neurons, signals are formed by electrochemical pulses. Collectively, this electrical activity can be detected outside the scalp by an electroencephalogram (EEG).

These signals have wave-like patterns, which scientists classify from alpha (common while we are relaxing or sleeping), through to gamma (active thought). When this activity goes awry, it is called a seizure. Some researchers think that synchronising the activity in different brain regions is important in perception.

Other ways of imaging brain activity are indirect. Functional magnetic resonance imaging (fMRI) or positron emission tomography (PET) monitor blood flow. MRI scans, computed tomography (CT) scans and diffusion tensor images (DTI) use the magnetic signatures of different tissues, X-ray absorption, or the movement of water molecules in those tissues, to image the brain.

These scanning techniques have revealed which parts of the brain are associated with which functions. Examples include activity related to sensations, movement, libido, choices, regrets, motivations and even racism. However, some experts argue that we put too much trust in these results and that they raise privacy issues.

Before scanning techniques were common, researchers relied on patients with brain damage caused by strokes, head injuries or illnesses to determine which brain areas are required for certain functions. This approach exposed the regions connected to emotions, dreams, memory, language and perception, and to even more enigmatic events, such as religious or "paranormal" experiences.

One famous example was the case of Phineas Gage, a 19th century railroad worker who lost part of the front of his brain when a 1-metre-long iron pole was blasted through his head during an explosion. He recovered physically, but was left with permanent changes to his personality, showing for the first time that specific brain regions are linked to different processes.

Structure in mind

The most obvious anatomical feature of our brains is the undulating surface of the cerebrum - the deep clefts are known as sulci and its folds are gyri. The cerebrum is the largest part of our brain and is largely made up of the two cerebral hemispheres. It is the most evolutionarily recent brain structure, dealing with more complex cognitive brain activities.

It is often said that the right hemisphere is more creative and emotional and the left deals with logic, but the reality is more complex. Nonetheless, the sides do have some specialisations, with the left dealing with speech and language, the right with spatial and body awareness.


Further anatomical divisions of the cerebral hemispheres are the occipital lobe at the back, devoted to vision, and the parietal lobe above that, dealing with movement, position, orientation and calculation.

Behind the ears and temples lie the temporal lobes, dealing with sound and speech comprehension and some aspects of memory. And to the fore are the frontal and prefrontal lobes, often considered the most highly developed and most "human" of regions, dealing with the most complex thought, decision making, planning, conceptualising, attention control and working memory. They also deal with complex social emotions such as regret, morality and empathy.

Another way to classify the regions is as sensory cortex and motor cortex, controlling incoming information, and outgoing behaviour respectively.

Below the cerebral hemispheres, but still referred to as part of the forebrain, is the cingulate cortex, which deals with directing behaviour and pain. And beneath this lies the corpus callosum, which connects the two sides of the brain. Other important areas of the forebrain are the basal ganglia, responsible for movement, motivation and reward.

Urges and appetites

Beneath the forebrain lie more primitive brain regions. The limbic system, common to all mammals, deals with urges and appetites. Emotions are most closely linked with structures called the amygdala, caudate nucleus and putamen. Also in the limbic brain are the hippocampus - vital for forming new memories; the thalamus - a kind of sensory relay station; and the hypothalamus, which regulates bodily functions via hormone release from the pituitary gland.

The back of the brain has a highly convoluted and folded swelling called the cerebellum, which stores patterns of movement, habits and repeated tasks - things we can do without thinking about them.

The most primitive parts, the midbrain and brain stem, control the bodily functions we have no conscious control of, such as breathing, heart rate, blood pressure, sleep patterns, and so on. They also control signals that pass between the brain and the rest of the body, through the spinal cord.

Though we have discovered an enormous amount about the brain, huge and crucial mysteries remain. One of the most important is how the brain produces our conscious experiences.

The vast majority of the brain’s activity is subconscious. But our conscious thoughts, sensations and perceptions - what define us as humans - cannot yet be explained in terms of brain activity.

Sunday, October 7, 2007

Searching for God in the Brain

By David Biello
Scientific American Mind

October 2007 Issue

Researchers are unearthing the roots of religious feeling in the neural commotion that accompanies the spiritual epiphanies of nuns, Buddhists and other people of faith.

The doughnut-shaped machine swallows the nun, who is outfitted in a plain T-shirt and loose hospital pants rather than her usual brown habit and long veil. She wears earplugs and rests her head on foam cushions to dampen the device’s roar, as loud as a jet engine. Supercooled giant magnets generate intense fields around the nun’s head in a high-tech attempt to read her mind as she communes with her deity.

The Carmelite nun and 14 of her Catholic sisters have left their cloistered lives temporarily for this claustrophobic blue tube that bears little resemblance to the wooden prayer stall or sparse room where such mystical experiences usually occur. Each of these nuns answered a call for volunteers “who have had an experience of intense union with God” and agreed to participate in an experiment devised by neuroscientist Mario Beauregard of the University of Montreal. Using functional magnetic resonance imaging (fMRI), Beauregard seeks to pinpoint the brain areas that are active while the nuns recall the most powerful religious epiphany of their lives, a time they experienced a profound connection with the divine. The question: Is there a God spot in the brain?

The spiritual quest may be as old as humankind itself, but now there is a new place to look: inside our heads. Using fMRI and other tools of modern neuroscience, researchers are attempting to pin down what happens in the brain when people experience mystical awakenings during prayer and meditation or during spontaneous utterances inspired by religious fervor.

Such efforts to reveal the neural correlates of the divine—a new discipline with the warring titles “neurotheology” and “spiritual neuroscience”—not only might reconcile religion and science but also might help point to ways of eliciting pleasurable otherworldly feelings in people who do not have them or who cannot summon them at will. Because of the positive effect of such experiences on those who have them, some researchers speculate that the ability to induce them artificially could transform people’s lives by making them happier, healthier and better able to concentrate. Ultimately, however, neuroscientists study this question because they want to better understand the neural basis of a phenomenon that plays a central role in the lives of so many. “These experiences have existed since the dawn of humanity. They have been reported across all cultures,” Beauregard says. “It is as important to study the neural basis of [religious] experience as it is to investigate the neural basis of emotion, memory or language.”

Mystical Misfirings
Scientists and scholars have long speculated that religious feeling can be tied to a specific place in the brain. In 1892 textbooks on mental illness noted a link between “religious emotionalism” and epilepsy. Nearly a century later, in 1975, neurologist Norman Geschwind of the Boston Veterans Administration Hospital first clinically described a form of epilepsy in which seizures originate as electrical misfirings within the temporal lobes, large sections of the brain that sit over the ears. Epileptics who have this form of the disorder often report intense religious experiences, leading Geschwind and others, such as neuropsychiatrist David Bear of Vanderbilt University, to speculate that localized electrical storms in the brain’s temporal lobe might sometimes underlie an obsession with religious or moral issues.

Exploring this hypothesis, neuroscientist Vilayanur S. Ramachandran of the University of California, San Diego, asked several of his patients who have temporal lobe epilepsy to listen to a mixture of religious, sexual and neutral words while he tested the intensity of their emotional reactions using a measure of arousal called the galvanic skin response, a fluctuation in the electrical resistance of the skin. In 1998 he reported in his book Phantoms in the Brain (William Morrow), co-authored with journalist Sandra Blakeslee, that the religious words, such as “God,” elicited an unusually large emotional response in these patients, indicating that people with temporal lobe epilepsy may indeed have a greater propensity toward religious feeling.

The key, Ramachandran speculates, may be the limbic system, which comprises interior regions of the brain that govern emotion and emotional memory, such as the amygdala and hypothalamus. By strengthening the connection between the temporal lobe and these emotional centers, epileptic electrical activity may spark religious feeling.

To seal the case for the temporal lobe’s involvement, Michael Persinger of Laurentian University in Ontario sought to artificially re-create religious feelings by electrically stimulating that large subdivision of the brain. So Persinger created the “God helmet,” which generates weak electromagnetic fields and focuses them on particular regions of the brain’s surface.

In a series of studies conducted over the past several decades, Persinger and his team have trained their device on the temporal lobes of hundreds of people. In doing so, the researchers induced in most of them the experience of a sensed presence—a feeling that someone (or a spirit) is in the room when no one, in fact, is—or of a profound state of cosmic bliss that reveals a universal truth. During the three-minute bursts of stimulation, the affected subjects translated this perception of the divine into their own cultural and religious language—terming it God, Buddha, a benevolent presence or the wonder of the universe.

Persinger thus argues that religious experience and belief in God are merely the results of electrical anomalies in the human brain. He opines that the religious bents of even the most exalted figures—for instance, Saint Paul, Moses, Muhammad and Buddha—stem from such neural quirks. The popular notion that such experiences are good, argues Persinger in his book Neuropsychological Bases of God Beliefs (Praeger Publishers, 1987), is an outgrowth of psychological conditioning in which religious rituals are paired with enjoyable experiences. Praying before a meal, for example, links prayer with the pleasures of eating. God, he claims, is nothing more mystical than that.

Expanded Horizons
Although a 2005 attempt by Swedish scientists to replicate Persinger’s God helmet findings failed, researchers are not yet discounting the temporal lobe’s role in some types of religious experience. After all, not all such experiences are the same. Some arise from following a specific religious tradition, such as the calm Catholics feel when saying the rosary. Others bring a person into a perception of contact with the divine. Yet a third category might be mystical states that reveal fundamental truths opaque to normal consciousness. Thus, it is possible that different religious feelings arise from distinct locations in the brain. Individual differences might also exist. In some people, the neural seat of religious feeling may lie in the temporal lobe, whereas in others it could reside elsewhere.

Indeed, University of Pennsylvania neuroscientist Andrew Newberg and his late colleague, Eugene d’Aquili, have pointed to the involvement of other brain regions in some people under certain circumstances. Instead of artificially inducing religious experience, Newberg and d’Aquili used brain imaging to peek at the neural machinery at work during traditional religious practices. In this case, the scientists studied Buddhist meditation, a set of formalized rituals aimed at achieving defined spiritual states, such as oneness with the universe.

When the Buddhist subjects reached their self-reported meditation peak, a state in which they lose their sense of existence as separate individuals, the researchers injected them with a radioactive isotope that is carried by the blood to active brain areas. The investigators then photographed the isotope’s distribution with a special camera—a technique called single-photon-emission computed tomography (SPECT).

The height of this meditative trance, as they described in a 2001 paper, was associated with both a large drop in activity in a portion of the parietal lobe, which encompasses the upper back of the brain, and an increase in activity in the right prefrontal cortex, which resides behind the forehead. Because the affected part of the parietal lobe normally aids with navigation and spatial orientation, the neuroscientists surmise that its abnormal silence during meditation underlies the perceived dissolution of physical boundaries and the feeling of being at one with the universe. The prefrontal cortex, on the other hand, is charged with attention and planning, among other cognitive duties, and its recruitment at the meditation peak may reflect the fact that such contemplation often requires that a person focus intensely on a thought or object.

Neuroscientist Richard J. Davidson of the University of Wisconsin–Madison and his colleagues documented something similar in 2002, when they used fMRI to scan the brains of several hundred meditating Buddhists from around the world. Functional MRI tracks the flow of oxygenated blood by virtue of its magnetic properties, which differ from those of oxygen-depleted blood. Because oxygenated blood preferentially flows to where it is in high demand, fMRI highlights the brain areas that are most active during—and thus presumably most engaged in—a particular task.

Davidson’s team also found that the Buddhists’ meditations coincided with activation in the left prefrontal cortex, again perhaps reflecting the ability of expert practitioners to focus despite distraction. The most experienced volunteers showed lower levels of activation than did those with less training, conceivably because practice makes the task easier. This theory jibes with reports from veterans of Buddhist meditation who claim to have reached a state of “effortless concentration,” Davidson says.

What is more, Newberg and d’Aquili obtained concordant results in 2003, when they imaged the brains of Franciscan nuns as they prayed. In this case, however, the pattern accompanied a different spiritual phenomenon: a sense of closeness and mingling with God, as was similarly described by Beauregard’s nuns. “The more we study and compare the neurological underpinnings of different religious practices, the better we will understand these experiences,” Newberg says. “We would like to [extend our work by] recruiting individuals who engage in Islamic and Jewish prayer as well as revisiting other Buddhist and Christian practices.”

Newberg and his colleagues discovered yet another activity pattern when they scanned the brains of five women while they were speaking in tongues—a spontaneous expression of religious fervor in which people babble in an incomprehensible language. The researchers announced in 2006 that the activity in their subjects’ frontal lobes—the entire front section of the brain—declined relative to that of five religious people who were simply singing gospel. Because the frontal lobes are broadly used for self-control, the research team concluded that the decrement in activity there enabled the loss of control necessary for such garrulous outbursts.

Spiritual Networking
Although release of frontal lobe control may be involved in the mystical experience, Beauregard believes such profound states also call on a wide range of other brain functions. To determine exactly what might underlie such phenomena, the Quebecois neuroscientist and his colleagues used fMRI to study the brains of 15 nuns during three different mental states. Two of the conditions—resting with closed eyes and recollecting an intense social experience—were control states against which they compared the third: reminiscence or revival of a vivid experience with God.

As each nun switched between these states on a technician’s cue, the MRI machine recorded cross sections of her brain every three seconds, capturing the whole brain roughly every two minutes. Once the neural activity was computed and recorded, the experimenters compared the activation patterns in the two control states with those in the religious state to elucidate the brain areas that became more energized during the mystical experience. (Although Beauregard had hoped the nuns would experience a mystical union while in the scanner, the best they could do, it turned out, was to conjure up an emotionally powerful memory of union with God. “God can’t be summoned at will,” explained Sister Diane, the prioress of the Carmelite convent in Montreal.)

The researchers found six regions that were invigorated only during the nuns’ recall of communion with God. The spiritual memory was accompanied by, for example, increased activity in the caudate nucleus, a small central brain region to which scientists have ascribed a role in learning, memory and, recently, falling in love; the neuroscientists surmise that its involvement may reflect the nuns’ reported feeling of unconditional love. Another hot spot was the insula, a prune-size chunk of tissue tucked within the brain’s outermost layers that monitors body sensations and governs social emotions. Neural sparks there could be related to the visceral pleasurable feelings associated with connections to the divine.

And augmented activity in the inferior parietal lobe, with its role in spatial awareness—paradoxically, the opposite of what Newberg and Davidson witnessed—might mirror the nuns’ feeling of being absorbed into something greater. Either too much or too little activity in this region could, in theory, result in such a phenomenon, some scientists surmise. The remainder of the highlighted regions, the researchers reported in the September 25, 2006, issue of Neuroscience Letters, includes the medial orbitofrontal cortex, which may weigh the pleasantness of an experience; the medial prefrontal cortex, which may help govern conscious awareness of an emotional state; and, finally, the middle of the temporal lobe.

The quantity and diversity of brain regions involved in the nuns’ religious experience point to the complexity of the phenomenon of spirituality. “There is no single God spot, localized uniquely in the temporal lobe of the human brain,” Beauregard concludes. “These states are mediated by a neural network that is well distributed throughout the brain.”

Brain scans alone cannot fully describe a mystical state, however. Because fMRI depends on blood flow, which takes place on the order of seconds, fMRI images do not capture real-time changes in the firing of neurons, which occur within milliseconds. That is why Beauregard turned to a faster technique called quantitative electroencephalography (EEG), which measures the voltage from the summed responses of millions of neurons and can track its fluctuation in real time. His team outfitted the nuns with red bathing caps studded with electrodes that picked up electric currents from neurons. These currents merge and appear as brain waves of various frequencies, which changed as the nuns again recalled an intense experience with another person and a deep connection with God.

Beauregard and his colleagues found that the most prevalent brain waves were slow alpha waves, the kind that accompany relaxed, drowsy wakefulness, consistent with the nuns’ relaxed state. In work that has not yet been published, the scientists also spotted even lower-frequency waves in the prefrontal and parietal cortices and the temporal lobe that are associated with meditation and trance. “We see delta waves and theta waves in the same brain regions as the fMRI,” Beauregard says.

Fool’s Errand?
The brain mediates every human experience from breathing to contemplating the existence of God. And whereas activity in neural networks is what gives rise to these experiences, neuroimaging cannot yet pinpoint such activity at the level of individual neurons. Instead it provides far cruder anatomical information, highlighting the broad swaths of brain tissue that appear to be unusually dynamic or dormant. But using such vague structural clues to explain human feelings and behaviors may be a fool’s errand. “You list a bunch of places in the brain as if naming something lets you understand it,” opines neuropsychologist Seth Horowitz of Brown University. Vincent Paquette, who collaborated with Beauregard on his experiments, goes further, likening neuroimaging to phrenology, the practice in which Victorian-era scientists tried—and ultimately failed—to intuit clues about brain function and character traits from irregularities in the shape of the skull.

Spiritual neuroscience studies also face the profound challenge of language. No two mystics describe their experiences in the same way, and it is difficult to distinguish among the various types of mystical experiences, be they spiritual or traditionally religious. To add to the ambiguity, such feelings could also encompass awe of the universe or of nature. “If you are an atheist and you live a certain kind of experience, you will relate it to the magnificence of the universe. If you are a Christian, you will associate it with God. Who knows? Perhaps they are the same,” Beauregard muses.

Rather than attempting to define religious experience to understand it, some say we should instead boil it down to its essential components. “When we talk about phenomena like a mystical experience, we need to be a lot more specific about what we are referring to as far as changes in attention, memory and perception,” Davidson says. “Our only hope is to specify what is going on in each of those subsystems,” as has been done in studies of cognition and emotion.

Other research problems abound. None of the techniques, for example, can precisely delineate specific brain regions. And it is virtually impossible to find a perfect so-called reference task for the nuns to perform against which to compare the religious experience they are trying to capture. After all, what human experience is just one detail different from the awe and love felt in the presence of God?

Making Peace
For the nuns, serenity does not come from a sense of God in their brains but from an awareness of God with them in the world. It is that peace and calm, that sense of union with all things, that Beauregard wants to capture—and perhaps even replicate. “If you know how to electrically or neurochemically change functions in the brain,” he says, “then you [might] in principle be able to help normal people, not mystics, achieve spiritual states using a device that stimulates the brain electromagnetically or using lights and sounds.”

Inducing truly mystical experiences could have a variety of positive effects. Recent findings suggest, for example, that meditation can improve people’s ability to pay attention. Davidson and his colleagues asked 17 people who had received three months of intensive training in meditation and 23 meditation novices to perform an attention task in which they had to successively pick out two numbers embedded in a series of letters. The novices did what most people do, the investigators announced in June: they missed the second number because they were still focusing on the first—a phenomenon called attentional blink. In contrast, all the trained meditators consistently picked out both numbers, indicating that practicing meditation can improve focus.

Meditation may even delay certain signs of aging in the brain, according to preliminary work by neuroscientist Sara Lazar of Harvard University and her colleagues. A 2005 paper in NeuroReport noted that 20 experienced meditators showed increased thickness in certain brain regions relative to 15 subjects who did not meditate. In particular, the prefrontal cortex and right anterior insula were between four and eight thousandths of an inch thicker in the meditators; the oldest of these subjects boasted the greatest increase in thickness, the reverse of the usual process of aging. Newberg is now investigating whether meditation can alleviate stress and sadness in cancer patients or expand the cognitive capacities of people with early memory loss.

Artificially replicating meditative trances or other spiritual states might be similarly beneficial to the mind, brain and body. Beauregard and others argue, for example, that such mystical mimicry might improve immune system function, stamp out depression or just provide a more positive outlook on life. The changes could be lasting and even transformative. “We could generate a healthy, optimal brain template,” Paquette says. “If someone has a bad brain, how can they get a good brain? It’s really [a potential way to] rewire our brain.” Religious faith also has inherent worldly rewards, of course. It brings contentment, and charitable works motivated by such faith bring others happiness.

To be sure, people may differ in their proclivity to spiritual awakening. After all, not everyone finds God with the God helmet. Thus, scientists may need to retrofit the technique to the patient. And it is possible that some people’s brains will simply resist succumbing to the divine.

Moreover, no matter what neural correlates scientists may find, the results cannot prove or disprove the existence of God. Although atheists might argue that finding spirituality in the brain implies that religion is nothing more than divine delusion, the nuns were thrilled by their brain scans for precisely the opposite reason: they seemed to provide confirmation of God’s interactions with them. After all, finding a cerebral source for spiritual experiences could serve equally well to identify the medium through which God reaches out to humanity. Thus, the nuns’ forays into the tubular brain scanner did not undermine their faith. On the contrary, the science gave them an even greater reason to believe.