190 Clinical News Report 30 SEP 2014


CNO Report 190

Release Date 30 SEP 2014

 

Draft Report Compiled by

Ralph Turchiano

http://www.clinicalnews.org

 

 

In This Issue:

  1. Schizophrenia not a single disease but multiple genetically distinct disorders
  2. Researchers debunk myth about Parkinson’s disease
  3. Healthy humans make nice homes for viruses
  4. Gut bacteria, artificial sweeteners and glucose intolerance
  5. Single dose of antidepressant changes the brain
  6. Dry roasting could help trigger peanut allergy
  7. Mothers of children with autism less likely to have taken iron supplements
  8. Common diabetes drug associated with risk of low levels of thyroid hormone
  9. Research study analyzes the best exercise for obese youths
  10. Critically ill ICU patients lose almost all of their gut microbes and the ones left aren’t good
  11. Turmeric compound boosts regeneration of brain stem cells

 

 

Schizophrenia not a single disease but multiple genetically distinct disorders

September 15, 2014

New research shows that schizophrenia isn’t a single disease but a group of eight genetically distinct disorders, each with its own set of symptoms. The finding could be a first step toward improved diagnosis and treatment for the debilitating psychiatric illness.

The research at Washington University School of Medicine in St. Louis is reported online Sept. 15 in The American Journal of Psychiatry.

About 80 percent of the risk for schizophrenia is known to be inherited, but scientists have struggled to identify specific genes for the condition. Now, in a novel approach analyzing genetic influences on more than 4,000 people with schizophrenia, the research team has identified distinct gene clusters that contribute to eight different classes of schizophrenia.

“Genes don’t operate by themselves,” said C. Robert Cloninger, MD, PhD, one of the study’s senior investigators. “They function in concert much like an orchestra, and to understand how they’re working, you have to know not just who the members of the orchestra are but how they interact.”

Cloninger, the Wallace Renard Professor of Psychiatry and Genetics, and his colleagues matched precise DNA variations in people with and without schizophrenia to symptoms in individual patients. In all, the researchers analyzed nearly 700,000 sites within the genome where a single unit of DNA is changed, often referred to as a single nucleotide polymorphism (SNP). They looked at SNPs in 4,200 people with schizophrenia and 3,800 healthy controls, learning how individual genetic variations interacted with each other to produce the illness.
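The case-control comparison described above can be illustrated with a toy sketch: a chi-square test of whether a single SNP's minor allele is enriched in cases versus controls. This is an illustrative simplification only; the study's actual method grouped interacting SNPs into clusters rather than testing sites one at a time, and the counts below are invented.

```python
# Toy single-SNP case-control association test (illustrative only;
# the study analyzed interactions among ~700,000 SNPs, not single sites).
def snp_chi_square(case_alleles, control_alleles):
    """2x2 chi-square statistic for minor-allele counts in cases vs. controls.

    case_alleles / control_alleles: (minor_count, major_count) tuples.
    """
    a, b = case_alleles       # minor/major allele counts in cases
    c, d = control_alleles    # minor/major allele counts in controls
    n = a + b + c + d
    # Chi-square for a 2x2 contingency table (no continuity correction)
    num = n * (a * d - b * c) ** 2
    den = (a + b) * (c + d) * (a + c) * (b + d)
    return num / den

# Invented example: minor allele enriched in cases
stat = snp_chi_square((1200, 7200), (800, 6800))
# Compare against 3.84, the chi-square critical value (1 df, p = 0.05)
print(stat > 3.84)
```

A single test like this is exactly the approach the study moved beyond: as the investigators note below, individually weak SNP associations only become informative when variations are analyzed in interacting groups.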

Photo: Robert Boston. Igor Zwir, PhD, one of the senior investigators, helped match precise DNA variations in people with and without schizophrenia to symptoms in individual patients.

In some patients with hallucinations or delusions, for example, the researchers matched distinct genetic features to patients’ symptoms, demonstrating that specific genetic variations interacted to create a 95 percent certainty of schizophrenia. In another group, they found that disorganized speech and behavior were specifically associated with a set of DNA variations that carried a 100 percent risk of schizophrenia.

 

“What we’ve done here, after a decade of frustration in the field of psychiatric genetics, is identify the way genes interact with each other, how the ‘orchestra’ is either harmonious and leads to health, or disorganized in ways that lead to distinct classes of schizophrenia,” Cloninger said.

 

Although individual genes have only weak and inconsistent associations with schizophrenia, groups of interacting gene clusters create an extremely high and consistent risk of illness, on the order of 70 to 100 percent. That makes it almost impossible for people with those genetic variations to avoid the condition. In all, the researchers identified 42 clusters of genetic variations that dramatically increased the risk of schizophrenia.

 

“In the past, scientists had been looking for associations between individual genes and schizophrenia,” explained Dragan Svrakic, PhD, MD, a co-investigator and a professor of psychiatry at Washington University. “When one study would identify an association, no one else could replicate it. What was missing was the idea that these genes don’t act independently. They work in concert to disrupt the brain’s structure and function, and that results in the illness.”

Svrakic said it was only when the research team was able to organize the genetic variations and the patients’ symptoms into groups that they could see that particular clusters of DNA variations acted together to cause specific types of symptoms.

Then they divided patients according to the type and severity of their symptoms, such as different types of hallucinations or delusions, and other symptoms, such as lack of initiative, problems organizing thoughts or a lack of connection between emotions and thoughts. The results indicated that those symptom profiles describe eight qualitatively distinct disorders based on underlying genetic conditions.

 

The investigators also replicated their findings in two additional DNA databases of people with schizophrenia, an indicator that identifying the gene variations that are working together is a valid avenue to explore for improving diagnosis and treatment.

By identifying groups of genetic variations and matching them to symptoms in individual patients, it soon may be possible to target treatments to specific pathways that cause problems, according to co-investigator Igor Zwir, PhD, research associate in psychiatry at Washington University and associate professor in the Department of Computer Science and Artificial Intelligence at the University of Granada, Spain.

And Cloninger added it may be possible to use the same approach to better understand how genes work together to cause other common but complex disorders.

“People have been looking at genes to get a better handle on heart disease, hypertension and diabetes, and it’s been a real disappointment,” he said. “Most of the variability in the severity of disease has not been explained, but we were able to find that different sets of genetic variations were leading to distinct clinical syndromes. So I think this really could change the way people approach understanding the causes of complex diseases.”

Researchers debunk myth about Parkinson’s disease

Using advanced computer models, neuroscience researchers at the University of Copenhagen have gained new knowledge about the complex processes that cause Parkinson’s disease. The findings have recently been published in the prestigious Journal of Neuroscience.

The defining symptoms of Parkinson’s disease are slow movements, muscular stiffness and shaking. There is currently no cure for the condition, so it is essential to conduct innovative research with the potential to shed some light on this terrible disruption to the central nervous system.

Dopamine is an important neurotransmitter which affects physical and psychological functions such as motor control, learning and memory. Levels of this substance are regulated by special dopamine cells. When the level of dopamine drops, nerve cells that constitute part of the brain’s ‘stop signal’ are activated.

“This stop signal is rather like the safety lever on a motorised lawn mower: if you take your hand off the lever, the mower’s motor stops. Similarly, dopamine must always be present in the system to block the stop signal. Parkinson’s disease arises because for some reason the dopamine cells in the brain are lost, and it is known that the stop signal is being over-activated somehow or other. Many researchers have therefore considered it obvious that long-term lack of dopamine must be the cause of the distinctive symptoms that accompany the disease. However, we can now use advanced computer simulations to challenge the existing paradigm and put forward a different theory about what actually takes place in the brain when the dopamine cells gradually die,” explains Jakob Kisbye Dreyer, Postdoc at the Department of Neuroscience and Pharmacology, University of Copenhagen.

 

Healthy humans make nice homes for viruses

September 16, 2014

By Caroline Arbanas

 

The same viruses that make us sick can take up residence in and on the human body without provoking a sneeze, cough or other troublesome symptom, according to new research at Washington University School of Medicine in St. Louis.

On average, healthy individuals carry about five types of viruses on their bodies, the researchers report online in BMC Biology. The study is the first comprehensive analysis to describe the diversity of viruses in healthy people.

The research was conducted as part of the Human Microbiome Project, a major initiative funded by the National Institutes of Health (NIH) that largely has focused on cataloging the body’s bacterial ecosystems.

“Most everyone is familiar with the idea that a normal bacterial flora exists in the body,” said study co-author Gregory Storch, MD, a virologist and chief of the Division of Pediatric Infectious Diseases. “Lots of people have asked whether there is a viral counterpart, and we haven’t had a clear answer. But now we know there is a normal viral flora, and it’s rich and complex.”

In 102 healthy young adults ages 18 to 40, the researchers sampled up to five body habitats: nose, skin, mouth, stool and vagina. The study’s subjects were nearly evenly split by gender.

At least one virus was detected in 92 percent of the people sampled, and some individuals harbored 10 to 15 viruses.

“We were impressed by the number of viruses we found,” said lead author Kristine M. Wylie, PhD, an instructor of pediatrics. “We only sampled up to five body sites in each person and would expect to see many more viruses if we had sampled the entire body.”

Scientists led by George Weinstock, PhD, at Washington University’s Genome Institute, sequenced the DNA of the viruses recovered from the body, finding that each individual had a distinct viral fingerprint. (Weinstock is now at The Jackson Laboratory in Connecticut.) About half of people were sampled at two or three points in time, and the researchers noted that some of the viruses established stable, low-level infections.

The researchers don’t know yet whether the viruses have a positive or negative effect on overall health but speculate that in some cases, they may keep the immune system primed to respond to dangerous pathogens while in others, lingering viruses increase the risk of disease.

Study volunteers were screened carefully to confirm they were healthy and did not have symptoms of acute infection. They also could not have been diagnosed in the past two years with human papillomavirus infection (HPV), which can cause cervical and throat cancer, or have an active genital herpes infection.

Analyzing the samples, the scientists found seven families of viruses, including strains of herpes viruses that are not sexually transmitted. For example, herpesvirus 6 or herpesvirus 7 was found in 98 percent of individuals sampled from the mouth. Certain strains of papillomaviruses were found in about 75 percent of skin samples and 50 percent of samples from the nose. Novel strains of the virus were found in both sites.

Not surprisingly, the vagina was dominated by papillomaviruses, with 38 percent of female subjects carrying such strains. Some of the women harbored certain high-risk strains that increase the risk of cervical cancer. These strains were more common in women with communities of vaginal bacteria that had lower levels of Lactobacillus and an increase in bacteria such as Gardnerella, which is associated with bacterial vaginosis.

Adenoviruses, the viruses that cause the common cold and pneumonia, also were common at many sites in the body.

It’s possible that some of the viruses the researchers uncovered were latent infections acquired years ago. But many viruses were found in body secretions where the presence of a virus is an indicator of an active infection. Dormant or latent viruses hide in cells, not in body fluids such as saliva or nasal secretions, Wylie explained.

A further direction for researchers is to distinguish between active viral infections that aren’t causing symptoms and those that are making a person sick.

“It’s very important to know what viruses are present in a person without causing a problem and what viruses could be responsible for serious illnesses that need medical attention,” said Storch, the Ruth L. Siteman Professor of Pediatrics. “While more research remains, we now have a much clearer picture of the communities of viruses that naturally exist in healthy people.”

 

Gut bacteria, artificial sweeteners and glucose intolerance

A new study reveals that certain gut bacteria may induce metabolic changes following exposure to artificial sweeteners

Artificial sweeteners, promoted as aids to weight loss and diabetes prevention, could actually hasten the development of glucose intolerance and metabolic disease; and they do it in a surprising way: by changing the composition and function of the gut microbiota – the substantial population of bacteria residing in our intestines. These findings, the results of experiments in mice and humans, were published today in Nature. Among other things, says Dr. Eran Elinav of the Weizmann Institute’s Immunology Department, who led this research together with Prof. Eran Segal of the Computer Science and Applied Mathematics Department, the widespread use of artificial sweeteners in drinks and food may be contributing to the obesity and diabetes epidemic that is sweeping much of the world.

For years researchers have been puzzling over the fact that non-caloric artificial sweeteners do not seem to assist in weight loss, and some studies have suggested they may even have an opposite effect. Graduate student Jotham Suez in Elinav’s lab, who led the study, collaborated with graduate students Tal Korem and David Zeevi in Segal’s lab and Gili Zilberman-Shapira in Elinav’s lab in discovering that artificial sweeteners, even though they do not contain sugar, nonetheless have a direct effect on the body’s ability to utilize glucose. Glucose intolerance – generally thought to occur when the body cannot cope with large amounts of sugar in the diet – is the first step on the path to metabolic syndrome and adult-onset diabetes.

The scientists gave mice water laced with the three most commonly used artificial sweeteners, in amounts equivalent to those permitted by the FDA. These mice developed glucose intolerance, as compared to mice that drank water, or even sugar water. Repeating the experiment with different types of mice and different doses of the sweeteners produced the same results – these substances were somehow inducing glucose intolerance.

Next, the researchers investigated a hypothesis that the gut microbiota are involved in this phenomenon. They thought the bacteria might do this by reacting to new substances like artificial sweeteners, which the body itself may not recognize as “food.” Indeed, artificial sweeteners are not absorbed in the gastrointestinal tract, but in passing through they encounter the trillions of bacteria in the gut microbiota.

The researchers treated mice with antibiotics to eradicate many of their gut bacteria; this resulted in a full reversal of the artificial sweeteners’ effects on glucose metabolism. Next, they transferred the microbiota from mice that consumed artificial sweeteners to ‘germ-free’ mice – resulting in a complete transmission of the glucose intolerance into the recipient mice. This, in itself, was conclusive proof that changes to the gut bacteria are directly responsible for the harmful effects to their host’s metabolism. The group even found that incubating the microbiota outside the body, together with artificial sweeteners, was sufficient to induce glucose intolerance in the sterile mice. A detailed characterization of the microbiota in these mice revealed profound changes to their bacterial populations, including new microbial functions that are known to confer a propensity to obesity, diabetes and complications of these problems in both mice and humans.

Does the human microbiome function in the same way? Elinav and Segal had a means to test this as well. As a first step, they looked at data collected from their Personalized Nutrition Project, the largest human trial to date to look at the connection between nutrition and microbiota. Here, they uncovered a significant association between self-reported consumption of artificial sweeteners, personal configurations of gut bacteria and the propensity for glucose intolerance. They next conducted a controlled experiment, asking a group of volunteers who did not generally eat or drink artificially sweetened foods to consume them for a week and then undergo tests of their glucose levels as well as their gut microbiota compositions.

The findings showed that many – but not all – of the volunteers had begun to develop glucose intolerance after just one week of artificial sweetener consumption. The composition of their gut microbiota explained the difference: The researchers discovered two different populations of human gut bacteria – one that induced glucose intolerance when exposed to the sweeteners, the second that had no effect either way. Elinav believes that certain bacteria in the guts of those who developed glucose intolerance reacted to the chemical sweeteners by secreting substances that then provoked an inflammatory response similar to sugar overdose, promoting changes in the body’s ability to utilize sugar.

Segal: “The results of our experiments highlight the importance of personalized medicine and nutrition to our overall health. We believe that an integrated analysis of individualized ‘big data’ from our genome, microbiome and dietary habits could transform our ability to understand how foods and nutritional supplements affect a person’s health and risk of disease.”

Elinav: “Our relationship with our own individual mix of gut bacteria is a huge factor in determining how the food we eat affects us. Especially intriguing is the link between the use of artificial sweeteners – through the bacteria in our guts – and a tendency to develop the very disorders they were designed to prevent; this calls for reassessment of today’s massive, unsupervised consumption of these substances.”

 

Single dose of antidepressant changes the brain

A single dose of antidepressant is enough to produce dramatic changes in the functional architecture of the human brain. Brain scans taken of people before and after an acute dose of a commonly prescribed SSRI (selective serotonin reuptake inhibitor) reveal changes in connectivity within three hours, say researchers who report their observations in the Cell Press journal Current Biology on September 18.

“We were not expecting the SSRI to have such a prominent effect on such a short timescale or for the resulting signal to encompass the entire brain,” says Julia Sacher of the Max Planck Institute for Human Cognitive and Brain Sciences.

While SSRIs are among the most widely studied and prescribed form of antidepressants worldwide, it’s still not entirely clear how they work. The drugs are believed to change brain connectivity in important ways, but those effects had generally been thought to take place over a period of weeks, not hours.

The new findings show that changes begin to take place right away. Sacher says what they are seeing in medication-free individuals who had never taken antidepressants before may be an early marker of brain reorganization.

Study participants let their minds wander for about 15 minutes in a brain scanner that measures the oxygenation of blood flow in the brain. The researchers characterized three-dimensional images of each individual’s brain by measuring the number of connections between small blocks known as voxels (comparable to the pixels in an image) and the change in those connections with a single dose of escitalopram (trade name Lexapro).
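The connectivity measure described above can be illustrated with a minimal sketch: correlate pairs of voxel time series and count the pairs whose Pearson correlation exceeds a threshold. This is a toy stand-in for the study's whole-brain network analysis; the signals, threshold, and voxel count below are invented for illustration.

```python
import math
import random

def pearson(x, y):
    """Pearson correlation between two equal-length time series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def connection_count(voxels, threshold=0.25):
    """Count voxel pairs whose time-series correlation exceeds the
    threshold – a crude stand-in for 'number of connections per voxel'."""
    count = 0
    for i in range(len(voxels)):
        for j in range(i + 1, len(voxels)):
            if pearson(voxels[i], voxels[j]) > threshold:
                count += 1
    return count

# Synthetic example: voxels driven by a shared slow signal plus noise
random.seed(0)
shared = [math.sin(t / 5.0) for t in range(200)]
voxels = [[s + random.gauss(0, 0.5) for s in shared] for _ in range(10)]
print(connection_count(voxels))
```

In this toy picture, a drug that reduced intrinsic connectivity would show up as a lower connection count when the before and after scans are compared.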

Their whole-brain network analysis shows that one dose of the SSRI reduces the level of intrinsic connectivity in most parts of the brain. However, Sacher and her colleagues observed an increase in connectivity within two brain regions, specifically the cerebellum and thalamus.

The researchers say the new findings represent an essential first step toward clinical studies in patients suffering from depression. They also plan to compare the functional connectivity signature of brains in recovery and those of patients who fail to respond after weeks of SSRI treatment.

Understanding the differences between the brains of individuals who respond to SSRIs and those who don’t “could help to better predict who will benefit from this kind of antidepressant versus some other form of therapy,” Sacher says. “The hope that we have is that ultimately our work will help to guide better treatment decisions and tailor individualized therapy for patients suffering from depression.”

###

Current Biology, Schaefer et al.: “Serotonergic modulation of intrinsic functional connectivity.”

 

Dry roasting could help trigger peanut allergy

Dry roasted peanuts are more likely to trigger an allergy to peanuts than raw peanuts, suggests an Oxford University study involving mice.

The researchers say that specific chemical changes caused by the high temperatures of the dry roasting process are recognised by the body’s immune system, ‘priming’ the body to set off an allergic immune response the next time it sees any peanuts.

The results might explain the difference in the number of people with peanut allergies in the Western world compared to populations in East Asia, the researchers say. In the West, where roasted and dry-roasted peanuts are common, there are far more people with peanut allergies than in the East, where peanuts are more often eaten raw, boiled or fried. Numbers of people with other food allergies show no such difference.

The study is published in the Journal of Allergy and Clinical Immunology and was funded by the National Institute for Health Research (NIHR) Oxford Biomedical Research Centre, the US National Institutes of Health and the Swiss National Science Foundation.

The researchers purified proteins from dry roasted peanuts and from raw peanuts. They introduced the peanut proteins to mice in three different ways – injected under the skin, applied to broken skin, and introduced directly into the stomach. The immune responses of the mice to further peanut extracts given later were measured.

The mice that had been initially exposed to dry roasted peanuts generated greatly increased immune responses to peanuts, compared to mice that had been exposed to raw peanut proteins. The types of immune responses seen are characteristic of allergic reactions.

Professor Quentin Sattentau, who led the research at the Dunn School of Pathology at the University of Oxford, says: ‘This is the first time, to our knowledge, that a potential trigger for peanut allergy has been directly shown.’

Previous studies have shown that roasting modifies peanut proteins leading to altered recognition by the immune system, but they did not show that roasted peanuts can trigger an allergic immune response.

First author Dr Amin Moghaddam of Oxford University says: ‘Our results in mice suggest that dry roasted peanuts may be more likely to lead to peanut allergy than raw peanuts: the dry roasting causes a chemical modification of peanut proteins that appears to activate the immune system against future exposure to peanuts.

‘Allergies in people are driven by multiple factors including family genetic background and exposure to environmental triggers. In the case of peanut allergy, we think we may have discovered an environmental trigger in the way that peanuts are processed by high-temperature roasting.’

Professor Sattentau says: ‘We know that children in families with other allergies are more likely to develop peanut allergy. However our research is at an early stage and we think that it would be premature to avoid roasted peanuts and their products until further work has been carried out to confirm this result.’

He adds: ‘We think we have identified the chemical modifications involved in triggering an allergic response to peanuts, and are currently exploring methods that are food industry-friendly to eliminate these groups.’

###

Notes

  • Dry roasting involves temperatures of 160-170°C and above. Above 130°C, the Maillard chemical reaction efficiently leads to modification of specific chemical groups in proteins. The products of the Maillard reaction in dry-roasted peanuts can activate a strong allergic immune response, the researchers suggest.
  • Allergy UK says allergy to peanut and tree nuts is the most common food allergy in adults and children, with peanut allergy estimated to affect 1 in 50 young infants. The majority of allergic reactions to peanut and tree nuts are mild. However, some allergic reactions to peanut or tree nuts can be severe, causing difficulty in breathing due to asthma-like symptoms or throat swelling, or a drop in blood pressure. This is known as anaphylaxis, and allergy to peanut or tree nuts is one of the most common triggers.
  • http://www.allergyuk.org/peanut-and-tree-nut-allergy/peanut-and-tree-nut-allergy
  • There is currently no evidence-based advice for prevention of food allergies. It used to be believed that avoiding eating peanuts during pregnancy and when breastfeeding could help reduce the risk but this theory has now been questioned.
  • http://www.nhs.uk/Conditions/food-allergy/Pages/living-with.aspx
  • Children are more likely to develop a peanut allergy if they already have a known allergy (such as eczema or a diagnosed food allergy), or there’s a history of allergy in their immediate family (such as asthma, eczema or hay fever).

NHS Choices offers advice on food allergies in children here: http://www.nhs.uk/conditions/pregnancy-and-baby/pages/food-allergies-in-children.aspx#close

The paper ‘Dry roasting enhances peanut allergic sensitization across mucosal and cutaneous routes in mice’ by Amin Moghaddam and colleagues is to be published in the Journal of Allergy and Clinical Immunology with an embargo of 00:01 UK time on Monday 22 September 2014 / 19:01 US Eastern Time on Sunday 21 September 2014.

The study was funded by the National Institute for Health Research (NIHR) Oxford Biomedical Research Centre, the US National Institutes of Health and the Swiss National Science Foundation.

The National Institute for Health Research (NIHR) is funded by the Department of Health to improve the health and wealth of the nation through research. Since its establishment in April 2006, the NIHR has transformed research in the NHS. It has increased the volume of applied health research for the benefit of patients and the public, driven faster translation of basic science discoveries into tangible benefits for patients and the economy, and developed and supported the people who conduct and contribute to applied health research. The NIHR plays a key role in the Government’s strategy for economic growth, attracting investment by the life-sciences industries through its world-class infrastructure for health research. Together, the NIHR people, programmes, centres of excellence and systems represent the most integrated health research system in the world. For further information, visit the NIHR website.

Oxford University’s Medical Sciences Division is one of the largest biomedical research centres in Europe, with over 2,500 people involved in research and more than 2,800 students. The University is rated the best in the world for medicine, and it is home to the UK’s top-ranked medical school.

From the genetic and molecular basis of disease to the latest advances in neuroscience, Oxford is at the forefront of medical research. It has one of the largest clinical trial portfolios in the UK and great expertise in taking discoveries from the lab into the clinic. Partnerships with the local NHS Trusts enable patients to benefit from close links between medical research and healthcare delivery.

A great strength of Oxford medicine is its long-standing network of clinical research units in Asia and Africa, enabling world-leading research on the most pressing global health challenges such as malaria, TB, HIV/AIDS and flu. Oxford is also renowned for its large-scale studies which examine the role of factors such as smoking, alcohol and diet on cancer, heart disease and other conditions.

Mothers of children with autism less likely to have taken iron supplements

Five-fold greater risk found in children whose mothers had low supplemental iron and other risk factors for delivering a child with ASD

(SACRAMENTO, Calif.) —Mothers of children with autism are significantly less likely to report taking iron supplements before and during their pregnancies than the mothers of children who are developing normally, a study by researchers with the UC Davis MIND Institute has found.

Low iron intake was associated with a five-fold greater risk of autism in the child if the mother was 35 or older at the time of the child’s birth or if she suffered from metabolic conditions such as obesity, hypertension or diabetes.

The research is the first to examine the relationship between maternal iron intake and having a child with autism spectrum disorder, the authors said. The study, “Maternal intake of supplemental iron and risk for autism spectrum disorders,” is published online today in the American Journal of Epidemiology.

“The association between lower maternal iron intake and increased ASD risk was strongest during breastfeeding, after adjustment for folic acid intake,” said Rebecca J. Schmidt, assistant professor in the Department of Public Health Sciences and a researcher affiliated with the MIND Institute.

In 2011, the authors of the current study were the first to report associations between supplemental folic acid and reduced risk for autism spectrum disorder, a finding later replicated in larger-scale investigations.

“Further, the risk associated with low maternal iron intake was much greater when the mother was also older and had metabolic conditions during her pregnancy.”

The study was conducted in mother-child pairs enrolled in the Northern California-based Childhood Autism Risks from Genetics and the Environment (CHARGE) Study between 2002 and 2009. The participants included mothers of children with autism and 346 mothers of children with typical development.

The researchers examined maternal iron intake among the study’s participants, including vitamins, other nutritional supplements, and breakfast cereals during the three months prior to through the end of the women’s pregnancies and breastfeeding. The mothers’ daily iron intake was examined, including the frequency, dosages and the brands of supplements that they consumed.
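The kind of intake calculation described above can be sketched as a simple weighted sum over supplement sources. The function and the example values below are hypothetical, invented purely to illustrate the idea of combining frequency and dosage into an average daily figure; they are not from the study.

```python
# Hypothetical sketch of averaging daily iron intake from supplements.
# The sources and milligram values below are invented for illustration.
def daily_iron_mg(supplements):
    """supplements: list of (doses_per_week, mg_iron_per_dose) tuples.

    Returns average daily iron intake in milligrams."""
    return sum(per_week * mg for per_week, mg in supplements) / 7.0

intake = daily_iron_mg([
    (7, 27.0),   # hypothetical daily prenatal vitamin, 27 mg iron/tablet
    (3, 18.0),   # hypothetical fortified cereal, ~3 servings a week
])
print(round(intake, 1))
```

In practice the researchers also tracked brands and timing across pregnancy and breastfeeding, so the real calculation is considerably more detailed than this sketch.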

“Iron deficiency, and its resultant anemia, is the most common nutrient deficiency, especially during pregnancy, affecting 40 to 50 percent of women and their infants,” Schmidt said. “Iron is crucial to early brain development, contributing to neurotransmitter production, myelination and immune function. All three of these pathways have been associated with autism.”

“Iron deficiency is pretty common, and even more common among women with metabolic conditions,” Schmidt said. “However we want to be cautious and wait until this study has been replicated.

“In the meantime the takeaway message for women is do what your doctor recommends. Take vitamins throughout pregnancy, and take the recommended daily dosage. If there are side effects, talk to your doctor about how to address them.”

Mothers of children with autism less likely to have taken iron supplements

Five-fold greater risk found in children whose mothers had low supplemental iron and other risk factors for delivering a child with ASD

(SACRAMENTO, Calif.) —Mothers of children with autism are significantly less likely to report taking iron supplements before and during their pregnancies than the mothers of children who are developing normally, a study by researchers with the UC Davis MIND Institute has found.

Low iron intake was associated with a five-fold greater risk of autism in the child if the mother was 35 or older at the time of the child’s birth, or if she suffered from metabolic conditions such as obesity, hypertension or diabetes.
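The “five-fold greater risk” above is a relative measure of association. As a minimal sketch of the arithmetic behind such a figure, with wholly hypothetical counts (the study itself reports adjusted odds ratios, not these numbers):

```python
def relative_risk(exposed_cases, exposed_total, unexposed_cases, unexposed_total):
    """Risk ratio: incidence in the exposed group divided by
    incidence in the unexposed group."""
    risk_exposed = exposed_cases / exposed_total
    risk_unexposed = unexposed_cases / unexposed_total
    return risk_exposed / risk_unexposed

# Hypothetical counts for illustration only: mothers with low iron intake plus
# advanced age/metabolic conditions vs. otherwise similar mothers with
# adequate iron intake.
rr = relative_risk(exposed_cases=25, exposed_total=100,
                   unexposed_cases=5, unexposed_total=100)
print(round(rr, 1))  # 5.0 -> a "five-fold greater risk"
```

A risk ratio of 5.0 means the outcome occurred five times as often in the exposed group; it says nothing by itself about absolute risk, which in autism remains low.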

The research is the first to examine the relationship between maternal iron intake and having a child with autism spectrum disorder, the authors said. The study, “Maternal intake of supplemental iron and risk for autism spectrum disorders,” is published online today in the American Journal of Epidemiology.

“The association between lower maternal iron intake and increased ASD risk was strongest during breastfeeding, after adjustment for folic acid intake,” said Rebecca J. Schmidt, assistant professor in the Department of Public Health Sciences and a researcher affiliated with the MIND Institute.

In 2011, the authors of the current study were the first to report an association between supplemental folic acid and reduced risk for autism spectrum disorder, a finding later replicated in larger-scale investigations.

“Further, the risk associated with low maternal iron intake was much greater when the mother was also older and had metabolic conditions during her pregnancy.”

The study was conducted in mother-child pairs enrolled in the Northern California-based Childhood Autism Risks from Genetics and the Environment (CHARGE) Study between 2002 and 2009. The participants included mothers of children with autism and 346 mothers of children with typical development.

The researchers examined the participants’ maternal iron intake, including from vitamins, other nutritional supplements and breakfast cereals, from the three months before pregnancy through the end of pregnancy and breastfeeding. The mothers’ daily iron intake was assessed, including the frequency, dosages and brands of the supplements they consumed.

“Iron deficiency, and its resultant anemia, is the most common nutrient deficiency, especially during pregnancy, affecting 40 to 50 percent of women and their infants,” Schmidt said. “Iron is crucial to early brain development, contributing to neurotransmitter production, myelination and immune function. All three of these pathways have been associated with autism.”

“Iron deficiency is pretty common, and even more common among women with metabolic conditions,” Schmidt said. “However, we want to be cautious and wait until this study has been replicated.

“In the meantime, the takeaway message for women is to do what your doctor recommends. Take vitamins throughout pregnancy, and take the recommended daily dosage. If there are side effects, talk to your doctor about how to address them.”

Common diabetes drug associated with risk of low levels of thyroid hormone

Metformin, a commonly used drug for treating type 2 diabetes, is linked to an increased risk of low thyroid-stimulating hormone (TSH) levels in patients with underactive thyroids (hypothyroidism), according to a study in CMAJ (Canadian Medical Association Journal). Low levels of TSH can cause harm, such as cardiovascular conditions and fractures.

Metformin is used to lower blood glucose levels by reducing glucose production in the liver. However, some previous studies have raised concerns that metformin may lower thyroid-stimulating hormone levels.

Researchers looked at data on 74,300 patients who received metformin or sulfonylurea, another common diabetes drug, over a 25-year study period. Of these, 5,689 had treated hypothyroidism and 59,937 had normal thyroid function. In the group with hypothyroidism, there were 495 incidences of low thyroid-stimulating hormone (119.7 per 1000 person-years), compared with 322 in the normal group (4.5 per 1000 person-years).
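The incidence figures above are rates per 1000 person-years of follow-up. As a minimal sketch of that arithmetic (the person-year denominators here are back-calculated from the reported rates, not taken from the paper):

```python
def incidence_rate(events, person_years, per=1000):
    """Number of events per `per` person-years of follow-up."""
    return events / person_years * per

# Denominators back-calculated from the reported rates, for illustration:
# 495 events at 119.7/1000 person-years -> ~4,135 py (hypothyroid group)
# 322 events at   4.5/1000 person-years -> ~71,556 py (euthyroid group)
print(round(incidence_rate(495, 4135), 1))   # 119.7
print(round(incidence_rate(322, 71556), 1))  # 4.5
```

Expressing counts as rates per person-year is what makes the two groups comparable despite their very different sizes and follow-up times.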

In patients with treated hypothyroidism, metformin monotherapy was associated with a 55% increased risk of low TSH levels compared with treatment with sulfonylurea. Metformin therapy did not appear to affect people with normal thyroid function.

“The results of this longitudinal study confirmed that the use of metformin was associated with an increased risk of low TSH levels in patients with treated hypothyroidism,” says Dr. Laurent Azoulay, Lady Davis Institute, Jewish General Hospital and the Department of Oncology, McGill University, Montréal, Quebec.

“Given the relatively high incidence of low TSH levels in patients taking metformin, it is imperative that future studies assess the clinical consequences of this effect.”

Research study analyzes the best exercise for obese youths

What exercise program can best fight the “epidemic” of teen obesity? According to a study published in the Journal of the American Medical Association (JAMA) Pediatrics, the answer is combining aerobic exercise with resistance training.

The Healthy Eating Aerobic and Resistance Training in Youth (HEARTY) study, led by researchers at the University of Calgary and University of Ottawa, involved 304 overweight teens in the Ottawa/Gatineau area between the ages of 14 and 18. All were given the same four weeks of diet counseling to promote healthy eating and weight loss before being randomly placed into four groups. The first group performed resistance training involving weight machines and some free weights; the second performed only aerobic exercise on treadmills, elliptical machines and stationary bikes; the third underwent combined aerobic and resistance training; and the last group did no exercise training.

“Obesity is an epidemic among youth,” says Dr. Ron Sigal of the University of Calgary’s Institute for Public Health and Libin Cardiovascular Institute of Alberta. “Adolescents who are overweight are typically advised to exercise more, but there is limited evidence on what type of exercise is best in order to lose fat.”

In the overall study population, each type of exercise reduced body fat significantly and similarly, and all three exercise programs produced significantly more fat loss than diet alone. Among youths who completed at least 70 per cent of the study’s exercise sessions, the percentage of body fat decreased “significantly more in those who did combined aerobic and resistance exercise than in those who only did aerobic exercise,” says co-principal researcher Dr. Glen Kenny of the University of Ottawa. “Remarkably, among participants who completed at least 70 per cent of the prescribed exercise sessions, waist circumference decreased close to seven centimeters in those randomized to combined aerobic plus resistance exercise, versus about four centimeters in those randomized to do just one type of exercise, with no change in those randomized to diet alone.”

Supervised by personal trainers, youths in the three exercise groups were asked to train four times per week for 22 weeks at community-based facilities. Changes in body fat were measured using Magnetic Resonance Imaging (MRI) machines. Because aerobic exercises such as cycling or jogging can be challenging for overweight people, resistance training is potentially attractive because excess body weight poses far less of a disadvantage, and gains in strength come much more quickly than gains in aerobic fitness.

Researchers hope that the study will contribute to a national debate about childhood and teenage obesity, potentially leading to a consistent, long-term strategy on how to best deal with the problem. Eighty per cent of overweight youth typically continue to be obese as adults, adversely affecting the quality of their lives and contributing to chronic disease problems. Adult obesity increases risk of diabetes, heart disease, cancer and disability.

Critically ill ICU patients lose almost all of their gut microbes and the ones left aren’t good

Researchers at the University of Chicago have shown that after a long stay in the Intensive Care Unit (ICU) only a handful of pathogenic microbe species remain behind in patients’ intestines. The team tested these remaining pathogens and discovered that some can become deadly when provoked by conditions that mimic the body’s stress response to illness.

The findings, published in mBio®, the online open-access journal of the American Society for Microbiology, may lead to better monitoring and treatment of ICU patients who can develop a life-threatening systemic infection known as sepsis.

“I have watched patients die from sepsis—it isn’t their injuries or mechanical problems that are the problem,” says John Alverdy, a gastrointestinal surgeon and one of two senior authors on the study.

“Our hypothesis has always been that the gut microflora in these patients are very abnormal, and these could be the culprits that lead to sepsis,” he says.

The current study supports this idea. Alverdy and Olga Zaborina, a microbiologist, wanted to know what happens to the gut microbes of ICU patients, who receive repeated courses of multiple antibiotics to ward off infections.

They found that patients with stays longer than a month had only one to four types of microbes in their gut, as measured from fecal samples—compared to about 40 different types found in healthy volunteers.

Four of these patients had gut microbe communities with just two members: an infectious Candida yeast strain and a pathogenic bacterial strain, such as Enterococcus faecium or Staphylococcus aureus, organisms commonly associated with hospital-acquired infections. Not surprisingly, almost all of the pathogenic bacteria in these patients were antibiotic resistant.

“They’ve got a lot of bad guys in there, but the presence of bad guys alone doesn’t tell you who’s going to live or die,” says Alverdy. “It’s not only which microbes are there, but how they behave when provoked by the harsh and hostile conditions of critical illness.”

To check that behavior, the team cultured microbe communities from ICU patients and tested their ability to cause harm in a laboratory model of virulence. The tiny Caenorhabditis elegans worm normally feeds on soil microbes, but when fed pathogenic microbes in the lab, the worms act as a canary-in-the-coalmine indicator of virulence. The more virulent a microbe, the more worms it kills.

Feeding the worms the yeast-plus-bacteria communities did not kill many worms, but when the bacteria were removed, the yeast alone became deadly. In some cases, simply changing the bacterial partner caused virulence. This suggests that even though the two microbes in these communities are both pathogens, they exist in a communal balance in the gut that does not always lead to virulence.

“During host stress, these two microbes suppress the virulence of each other,” says Zaborina. “But if you do something to one of them, then that can change their behavior.”

For example, the team found that adding an opioid drug to the mix—which mimics stress signals released by sick patients—could also switch behavior from a peaceful coexistence called commensalism to virulence for some microbe pairs. The team could prevent this switch to virulence by feeding the worms a molecule that created high phosphate levels in their gut.

Although the study was too small for statistical significance, there was a correlation between microbe behavior and whether a patient lived or died: two patients who were discharged had microbes that coexisted peacefully, but the three who died of sepsis had at least one sample that displayed pathogenic behavior.
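The outcome pattern implied above can be laid out as a 2×2 table (2 discharged patients with peacefully coexisting microbes, 3 sepsis deaths with virulent behavior). As an illustration of why five patients are too few for statistical significance, a two-sided Fisher exact test on that table can be computed with standard-library arithmetic; this sketch is not from the paper:

```python
from math import comb

def fisher_exact_2x2(a, b, c, d):
    """Two-sided Fisher exact p-value for the table [[a, b], [c, d]]:
    sum the hypergeometric probabilities of all tables with the same
    margins whose probability does not exceed the observed table's."""
    row1, row2, col1, n = a + b, c + d, a + c, a + b + c + d

    def p_table(x):  # probability of a table with x in the top-left cell
        return comb(row1, x) * comb(row2, col1 - x) / comb(n, col1)

    p_obs = p_table(a)
    lo, hi = max(0, col1 - row2), min(row1, col1)
    return sum(p_table(x) for x in range(lo, hi + 1)
               if p_table(x) <= p_obs + 1e-12)

# 2 discharged / commensal microbes vs. 3 died / virulent microbes:
print(fisher_exact_2x2(2, 0, 0, 3))  # 0.1 -> not significant at the 0.05 level
```

Even this perfectly separated outcome gives p = 0.1, which is why the authors describe the correlation without claiming significance.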

The work suggests that doctors should try to find ways to minimize the excessive use of antibiotics and stabilize the microbes that do remain in ICU patients’ guts. This might be achieved by delivering phosphate or reducing the stress signals in the gut. Such efforts could keep microbes calm and non-virulent, leading to better patient outcomes.

Turmeric compound boosts regeneration of brain stem cells

A bioactive compound found in turmeric promotes stem cell proliferation and differentiation in the brain, reveals new research published today in the open access journal Stem Cell Research & Therapy. The findings suggest aromatic turmerone could be a future drug candidate for treating neurological disorders, such as stroke and Alzheimer’s disease.

The study looked at the effects of aromatic (ar-) turmerone on endogenous neural stem cells (NSCs), the stem cells found within adult brains. NSCs differentiate into neurons and play an important role in self-repair and recovery of brain function in neurodegenerative diseases. Previous studies of ar-turmerone have shown that the compound can block activation of microglia. When activated, these cells cause neuroinflammation, which is associated with different neurological disorders. However, ar-turmerone’s impact on the brain’s capacity to self-repair was unknown.

Researchers from the Institute of Neuroscience and Medicine in Jülich, Germany, studied the effects of ar-turmerone on NSC proliferation and differentiation both in vitro and in vivo. Rat fetal NSCs were cultured and grown in six different concentrations of ar-turmerone over a 72-hour period. At certain concentrations, ar-turmerone increased NSC proliferation by up to 80 percent, without having any impact on cell death. The cell differentiation process also accelerated in ar-turmerone-treated cells compared with untreated control cells.

To test the effects of ar-turmerone on NSCs in vivo, the researchers injected adult rats with ar-turmerone. Using PET imaging and a tracer to detect proliferating cells, they found that the subventricular zone (SVZ) was wider, and the hippocampus expanded, in the brains of rats injected with ar-turmerone compared with control animals. The SVZ and hippocampus are the two sites in adult mammalian brains where neurogenesis, the growth of neurons, is known to occur.

Lead author of the study, Adele Rueger, said: “While several substances have been described to promote stem cell proliferation in the brain, fewer drugs additionally promote the differentiation of stem cells into neurons, which constitutes a major goal in regenerative medicine. Our findings on aromatic turmerone take us one step closer to achieving this goal.”

Ar-turmerone is the lesser-studied of two major bioactive compounds found in turmeric. The other compound is curcumin, which is well known for its anti-inflammatory and neuroprotective properties.

 
