Editor's Top Five:
1. Big Tobacco knew radioactive particles in cigarettes posed cancer risk but kept quiet
2. Low zinc and copper levels might cause spontaneous abortion
3. Single dose of hallucinogen may create lasting personality change
4. Glucosamine-like supplement suppresses multiple sclerosis attacks
5. New study shows inflammatory food toxins found in high levels in infants
In this Issue:
1. Alcohol can reduce asthma risk
2. Low vitamin B12 levels may lead to brain shrinkage, cognitive problems
3. Increased risk of bleeding with combined use of SSRIs and antiplatelet therapy after heart attacks
4. Increased caffeinated coffee consumption associated with decreased risk of depression in women
5. Pregnant women who exercise protect their offspring against long-term neurodegenerative diseases
6. Marker for Alzheimer’s disease rises during day and falls with sleep
7. New analysis of the cardiovascular risks of common non-steroidal anti-inflammatory drugs
8. Environmental health risks of livestock farming
9. Women have stronger immune systems than men and it’s all down to a single chromosome
10. Low zinc and copper levels might cause spontaneous abortion
11. Popular colorectal cancer drug may cause permanent nerve damage
12. Big Tobacco knew radioactive particles in cigarettes posed cancer risk but kept quiet
13. Commonly used supplement may improve recovery from spinal cord injuries
14. Single dose of hallucinogen may create lasting personality change
15. Red wine ingredient resveratrol stops breast cancer growth
16. Cocaine users have 45 percent increased risk of glaucoma
17. Glucosamine-like supplement suppresses multiple sclerosis attacks
18. METABOLIC DISEASE: Antioxidants combat risk factor for type 2 diabetes in mice
19. Mayo Clinic study: multiple surgeries and anesthesia exposure
20. BPA exposure in utero may increase predisposition to breast cancer
21. Study in Lancet finds use of hormonal contraception doubles HIV risk
22. Pale people may need vitamin D supplements
23. Vitamin D deficiency common in cancer patients
24. New research shows $6.7 billion spent on unnecessary tests and treatments in one year
25. Natural compound helps reverse diabetes in mice
26. Green tea helps mice keep off extra pounds
27. Zinc’s role in the brain
28. New study shows inflammatory food toxins found in high levels in infants
Public release date: 25-Sep-2011
Alcohol can reduce asthma risk
Amsterdam, The Netherlands: Drinking alcohol in moderate quantities can reduce the risk of asthma, according to Danish researchers.
The study, which will be presented today (25 September 2011) at the European Respiratory Society’s Annual Congress in Amsterdam, found that drinking 1-6 units of alcohol a week could reduce the risk of developing the condition.
The research examined 19,349 twins aged 12 to 41 years. All participants completed a questionnaire at the start and end of the study to compare alcohol intake with the risk of developing asthma over eight years.
The results showed that the lowest risk of asthma was seen in the group which had a moderate intake of alcohol, as less than 4% of those who drank 1-6 units per week developed asthma.
The highest risk of asthma was observed in people who rarely or never drank, as they were 1.4 times more likely to develop the condition. Heavy drinkers also had an increased risk of developing asthma and were 1.2 times more likely to develop the condition.
The results also suggested that a preference for beer drinking was associated with an increased risk of asthma when compared with no preference.
Previous studies have found a link between excessive intake of alcohol and asthma attacks; however, this is the first study of its kind to show a link between alcohol intake and the onset of asthma in adults over a long period of time.
Sofie Lieberoth, from the Bispebjerg Hospital in Denmark, said: “Whilst excessive alcohol intake can cause health problems, the findings of our study suggest that a moderate intake of 1-6 units can reduce the risk of developing asthma. By examining all the factors linked with the development of asthma, we can understand more about what causes the condition and how to prevent it.”
Public release date: 26-Sep-2011
Low vitamin B12 levels may lead to brain shrinkage, cognitive problems
(CHICAGO) – Older people with low blood levels of vitamin B12 markers may be more likely to have lower brain volumes and have problems with their thinking skills, according to researchers at Rush University Medical Center.
The results of the study are published in the Sept. 27 issue of Neurology, the medical journal of the American Academy of Neurology.
Foods that come from animals, including fish, meat (especially liver), milk, eggs and poultry, are the usual sources of vitamin B12.
The study involved 121 older residents of the South Side of Chicago who are part of the Chicago Health and Aging Project (CHAP), a large, ongoing prospective biracial cohort study of 10,000 subjects over the age of 65 conducted by Rush.
The 121 participants had blood drawn to measure levels of vitamin B12 and B12-related markers that can indicate a B12 deficiency. The same subjects took tests measuring their memory and other cognitive skills.
An average of four-and-a-half years later, MRI scans of the participants’ brains were taken to measure total brain volume and look for other signs of brain damage.
Having high levels of four of five markers for vitamin B12 deficiency was associated with having lower scores on the cognitive tests and smaller total brain volume.
“Our findings definitely deserve further examination,” said Christine C. Tangney, PhD, associate professor in the department of clinical nutrition at Rush University Medical Center, and lead author of the study. “It’s too early to say whether increasing vitamin B12 levels in older people through diet or supplements could prevent these problems, but it is an interesting question to explore. Findings from a British trial with B vitamin supplementation are also supportive of these outcomes.”
On the cognitive tests, the scores ranged from -2.18 to 1.42, with an average of 0.23. For each increase of one micromole per liter of homocysteine, one of the markers of B12 deficiency, the cognitive scores decreased by 0.03 standardized units, or points.
Tangney noted that the level of vitamin B12 itself in the blood was not associated with cognitive problems or loss in brain volume. She said that low vitamin B12 can be difficult to detect in older people when looking only at blood levels of the vitamin.
“Our findings lend support for the contention that poor vitamin B12 status is a potential risk factor for brain atrophy and may contribute to cognitive impairment,” said Tangney.
Public release date: 26-Sep-2011
Increased risk of bleeding with combined use of SSRIs and antiplatelet therapy after heart attacks
Heart attack patients taking selective serotonin reuptake inhibitors (SSRIs) in combination with antiplatelet therapy — acetylsalicylic acid (ASA), clopidogrel or both (dual antiplatelet therapy) — are at higher risk of bleeding than patients taking ASA alone, according to a study in CMAJ (Canadian Medical Association Journal) (pre-embargo link only) http://www.cmaj.ca/site/embargo/cmaj100912.pdf.
Antiplatelet therapy is commonly prescribed for patients who have had heart attacks to reduce the likelihood of another attack. There is, however, a risk of bleeding, which increases when certain other medications such as anticoagulants or SSRIs are taken at the same time as antiplatelet therapy.
SSRIs are commonly prescribed for depression. Many patients have symptoms of depression after a heart attack.
The study in CMAJ looked at 27,058 patients aged 50 years or older between 1997 and 2007. More than half were taking ASA alone and about 3% were taking SSRIs along with antiplatelet therapy. Researchers found that although ASA and clopidogrel taken on their own have a similar risk of bleeding, combining an SSRI with ASA increased the risk by 42%, and combining SSRI use with dual antiplatelet therapy increased the risk by 57%. Women appeared to have a decreased risk of bleeding, as did patients who had angioplasty as an intervention after their heart attack.
Bleeding includes gastrointestinal bleeding, hemorrhagic stroke or other bleeding that required hospitalization or occurred in hospital during treatment.
“Ultimately, clinicians must weigh the benefits of SSRI therapy against the risk of bleeding in patients with major depression following acute myocardial infarction,” write the authors. They conclude physicians must be cautious when prescribing antidepressants.
Public release date: 26-Sep-2011
Increased caffeinated coffee consumption associated with decreased risk of depression in women
The risk of depression appears to decrease for women with increasing consumption of caffeinated coffee, according to a report in the September 26 issue of Archives of Internal Medicine, one of the JAMA/Archives journals.
Caffeine is the most frequently used central nervous system stimulant in the world, and approximately 80 percent of consumption is in the form of coffee, according to background information in the article. Previous research, including one prospective study among men, has suggested an association between coffee consumption and depression risk. Because depression is a chronic and recurrent condition that affects twice as many women as men, including approximately one of every five U.S. women during their lifetime, “identification of risk factors for depression among women and the development of new preventive strategies are, therefore, a public health priority,” write the authors. They sought to examine whether, in women, consumption of caffeine or certain caffeinated beverages is associated with the risk of depression.
Michel Lucas, Ph.D., R.D., from the Harvard School of Public Health, Boston, and colleagues studied 50,739 U.S. women who participated in the Nurses’ Health Study. Participants, who had a mean (average) age of 63, had no depression at the start of the study in 1996 and were prospectively followed up with through June 2006. Researchers measured caffeine consumption through questionnaires completed from May 1980 through April 2004, including the frequency that caffeinated and noncaffeinated coffee, nonherbal tea, caffeinated soft drinks (sugared or low-calorie colas), caffeine-free soft drinks (sugared or low-calorie caffeine-free colas or other carbonated beverages) and chocolate were usually consumed in the previous 12 months. The authors defined depression as reporting a new diagnosis of clinical depression and beginning regular use of antidepressants in the previous two years.
Analysis of the cumulative mean consumption included a two-year latency period; for example, data on caffeine consumption from 1980 through 1994 were used to predict episodes of clinical depression from 1996 through 1998; consumption from 1980 through 1998 were used for the 1998 through 2000 follow-up period; and so on. During the 10-year follow-up period from 1996 to 2006, researchers identified 2,607 incident (new-onset) cases of depression. When compared with women who consumed one cup of caffeinated coffee or less per week, those who consumed two to three cups per day had a 15 percent decrease in relative risk for depression, and those consuming four cups or more per day had a 20 percent decrease in relative risk. Compared with women in the lowest (less than 100 milligrams [mg] per day) categories of caffeine consumption, those in the highest category (550 mg per day or more) had a 20 percent decrease in relative risk of depression. No association was found between intake of decaffeinated coffee and depression risk.
“In this large prospective cohort of older women free of clinical depression or severe depressive symptoms at baseline, risk of depression decreased in a dose-dependent manner with increasing consumption of caffeinated coffee,” write the authors. They note that this observational study “cannot prove that caffeine or caffeinated coffee reduces the risk of depression but only suggests the possibility of such a protective effect.” The authors call for further investigations to confirm their results and to determine whether usual caffeinated coffee consumption could contribute to prevention or treatment of depression.
Public release date: 26-Sep-2011
Pregnant women who exercise protect their offspring against long-term neurodegenerative diseases
New research in the FASEB Journal suggests that prenatal exercise improves brain plasticity, decreases toxic protein deposits, inflammation and oxidative stress, which wards off Alzheimer’s and other diseases
Bethesda, MD—If you are pregnant, here’s another reason to work out: you will reduce the chances of your new baby developing neurodegenerative diseases, such as Alzheimer’s, later in life. A new research report published online in The FASEB Journal (http://www.fasebj.org) shows that mice bred to develop a neurodegenerative disease roughly equivalent to Alzheimer’s disease showed fewer signs of the disease and greater brain plasticity later in life when their mothers exercised regularly than those whose mothers did not exercise.
“This research provides an experimental rationale for the effects of beneficial behavioral stimuli experienced by the pregnant mother affecting the disease status of an as yet-unborn child. Epigenetic alterations (alterations in gene and protein expression caused by mechanisms other than changes in the underlying DNA sequence) provide a most probable mechanism by which mothers could have transferred their own behavioral experience to their progeny,” said Kathy Keyvani, M.D., a researcher involved in the work from the Institute of Pathology and Neuropathology at the University Hospital Essen in Essen, Germany. “A better understanding of the underlying pathways may provide novel treatment and/or prevention strategies for Alzheimer’s disease and bring more insight into the fascinating link between brain and behavior.”
To make this discovery, Keyvani and colleagues mated male mice that express a mutant form of the APP gene found in some Alzheimer's patients with healthy female wild-type mice. After weaning, healthy and "Alzheimer-diseased" offspring were kept in standard cages for five months. Mouse brains were examined for signs of disease shortly thereafter. The "Alzheimer-diseased" mice whose mothers ran on an exercise wheel during pregnancy had fewer beta-amyloid plaques, smaller plaque size, less inflammation, less oxidative stress, and a better functioning vascular network than those whose mothers did not run. Additionally, the mice whose mothers ran on the wheel also showed an up-regulation of plasticity-related molecules, which are indicators of more and better connections between the nerve cells.
“No one is resistant to the health benefits of exercise,” said Gerald Weissmann, M.D., Editor-in-Chief of The FASEB Journal, “and this research confirms that reasonable workouts can have a lifetime of benefits for your offspring. Whether you work out at home or go to the gym, you should do it for the sake of your health and that of your offspring.”
Public release date: 26-Sep-2011
Marker for Alzheimer’s disease rises during day and falls with sleep
Up-and-down cycle flattens as age disrupts pattern
A marker for Alzheimer’s disease rises and falls in the spinal fluid in a daily pattern that echoes the sleep cycle, researchers at Washington University School of Medicine in St. Louis have found.
The pattern is strongest in healthy young people and reinforces a link between increased Alzheimer’s risk and inadequate sleep that had been discovered in animal models. The brain’s relative inactivity during sleep may provide an opportunity to finish clearing away the Alzheimer’s marker, a byproduct of brain activity called amyloid beta. The body clears amyloid beta from the brain through the spinal fluid and other mechanisms.
In the new study, scientists report that the normal highs and lows of amyloid beta levels in the fluid that surrounds the brain and spinal cord begin to flatten in older adults, whose sleep periods are often shorter and more prone to disruption. In older adults with brain plaques linked to Alzheimer’s disease, the ebb and flow is eradicated, and amyloid beta levels are close to constant.
The study is now online in Archives of Neurology.
“In healthy people, levels of amyloid beta drop to their lowest point about six hours after sleep, and return to their highest point six hours after maximum wakefulness,” says Randall Bateman, MD, associate professor of neurology. “We looked at many different behaviors, and the transitions between sleep and wakefulness were the only phenomena that strongly correlated with the rise and fall of amyloid beta in the spinal fluid.”
Bateman’s laboratory conducted the study in partnership with Washington University’s Sleep Medicine Center.
“We’ve known for some time that significant sleep deprivation has negative effects on cognitive function comparable to that of alcohol intoxication,” says Stephen Duntley, MD, professor of neurology and director of the center. “But it’s recently become apparent that prolonged sleep disruption and deprivation can actually play an important role in pathological processes that underlie diseases. This connection to Alzheimer’s disease isn’t confirmed yet in humans, but it could be very important.”
Duntley notes that older adults often sleep less and have fewer periods of deep slumber. A number of factors linked to aging, such as reduced exercise levels, can disrupt the normal daily patterns of sleep and waking. These disruptions often become more pronounced as individuals age. The risk of Alzheimer’s disease also increases with age.
Scientists studied three sets of subjects: a group age 60 and older who tested positive for the presence of amyloid beta plaques in the brain; a group in the same age range who did not have plaques; and a group of healthy persons age 18-60.
Researchers used a spinal tap to monitor amyloid beta in the spinal fluid hourly for 24 to 36 hours, and videotaped patients’ activities and monitored their brain activity during that period.
In the group with brain plaques, amyloid beta levels were close to constant. But in the other two groups, the levels regularly rose and fell in a snakelike, sinusoidal pattern. The highs and lows of this pattern were much more pronounced in younger subjects.
Lead author Yafei Huang, PhD, statistical data analyst, reviewed the subjects’ activities during the monitoring period at 30-second intervals. She grouped them into categories such as eating or drinking, watching television, using the bathroom, and using a computer or text messaging.
None of these activities could be closely correlated with changes in amyloid beta levels. But peaks in sleep and wakefulness, assessed both by videotape and by records of patients’ brain activity levels, consistently occurred before the peaks and valleys of amyloid beta levels.
Researchers are currently testing whether deliberate interruption of sleep in young healthy subjects disrupts the normal daily decrease in spinal amyloid beta. Scientists may follow these studies with tests of whether sleeping pills and other interventions that improve sleep help maintain the rise and fall of amyloid beta in the spinal fluid.
“It’s still speculation, but there are tantalizing hints that better sleep may be helpful in reducing Alzheimer’s disease risk,” says Duntley. “We know from a number of studies that exercise enhances sleep, and research also has shown that exercise is associated with decreased risk of Alzheimer’s. Sleep might be one link through which that effect occurs.”
Public release date: 27-Sep-2011
New analysis of the cardiovascular risks of common non-steroidal anti-inflammatory drugs
An updated study published in this week’s PLoS Medicine gives some new information on the cardiovascular risks of non-steroidal anti-inflammatory drugs (NSAIDs) and suggests that among these commonly used drugs, naproxen and low dose ibuprofen are least likely to increase cardiovascular risk whereas diclofenac, even in doses available without prescription, elevates risk.
Using only observational studies (30 case-control studies and 21 cohort studies) because randomised controlled trials have only reported small numbers of cardiovascular events, the authors, Patricia McGettigan (Hull York Medical School, Hull, UK), and David Henry (Institute for Clinical Evaluative Sciences, Toronto, Canada) also found that the new NSAID, etoricoxib, has a high risk of cardiovascular events similar to that of drugs that have been withdrawn because of safety concerns and that new evidence on cardiovascular risk of indomethacin, an older drug, casts doubt on its continued clinical use.
The authors say: “the large sizes of the studies reviewed here, the presence of consistent dose-response relationships, and general agreement with the results of randomised trials give us confidence in the results.” They add: “In our view, the results are sufficiently robust to inform clinical and regulatory decisions.”
This study highlights the importance of adequately assessing drug safety in clinical trials and in an editorial the PLoS Medicine editors write: “debates continue about the best ways to meaningfully synthesize and interpret data on the possible harmful effects of drugs – for example, how passive surveillance systems (spontaneous reports of suspected adverse reactions) should be improved, whether new drugs should go through a phased launch process with enhanced safety evaluation, and the appropriateness of risk mitigation strategies for drugs with safety concerns.”
The editors conclude: “However, these challenges should not detract investigators, regulators, and patients from demanding a higher safety standard for approved drugs. Higher standards will require both greater transparency – in revealing what studies are being conducted and what data have been generated – and greater willingness of funders to support new studies specifically addressing drug safety.”
Public release date: 27-Sep-2011
Environmental health risks of livestock farming
More exacerbations in lung patients, Q fever risk increasing with number of livestock close by
Amsterdam, The Netherlands: Emissions from livestock farms cause asthma and COPD patients living nearby to experience more exacerbations, according to research presented today at the European Respiratory Society’s Annual Congress in Amsterdam.
Also, chances of contracting Q fever from nearby sheep and goat farms increased with the number of animals rather than with the number of farms, the research found, hinting at higher health risks from ‘mega farms’.
The researchers, from Utrecht University, measured increased levels of particulate matter containing microbes and microbial toxins near livestock farms. They studied health effects by screening medical records from 50 general practitioners servicing 200,000 patients in regions with high and low densities of livestock farms.
In regions with many livestock farms, doctors reported less asthma, COPD, upper respiratory tract infections and hay fever, a result that mimics some earlier studies that saw fewer allergies in children who grew up on farms. In this study, the medical records did not specify whether symptoms were allergy-related, so the researchers do not know whether the effect is indeed limited to allergies.
The research, however, also showed that in areas with many livestock farms, people who suffer from asthma or COPD developed pneumonia and upper respiratory tract infections twice as often as people in regions with little livestock activity. The overall prevalence of pneumonia was also higher in high-density areas.
The study period included an unusually severe outbreak of Q fever, an infectious disease of cattle, sheep or goats, which can cause flu-like symptoms and pneumonia in humans. Between 2007 and 2010, close to 4,000 people in the Netherlands became ill; at least ten of them died.
The researchers found that the risk of contracting Q fever increased with the proximity of sheep farms or goat farms. An even stronger correlation was found between Q fever risk and the number of animals kept in the area, suggesting that mega farms could bring more environmental health risks than smaller farms.
The study contributes to on-going debates about intensive animal farming in densely populated regions in countries such as Germany and the Netherlands.
Lead researcher Dr Lidwien Smit said: “Our study is one of the first to show that living close to farms leads to exacerbation of symptoms for people with lung conditions and that during a Q fever outbreak, the risk of contracting Q fever increased with the number of livestock animals kept close by.”
Public release date: 27-Sep-2011
Women have stronger immune systems than men and it’s all down to a single chromosome
X-chromosome related microRNA may impact immunity and cancer, new study shows
As anyone familiar with the phrase 'man-flu' will know, women consider themselves to be the more robust side of the species when it comes to health and illness. Now new research, published in BioEssays, seems to support the idea. The research focuses on the role of microRNAs encoded on the X chromosome to explain why women have stronger immune systems than men and are less likely to develop cancer.
The research, led by Dr Claude Libert from Ghent University in Belgium, focused on microRNA, tiny strands of ribonucleic acid which, alongside DNA and proteins, make up the three major macromolecules that are essential for all known forms of life.
“Statistics show that in humans, as with other mammals, females live longer than males and are more able to fight off shock episodes from sepsis, infection or trauma,” said Libert. “We believe this is due to the X chromosome which in humans contains 10% of all microRNAs detected so far in the genome. The roles of many remain unknown, but several X chromosome-located strands of microRNA have important functions in immunity and cancer.”
Dr Libert’s team proposes that the biological mechanisms of the X chromosome have a strong impact on an individual’s genes, known as genetic imprinting, which gives an immunological advantage to females. To develop their hypothesis the team produced a detailed map of all described microRNAs which have a role in immune functions and cancer in both human and mouse X chromosomes.
“We believe this immunological advantage is due to the silencing of X-linked genes by these microRNAs,” said Libert. “Gene silencing and inactivation skewing are known mechanisms which affect X-linked genes and may influence X-linked microRNAs in the same way.”
This genetic silencing leaves males at an immunological disadvantage, as a male has only one X chromosome. The Y chromosome contains fewer genes, so if the genes involved in immunity are silenced on the maternal X chromosome, the male is left with no compensating genetic information.
“How this unique form of genetic inheritance influences X-chromosome linked microRNAs will be a challenge for researchers for years to come,” concluded Libert, “not only from an evolutionary point of view, but also for scientists investigating the causes and cures of disease.”
Public release date: 27-Sep-2011
Low zinc and copper levels might cause spontaneous abortion
This hypothesis had never before been proven in humans; it has now been demonstrated by University of Granada researchers. Spontaneous abortion is estimated to affect 15 percent of women, mainly in the first trimester of pregnancy.
Scientists at the University of Granada have confirmed that a low plasma level of copper and zinc in pregnant women may be a factor associated with spontaneous abortion, a hypothesis that had not been confirmed to date, and which had never been proven in humans before.
For the purpose of this study, 265 pregnant women participated in the tests. Of these 265 women, 132 had suffered a spontaneous miscarriage during that year; the remaining 133 were women with ongoing pregnancies, selected from among pregnant women attending a scheduled prenatal check-up. All of them underwent an ultrasound examination and had a blood sample taken for laboratory tests. Additionally, they were asked to complete a questionnaire. In total, 131 variables were assessed for each participant.
Differences in plasma concentrations
The data obtained from the group of women who had suffered a miscarriage were compared with those obtained from the group of women with a normal process of pregnancy. The results proved the existence of differences in maternal plasma concentrations of copper and zinc. This finding suggests that maternal deficiency of one or both trace elements may be associated with the occurrence of spontaneous abortion, which opens new and interesting lines of research in this area so far unexplored.
Apart from the influence that copper and zinc may have on the occurrence of abortions, the research conducted at the UGR has provided relevant information about other variables that had been studied previously but remain largely unexplored, such as homocysteine, preconception and prenatal supplementation with iodine and folate, thyroid dysfunction, and the consumption of drugs in the first weeks of pregnancy.
This study was carried out by Jesús Joaquín Hijona Elósegui, a researcher at the Department of Pharmacology of the University of Granada, and supervised by professors Manuel García Morillas and Juan Antonio Maldonado Jurado.
UGR scientists determined that most of the pregnancies that ended in abortion in the study (64 percent) were planned, although only 12 percent of patients had used the recommended supplements of iodine and folate before attempting pregnancy (these substances have been proven to decrease the rate of abortions and malformations). In addition, a third of the women who had a miscarriage reported being regular smokers, and 16.6 percent regularly consumed coffee at doses exceeding the abortifacient and teratogenic threshold. The consumption of tobacco and caffeine at certain doses has been strongly associated with the occurrence of spontaneous abortion.
During pregnancy, 81.07 percent of the women who suffered a miscarriage had taken some drug officially contraindicated during pregnancy, and 13.63 percent were exposed to some drug considered dangerous during pregnancy.
The most frequent complication
As doctor Hijona points out “despite the significant progress made in reproductive medicine, spontaneous abortion is still the most frequent complication during pregnancy. It is estimated to affect 15 percent of pregnant women, mainly during the first trimester. Although most of the time it is not recurring, there is a recurrence of two to five percent among women who have already suffered a miscarriage.”
There are data available showing an increase in the number of miscarriages among the Spanish population. In recent years, the number of pregnant women who suffer a miscarriage has increased gradually. This is due not only to the increase in the number of pregnancies, but also to the increase in the percentage of miscarriages (from 10.39 percent in 2003 to 13.70 percent in 2010).
The results obtained in this study were published in the Spanish journals Progresos de Obstetricia y Ginecología (the official journal of the Spanish Society of Gynaecology and Obstetrics) and Toko-Ginecología Práctica, and in Obstetrics and Gynaecology.
Public release date: 28-Sep-2011
Popular colorectal cancer drug may cause permanent nerve damage
Nerve degeneration detected with skin biopsies
Oxaliplatin, a platinum-based anticancer drug that’s made enormous headway in recent years against colorectal cancer, appears to cause nerve damage that may be permanent and worsens even months after treatment ends. The chemotherapy side effect, described by Johns Hopkins researchers in the September issue of Neurology, was discovered in what is believed to be the first effort to track oxaliplatin-based nerve damage through relatively cheap and easy punch skin biopsies.
The Johns Hopkins investigators emphasize that the drug therapy clearly improves length of survival in advanced cancer by months to years, and that the goal of their new study is to find ways of preventing or slowing the damage through nerve-protective therapies identified through simple skin testing.
Many patients who take oxaliplatin report bothersome neurological side effects, including pain in the hands and feet and a numbness or tingling in the throat that affects swallowing, according to study leader Michael Polydefkis, M.D., M.H.S., associate professor of neurology at the Johns Hopkins University School of Medicine and director of the EMG Laboratory and Cutaneous Nerve Laboratory at Johns Hopkins Bayview Medical Center. Though these symptoms develop over time in the majority of patients, some report neuropathies as early as when the drug is first infused.
To get a better sense of how oxaliplatin affects nerve cells, Polydefkis and his colleagues recruited eight cancer patients about to begin oxaliplatin treatment at The Johns Hopkins Hospital. All had been diagnosed with advanced colon cancer.
Before their first oxaliplatin infusion, each patient underwent a comprehensive neurological examination, including nerve conduction testing, a clinical exam to look for signs of nerve damage, and a punch biopsy that removed tiny (3-mm diameter) portions of skin near their knees and ankles. Once oxaliplatin treatment began, consisting of infusions over two days once every two weeks for 12 cycles, the researchers performed the same tests after 30, 90 and 180 days. Another 180 days after they finished with treatment, the patients received one final exam.
Test results showed that each of the patients’ nerve function and neuropathy symptoms worsened over time and that results from the punch skin biopsies neatly mirrored the side effect arc. Using a microscope, the researchers saw that nerve cells’ long extensions, called axons, degenerated over the course of oxaliplatin therapy. This progression persisted after treatment stopped. Even 180 days after their last doses, seven out of the eight patients’ axons continued to wither.
“This drug has rapidly become the standard of care for people with advanced colon cancer, but we really knew little about how oxaliplatin affects nerves over time,” he says. “With people living longer lives on oxaliplatin, it’s important to know more about these neurological side effects so patients and their physicians can make educated choices on how this drug is used, and perhaps suggest ways to limit the damage.”
The new study strongly suggests that punch skin biopsies could be an easy and inexpensive way to follow nerve cell degeneration, a crucial prerequisite for testing the effectiveness of drugs currently in development to trace, prevent or slow nerve damage.
“Skin biopsies can be done pretty easily, uniformly and cheaply anywhere, including hospitals, doctors’ offices and clinics, and those places can have the tissue sent to Hopkins for analysis,” Polydefkis says. “High-quality neurological testing isn’t nearly as easy or economical to do, so it’s possible that the biopsies could play a pivotal role in bringing neuroprotective drugs to fruition.”
Public release date: 28-Sep-2011
Big Tobacco knew radioactive particles in cigarettes posed cancer risk but kept quiet
Tobacco companies knew that cigarette smoke contained radioactive alpha particles for more than four decades and developed “deep and intimate” knowledge of these particles’ cancer-causing potential, but they deliberately kept their findings from the public, according to a new study by UCLA researchers.
The analysis of dozens of previously unexamined internal tobacco industry documents, made available in 1998 as the result of a legal settlement, reveals that the industry was aware of cigarette radioactivity some five years earlier than previously thought and that tobacco companies, concerned about the potential lung cancer risk, began in-depth investigations into the possible effects of radioactivity on smokers as early as the 1960s.
“The documents show that the industry was well aware of the presence of a radioactive substance in tobacco as early as 1959,” the authors write. “Furthermore, the industry was not only cognizant of the potential ‘cancerous growth’ in the lungs of regular smokers, but also did quantitative radiobiological calculations to estimate the long-term lung radiation absorption dose of ionizing alpha particles emitted from cigarette smoke.” The study, published online Sept. 27 in Nicotine & Tobacco Research, the peer-reviewed journal of the Society for Research on Nicotine and Tobacco, adds to a growing body of research detailing the industry’s knowledge of cigarette smoke radioactivity and its efforts to suppress that information.
“They knew that the cigarette smoke was radioactive way back then and that it could potentially result in cancer, and they deliberately kept that information under wraps,” said the study’s first author, Hrayr S. Karagueuzian, a professor of cardiology who conducts research at UCLA’s Cardiovascular Research Laboratory, part of the David Geffen School of Medicine at UCLA. “Specifically, we show here that the industry used misleading statements to obfuscate the hazard of ionizing alpha particles to the lungs of smokers and, more importantly, banned any and all publication on tobacco smoke radioactivity.”
The radioactive substance – which the UCLA study shows was first brought to the attention of the tobacco industry in 1959 – was identified in 1964 as the isotope polonium-210, which emits carcinogenic alpha radiation. Polonium-210 can be found in all commercially available domestic and foreign cigarette brands, Karagueuzian said, and is absorbed by tobacco leaves through naturally occurring radon gas in the atmosphere and through high-phosphate chemical fertilizers used by tobacco growers. The substance is eventually inhaled by smokers into the lungs.
The study outlines the industry’s growing concerns about the cancer risk posed by polonium-210 inhalation and the research that industry scientists conducted over the decades to assess the radioactive isotope’s potential effect on smokers – including one study that quantitatively measured the potential lung burden from radiation exposure in a two-pack-a-day smoker over a two-decade period.
Karagueuzian and his colleagues made independent calculations using industry and academic data and arrived at results that very closely mirrored those of that industry study, which was conducted nearly a quarter-century ago. They then compared those results to rates used by the Environmental Protection Agency to estimate lung cancer risk among individuals exposed to similar amounts of alpha particle-emitting radon gas in their homes.
“The gathered data from the documents on the relevant radiobiological parameters of the alpha particles – such as dose, distribution and retention time – permitted us to duplicate the industry’s secretly estimated radiation absorbed dose by regular smokers over a 20- or 25-year period, which equaled 40 to 50 rads,” he said. “These levels of rads, according to the EPA’s estimate of lung cancer risk in residents exposed to radon gas, equal 120 to 138 deaths per 1,000 regular smokers over a 25-year period.”
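The dose-to-risk arithmetic in the quote above can be sanity-checked with a few lines of code. Note that the implied per-rad coefficient is an assumption of this sketch (a simple linear reading of the reported figures), not a claim made in the study:

```python
# Figures as reported: the duplicated industry estimate of absorbed dose,
# and the EPA-derived excess lung-cancer deaths per 1,000 regular smokers
# over a 25-year period.
doses_rads = (40, 50)
deaths_per_1000 = (120, 138)

# Implied risk coefficient under a naive linear dose-response assumption.
for dose, deaths in zip(doses_rads, deaths_per_1000):
    print(f"{dose} rads -> {deaths} deaths per 1,000 smokers "
          f"(~{deaths / dose:.2f} deaths per 1,000 per rad)")
```

The two endpoints imply roughly 2.8 to 3.0 excess deaths per 1,000 smokers per rad, so the reported ranges are internally consistent.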
Despite the potential risk of lung cancer, tobacco companies declined to adopt a technique discovered in 1959 and then another developed in 1980 that could have helped eliminate polonium-210 from tobacco, the researchers said. The 1980 technique, known as an acid-wash, was found to be highly effective in removing the radioisotope from tobacco plants, where it forms a water-insoluble complex with the sticky, hair-like structures called trichomes that cover the leaves.
And while the industry frequently cited concerns over the cost and the possible environmental impact as rationales for not using the acid wash, UCLA researchers uncovered documents that they say indicate the real reason may have been far different.
“The industry was concerned that the acid media would ionize the nicotine, making it more difficult to be absorbed into the brains of smokers and depriving them of that instant nicotine rush that fuels their addiction,” Karagueuzian said. “The industry was also well aware that curing the tobacco leaves for more than a year would not eliminate the polonium-210, which has a half-life of 135 days, from the tobacco leaves, because it is continually replenished by its parent isotope, lead-210, which has a half-life of 22 years.”
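The half-life argument above can be illustrated with a short decay calculation (a sketch using the half-life values cited in the article; the one-year curing period and the secular-equilibrium reading are assumptions of this example):

```python
# Half-lives as cited in the article.
PO210_HALF_LIFE_DAYS = 135.0
PB210_HALF_LIFE_DAYS = 22 * 365.25  # 22 years, in days

def remaining_fraction(half_life_days: float, elapsed_days: float) -> float:
    """Fraction of an isotope remaining after simple exponential decay."""
    return 0.5 ** (elapsed_days / half_life_days)

curing_days = 365.0

# An isolated stock of Po-210 would mostly decay away during a year of curing...
isolated_po210 = remaining_fraction(PO210_HALF_LIFE_DAYS, curing_days)

# ...but the Pb-210 parent barely decays in that time, so it keeps
# regenerating Po-210: the Po-210 activity ends up tracking the parent.
surviving_pb210 = remaining_fraction(PB210_HALF_LIFE_DAYS, curing_days)

print(f"Isolated Po-210 left after one year: {isolated_po210:.1%}")
print(f"Pb-210 (and hence regenerated Po-210) left: {surviving_pb210:.1%}")
```

This is why waiting out the curing period cannot remove the radioisotope: roughly 85 percent of any isolated polonium would be gone in a year, but the lead-210 reservoir is still nearly intact.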
Karagueuzian said the insoluble alpha particles bind with resins in the cigarette smoke and get stuck and accumulate at the bronchial bifurcations of the lungs, forming “hot spots,” instead of dispersing throughout the lungs. In fact, previous research on lung autopsies in smokers who died of lung cancer showed that malignant growths were primarily located at the same bronchial bifurcations where these hot spots reside.
“We used to think that only the chemicals in the cigarettes were causing lung cancer,” Karagueuzian said. “But the case of these hot spots, acknowledged by the industry and academia alike, makes a strong case for an increased probability of long-term development of malignancies caused by the alpha particles. If we’re lucky, the alpha particle-irradiated cell dies. If it doesn’t, it could mutate and become cancerous.”
Karagueuzian said the findings are very timely in light of the June 2009 passage of the Family Smoking Prevention and Tobacco Control Act, which grants the U.S. Food and Drug Administration broad authority to regulate and remove harmful substances – with the exception of nicotine – from tobacco products. The UCLA research, he said, makes a strong case that the FDA ought to consider making the removal of alpha particles from tobacco products a top priority.
“Such a move could have a considerable public health impact, due to the public’s graphic perception of radiation hazards,” he said.
To uncover the information, Karagueuzian and his team combed through the internal tobacco industry documents made available online as part of the landmark 1998 Tobacco Master Settlement Agreement. Documents from Philip Morris, R.J. Reynolds, Lorillard, Brown & Williamson, the American Tobacco Company, the Tobacco Institute and the Council for Tobacco Research, as well as the Bliley documents, were examined, Karagueuzian said.
The team searched for key terms such as “polonium-210,” “atmospheric fallout,” “bronchial epithelium,” “hot particle” and “lung cancer,” among others.
Karagueuzian said the earliest causal link between alpha particles and cancer was made around 1920, when alpha particle-emitting radium paint was used to paint luminescent numbers on watch dials. The painting was done by hand, and the workers commonly used their lips to produce a point on the tip of the paint brush. Many workers accumulated significant burdens of alpha particles through ingestion and absorption of radium-226 into the bones and subsequently developed jaw and mouth cancers. The practice was eventually discontinued.
Another example involves liver cancer in patients exposed to chronic low-dose internal alpha particles emitted from the poorly soluble deposits of thorium dioxide after receiving the contrast agent Thorotrast. It has been suggested that the liver cancers resulted from point mutations of the tumor suppressor gene p53 by the accumulated alpha particles present in the contrast media. The use of Thorotrast as a contrast agent was stopped in the 1950s.
Public release date: 28-Sep-2011
Commonly used supplement may improve recovery from spinal cord injuries
LEXINGTON, Ky. — A commonly used supplement is likely to improve outcomes and recovery for individuals who sustain a spinal cord injury (SCI), according to research conducted by University of Kentucky neuroscientists.
Sasha Rabchevsky, associate professor of physiology, Patrick Sullivan, associate professor of anatomy and neurobiology, and Samir Patel, senior research scientist — all of the UK Spinal Cord and Brain Injury Research Center (SCoBIRC) — have discovered that in experimental models, severe spinal cord injury can be treated effectively by administering the supplement acetyl-L-carnitine or ALC, a derivative of essential amino acids that can generate metabolic energy, soon after injury.
The researchers previously reported that following spinal cord injury, the mitochondria, or energy-generation components of cells, are overwhelmed by chemical stresses and lose the ability to produce energy in the form of the compound adenosine triphosphate (ATP). [1,2] This leads to cell death at the injury site and, ultimately, paralysis of the body below the injury level.
Rabchevsky, Sullivan and Patel have recently demonstrated that ALC can preserve the vitality of mitochondria by acting as an alternative biofuel providing energy to cells, thus bypassing damaged mitochondrial enzymes and promoting neuroprotection. 
Results soon to be published show that systemic administration of ALC soon after a paralyzing injury promoted the milestone recovery of the ability to walk. Unlike the control animals given no ALC, which regained only slight hindlimb movements, the group treated with ALC recovered hindlimb movements more quickly and was able to stand on all four limbs and walk a month later. Critically, such remarkable recovery was correlated with significant tissue sparing at the injury site following administration of ALC.
Because ALC can be administered orally, and is well-tolerated at relatively high doses in humans, researchers believe that their discovery may be translated easily to clinical practice as an early intervention for people with traumatic spinal cord injuries.
Initial funding for these studies was provided by the Kentucky Spinal Cord and Head Injury Research Trust (KSCHIRT). Based on their findings, the research team has been awarded additional grant funding from the National Institutes of Health (NIH) and the Craig H. Neilsen Foundation, with the aim of enabling the investigators to study the beneficial effects of combining ALC with an antioxidant agent known as N-acetylcysteine amide (NACA). The results were reported at the recent National Neurotrauma Society Symposium in July 2011, and will be presented again at the Society for Neuroscience meeting in November 2011.
When translated into clinical practice, this research is expected to offer a viable pharmacological option for promoting neuroprotection and maximizing functional recovery following traumatic spinal cord injury.
Public release date: 29-Sep-2011
Single dose of hallucinogen may create lasting personality change
Johns Hopkins study of ingredient in ‘magic mushrooms’ found participants exhibited more ‘openness’
A single high dose of the hallucinogen psilocybin, the active ingredient in so-called “magic mushrooms,” was enough to bring about a measurable personality change lasting at least a year in nearly 60 percent of the 51 participants in a new study, according to the Johns Hopkins researchers who conducted it.
Lasting change was found in the part of the personality known as openness, which includes traits related to imagination, aesthetics, feelings, abstract ideas and general broad-mindedness. Changes in these traits, measured on a widely used and scientifically validated personality inventory, were larger in magnitude than changes typically observed in healthy adults over decades of life experiences, the scientists say. Researchers in the field say that after the age of 30, personality doesn’t usually change significantly.
“Normally, if anything, openness tends to decrease as people get older,” says study leader Roland R. Griffiths, a professor of psychiatry and behavioral sciences at the Johns Hopkins University School of Medicine.
The research, approved by Johns Hopkins’ Institutional Review Board, was funded in part by the National Institute on Drug Abuse and published in the Journal of Psychopharmacology.
The study participants completed two to five eight-hour drug sessions, with consecutive sessions separated by at least three weeks. Participants were informed they would receive a “moderate or high dose” of psilocybin during one of their drug sessions, but neither they nor the session monitors knew when.
During each session, participants were encouraged to lie down on a couch, use an eye mask to block external visual distraction, wear headphones through which music was played and focus their attention on their inner experiences.
Personality was assessed at screening, one to two months after each drug session and approximately 14 months after the last drug session. Griffiths says he believes the personality changes found in this study are likely permanent, since many participants sustained them for more than a year.
Nearly all of the participants in the new study considered themselves spiritually active (participating regularly in religious services, prayer or meditation). More than half had postgraduate degrees. The sessions with the otherwise illegal hallucinogen were closely monitored, and volunteers were considered to be psychologically healthy.
“We don’t know whether the findings can be generalized to the larger population,” Griffiths says.
As a word of caution, Griffiths also notes that some of the study participants reported strong fear or anxiety for a portion of their daylong psilocybin sessions, although none reported any lingering harmful effects. He cautions, however, that if hallucinogens are used in less well supervised settings, the possible fear or anxiety responses could lead to harmful behaviors.
Griffiths says lasting personality change is rarely looked at as a function of a single discrete experience in the laboratory. In the study, the change occurred specifically in those volunteers who had undergone a “mystical experience,” as validated on a questionnaire developed by early hallucinogen researchers and refined by Griffiths for use at Hopkins. He defines “mystical experience” as among other things, “a sense of interconnectedness with all people and things accompanied by a sense of sacredness and reverence.”
Personality was measured on a widely used and scientifically validated personality inventory, which covers openness and the other four broad domains that psychologists consider the makeup of personality: neuroticism, extroversion, agreeableness and conscientiousness. Only openness changed during the course of the study.
Griffiths says he believes psilocybin may have therapeutic uses. He is currently studying whether the hallucinogen has a use in helping cancer patients handle the depression and anxiety that comes along with a diagnosis, and whether it can help longtime cigarette smokers overcome their addiction.
“There may be applications for this we can’t even imagine at this point,” he says. “It certainly deserves to be systematically studied.”
Public release date: 29-Sep-2011
Red wine ingredient resveratrol stops breast cancer growth
New research in the FASEB Journal shows that resveratrol blocks the growth effects of estrogen by reducing the specific breast cancer receptors
Bethesda, MD—Cheers! A new research report appearing in the October 2011 issue of The FASEB Journal (https://www.fasebj.org) shows that resveratrol, the “healthy” ingredient in red wine, stops breast cancer cells from growing by blocking the growth effects of estrogen. This discovery, made by a team of American and Italian scientists, suggests for the first time that resveratrol can counteract malignant progression by inhibiting the proliferation of hormone-resistant breast cancer cells. This has important implications for the treatment of women with breast cancer whose tumors eventually develop resistance to hormonal therapy.
“Resveratrol is a potential pharmacological tool to be exploited when breast cancer becomes resistant to the hormonal therapy,” said Sebastiano Andò, a researcher involved in the work from the Faculty of Pharmacy at the University of Calabria in Italy.
To make this discovery, Andò and colleagues used several breast cancer cell lines expressing the estrogen receptor to test the effects of resveratrol. Researchers then treated the different cells with resveratrol and compared their growth with cells left untreated. They found an important reduction in cell growth in cells treated by resveratrol, while no changes were seen in untreated cells. Additional experiments revealed that this effect was related to a drastic reduction of estrogen receptor levels caused by resveratrol itself.
“These findings are exciting, but in no way does it mean that people should go out and start using red wine or resveratrol supplements as a treatment for breast cancer,” said Gerald Weissmann, M.D., Editor-in-Chief of The FASEB Journal. “What it does mean, however, is that scientists haven’t finished distilling the secrets of good health that have been hidden in natural products such as red wine.”
Public release date: 29-Sep-2011
Cocaine users have 45 percent increased risk of glaucoma
Cocaine users diagnosed with glaucoma two decades earlier than nonusers
INDIANAPOLIS – A study of the 5.3 million men and women seen in Department of Veterans Affairs outpatient clinics in a one-year period found that use of cocaine is predictive of open-angle glaucoma, the most common type of glaucoma.
The study revealed that after adjustments for race and age, current and former cocaine users had a 45 percent increased risk of glaucoma. Men with open-angle glaucoma also had significant exposures to amphetamines and marijuana, although less than cocaine.
Patients with open-angle glaucoma and history of exposure to illegal drugs were nearly 20 years younger than glaucoma patients without a drug exposure history (54 years old versus 73 years old).
Study results appear in the September issue of Journal of Glaucoma.
“The association of illegal drug use with open-angle glaucoma requires further study, but if the relationship is confirmed, this understanding could lead to new strategies to prevent vision loss,” said study first author Regenstrief Institute investigator Dustin French, Ph.D., a research scientist with the Center of Excellence on Implementing Evidence-Based Practice, Department of Veterans Affairs, Veterans Health Administration, Health Services Research and Development Service in Indianapolis. A health economist who studies health outcomes, he is also an assistant professor of medicine at the Indiana University School of Medicine.
Glaucoma is the second most common cause of blindness in the United States. Although the mechanism of vision loss in glaucoma is not fully understood, most research has focused on an increase in eye pressure gradually injuring the optic nerve. Most individuals who develop open-angle glaucoma have no symptoms until late in the disease process when substantial peripheral vision has been lost.
Dr. French and colleagues found that among the 5.3 million veterans (91 percent of whom were male) who used VA outpatient clinics in fiscal year 2009, nearly 83,000 (about 1.5 percent) had glaucoma. During the same fiscal year, nearly 178,000 (about 3.3 percent) of all those seen in the outpatient clinics had a diagnosis of cocaine abuse or dependency.
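The cohort percentages above can be verified with a quick back-of-the-envelope check (the exact counts are approximations of the "nearly 83,000" and "nearly 178,000" figures reported in the release):

```python
# Approximate figures from the release: FY2009 VA outpatient cohort.
cohort = 5_300_000
glaucoma_cases = 83_000        # "nearly 83,000" with glaucoma
cocaine_diagnoses = 178_000    # "nearly 178,000" with cocaine abuse/dependency

glaucoma_pct = glaucoma_cases / cohort * 100
cocaine_pct = cocaine_diagnoses / cohort * 100

print(f"Glaucoma: {glaucoma_pct:.2f}% of cohort")           # about 1.5 percent
print(f"Cocaine diagnosis: {cocaine_pct:.2f}% of cohort")   # about 3.3 percent
```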
Although this study determined significant increased risk for glaucoma in those with a history of drug use, it does not prove a causal relationship. It is unlikely that glaucoma preceded the use of illegal drugs, since substance use typically begins in the teens or twenties.
“The Veterans Health Administration substance use disorder treatment program is the largest and most comprehensive program of its kind in the country,” said Dr. French. He believes that the reliability of the data used in the glaucoma study reflects the overall scope and high quality of the VHA substance use program.
The long-term effects of cocaine use on intraocular pressure, the only modifiable risk factor for glaucoma, requires further study. Should the association of cocaine use and glaucoma be confirmed in other studies, substance abuse would present another modifiable risk factor for this blinding disease.
Public release date: 29-Sep-2011
Glucosamine-like supplement suppresses multiple sclerosis attacks
UCI study shows promise of metabolic therapy for autoimmune diseases
— Irvine, Calif., September 30, 2011
A glucosamine-like dietary supplement suppresses the damaging autoimmune response seen in multiple sclerosis, according to a UC Irvine study.
UCI’s Dr. Michael Demetriou, Ani Grigorian and others found that oral N-acetylglucosamine (GlcNAc), which is similar to but more effective than the widely available glucosamine, inhibited the growth and function of abnormal T-cells that in MS incorrectly direct the immune system to attack and break down central nervous system tissue that insulates nerves.
Study results appear online in The Journal of Biological Chemistry.
Earlier this year, Demetriou and colleagues discovered that environmental and inherited risk factors associated with MS — previously poorly understood and not known to be connected — converge to affect how specific sugars are added to proteins regulating the disease.
“This sugar-based supplement corrects a genetic defect that induces cells to attack the body in MS,” said Demetriou, associate professor of neurology and microbiology & molecular genetics, “making metabolic therapy a rational approach that differs significantly from currently available treatments.”
Virtually all proteins on the surface of cells, including immune cells such as T-cells, are modified by complex sugar molecules of variable sizes and composition. Recent studies have linked changes in these sugars to T-cell hyperactivity and autoimmune disease.
In mouse models of MS-like autoimmune disease, Demetriou and his team found that GlcNAc given orally to those with leg weakness suppressed T-cell hyperactivity and autoimmune response by increasing sugar modifications to the T-cell proteins, thereby reversing the progression to paralysis.
The study comes on the heels of others showing the potential of GlcNAc in humans. One reported that eight of 12 children with treatment-resistant autoimmune inflammatory bowel disease improved significantly after two years of GlcNAc therapy. No serious adverse side effects were noted.
“Together, these findings identify metabolic therapy using dietary supplements such as GlcNAc as a possible treatment for autoimmune diseases,” said Demetriou, associate director of UCI’s Multiple Sclerosis Research Center. “Excitement about this strategy stems from the novel mechanism for affecting T-cell function and autoimmunity — the targeting of a molecular defect promoting disease — and its availability and simplicity.”
He cautioned that more human studies are required to assess the full potential of the approach. GlcNAc supplements are available over the counter and differ from commercially popular glucosamine. People who purchase GlcNAc should consult with their doctors before use.
Lindsey Araujo and Dylan Place of UCI and Nandita N. Naidu and Biswa Choudhury of UC San Diego also participated in the research, which was funded by the National Institutes of Health and the National Multiple Sclerosis Society.
Public release date: 3-Oct-2011
METABOLIC DISEASE: Antioxidants combat risk factor for type 2 diabetes in mice
The number of individuals with type 2 diabetes is reaching epidemic proportions. One of the main risk factors for developing type 2 diabetes is resistance of the cells in the body to the effects of the hormone insulin. Chu-Xia Deng and colleagues, at the National Institutes of Health, Bethesda, have now identified a new molecular pathway that helps mice remain sensitive to the effects of insulin. Disruption of this pathway in liver cells gradually led to whole-body insulin resistance in older mice that could largely be reversed by treatment with an antioxidant. Deng and colleagues therefore suggest that antioxidants might help protect older humans from progressive insulin resistance and thereby reduce their risk of developing type 2 diabetes.
TITLE: Hepatic Sirt1 deficiency in mice impairs mTorc2/Akt signaling and results in hyperglycemia, oxidative damage, and insulin resistance
Public release date: 3-Oct-2011
Mayo Clinic study: multiple surgeries and anesthesia exposure
ROCHESTER, Minn. — Every year millions of babies and toddlers receive general anesthesia for procedures ranging from hernia repair to ear surgery. Now, researchers at Mayo Clinic in Rochester have found a link between multiple surgeries requiring general anesthesia before age 2 and learning disabilities later in childhood.
The study, which will be published in the November 2011 issue of Pediatrics (published online Oct. 3), was conducted with existing data of 5,357 children from the Rochester Epidemiology Project and examined the medical and educational records of 1,050 children born between 1976 and 1982 in a single school district in Rochester.
“After removing factors related to existing health issues, we found that children exposed more than once to anesthesia and surgery prior to age 2 were approximately three times as likely to develop problems related to speech and language when compared to children who never underwent surgeries at that young age,” says David Warner, M.D., Mayo Clinic anesthesiologist and co-author of the study.
Among the 5,357 children in the cohort, 350 underwent surgeries with general anesthesia before their second birthday and were matched with 700 children who did not undergo a procedure with anesthesia. Of those exposed to anesthesia, 286 experienced only one surgery and 64 had more than one. Among those children who had multiple surgeries before age 2, 36.6 percent developed a learning disability later in life. Of those with just one surgery, 23.6 percent developed a learning disability, which compares to 21.2 percent of the children who developed learning disabilities but never had surgery or anesthesia before age 2. However, researchers saw no increase in behavior disorders among children with multiple surgeries.
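The cohort structure described above can be tabulated in a few lines; this is simply a sanity check that the reported subgroups add up (the 1:2 matching ratio is stated in the release):

```python
# Cohort breakdown as reported in the Mayo Clinic study.
exposed = 350            # surgery with general anesthesia before age 2
matched_controls = 700   # matched 1:2, no anesthesia before age 2
single_surgery = 286
multiple_surgeries = 64

assert single_surgery + multiple_surgeries == exposed
assert matched_controls == 2 * exposed

# Reported rates of later learning disability, by exposure group.
ld_rates = {
    "multiple surgeries": 36.6,
    "one surgery": 23.6,
    "no surgery/anesthesia": 21.2,
}
for group, pct in ld_rates.items():
    print(f"{group}: {pct}% developed a learning disability")
```

Note that the roughly threefold risk quoted by Dr. Warner refers specifically to speech and language problems after adjustment for health factors, not to the raw learning-disability rates listed here.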
“Our advice to parents considering surgery for a child under age 2 is to speak with your child’s physician,” says Randall Flick, M.D., Mayo Clinic pediatric anesthesiologist and lead author of the study. “In general, this study should not alter decision-making related to surgery in young children. We do not yet have sufficient information to prompt a change in practice and want to avoid problems that may occur as a result of delaying needed procedures. For example, delaying ear surgery for children with repeated ear infections might cause hearing problems that could create learning difficulties later in school.”
This study, funded by the U.S. Food and Drug Administration, examines the same population data used in a 2009 study by Mayo Clinic researchers, which reviewed records for children under age 4 and was published in the medical journal Anesthesiology.
The 2009 Mayo Clinic study was the first complete study in humans to suggest that exposure of children to anesthesia might affect development of the brain. Several previous studies suggested that anesthetic drugs might cause abnormalities in the brains of young animals. The study released today is significant because it examines children experiencing anesthesia and surgeries under age 2 and removes factors associated with existing health issues.
Public release date: 3-Oct-2011
BPA exposure in utero may increase predisposition to breast cancer
Study finds perinatal exposure to BPA has effect on mammary hormone response
Chevy Chase, MD—A recent study accepted for publication in Molecular Endocrinology, a journal of The Endocrine Society, found that perinatal exposure to environmentally relevant doses of bisphenol A (BPA) alters long-term hormone response and breast development in mice in ways that may increase the propensity to develop cancer.
BPA, a man-made chemical produced and marketed largely for specific industrial purposes, is detected in body fluids of more than 90 percent of the human population. It was originally synthesized as an estrogenic compound and there has been concern that exposure to BPA could have developmental effects on various hormone-responsive organs including the mammary gland.
“I want it to be clear that we do not provide evidence that BPA exposure causes breast cancer per se,” said Cathrin Brisken, MD, of the Swiss Institute for Experimental Cancer Research and co-author of the study. “We do provide evidence that BPA exposure alters mammary gland development and that this may increase the predisposition of the breast to breast cancer.”
In this study, researchers mimicked human exposure to BPA as it occurs with beverages and food from BPA containing vessels (such as plastics and the lining of tin cans) by adding the compound to the drinking water of breeding mice. Female pups born from BPA-consuming parents were transferred to a BPA-free environment at weaning and followed over time.
Researchers analyzed changes in the mammary glands of female offspring that were exposed to BPA through their mothers in utero and while being breast fed. The mammary glands of BPA-exposed females showed an increased response to the hormone progesterone. Lifetime exposure to progesterone has been linked to increased breast cancer risk.
Furthermore, researchers found that adult females who had been exposed to BPA in utero and while breast fed showed a 1.5-fold increase in cell numbers in their milk ducts. This is comparable to what is seen upon similar exposure to another estrogenic compound, diethylstilbestrol (DES). In utero exposure to DES in humans has been shown to double the relative risk of breast cancer as women reach their fifties.
“While we cannot extrapolate these results directly from mice to humans, the possibility that some of the increase in breast cancer incidence observed over the past decades may be attributed to exposure to BPA cannot be dismissed,” said Brisken. “Our study suggests that pregnant and breastfeeding mothers should avoid exposure to BPA as it may affect their daughters’ breast tissue.”
Public release date: 3-Oct-2011
Study in Lancet finds use of hormonal contraception doubles HIV risk
University of Washington researchers conducted trial in Africa
Women using hormonal contraception – such as a birth control pill or a shot like Depo-Provera – are at double the risk of acquiring HIV, and HIV-infected women who use hormonal contraception have twice the risk of transmitting the virus to their HIV-uninfected male partners, according to a University of Washington-led study in Africa of nearly 3,800 couples. The study was published in The Lancet Infectious Diseases.
The research, first presented in July in Rome at the meeting of the International AIDS Society, emphasizes the need for couples to use condoms in addition to other forms of contraception in order to prevent pregnancy and HIV, said lead study author Renee Heffron, an epidemiology doctoral student working with the International Clinical Research Center at UW.
“Women should be counseled about potentially increased risk of HIV acquisition and transmission with hormonal contraception, particularly injectable methods, and about the importance of dual protection with condoms to decrease HIV risk,” said Heffron.
Jared Baeten, an associate professor of global health with the International Clinical Research Center, said to his knowledge this is the first prospective study to show increased HIV risk to male partners of HIV-infected women using hormonal contraception.
More than 140 million women worldwide use hormonal contraception, including daily oral pills and long-acting injectables, like Depo-Provera.
“The benefits of effective hormonal contraception are unequivocal and must be balanced with the risk for HIV infection,” said Baeten.
This study was designed to establish whether hormonal contraception increases the risk of women acquiring HIV and transmitting the virus to their male partners. The study included 3,790 heterosexual HIV serodiscordant couples (i.e. one partner with HIV infection and the other without) who were participating in two long-term studies of HIV in couples in seven African countries (Botswana, Kenya, Rwanda, South Africa, Tanzania, Uganda, and Zimbabwe).
Findings showed that using hormonal contraceptives doubled an HIV-uninfected woman’s chances of becoming infected with HIV. The risk was increased for both injectable (mainly depot medroxyprogesterone acetate, or DMPA) and oral contraceptives, although the increase was not statistically significant for oral contraceptives.
Additionally, women who were HIV-positive at the beginning of the study and using hormonal contraception were twice as likely to transmit the virus to their male partner compared to women who did not use hormonal contraception.
Public release date: 3-Oct-2011
Pale people may need vitamin D supplements
Researchers at the University of Leeds, funded by Cancer Research UK, suggest that people with very pale skin may be unable to spend enough time in the sun to make the amount of vitamin D the body needs – while also avoiding sunburn.
The study, published in Cancer Causes and Control, suggested that melanoma patients may need vitamin D supplements as well.
But researchers also noted that sunlight and supplements are not the only factors that can determine the level of vitamin D in a person’s body.
Some inherited differences in the way people’s bodies process vitamin D into the active form also have a strong effect on people’s vitamin D levels.
The study defined the optimal amount of vitamin D required by the body as at least 60nmol/L. However, at present there is no universally agreed standard definition of an optimal level of vitamin D.
Professor Julia Newton-Bishop, lead author of the study based in the Cancer Research UK Centre at the University of Leeds, said: “Fair-skinned individuals who burn easily are not able to make enough vitamin D from sunlight and so may need to take vitamin D supplements.
“This should be considered for fair-skinned people living in a mild climate like the UK and melanoma patients in particular.”
Researchers took the vitamin D levels of around 1,200 people and found that around 730 had a sub-optimal level. Those with fair skin had significantly lower levels. Researchers chose 60nmol/L as the optimal vitamin D level in part because there is evidence that lower levels may be linked to greater risk of heart disease and poorer survival from breast cancer.
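The thresholds discussed in this article can be sketched as a simple classifier. This is an illustration of the cutoffs described here (the 25 nmol/L deficiency line is the one cited by the health-charity consensus, 60 nmol/L is the study's chosen optimum), not a clinical tool:

```python
# Cutoffs in nmol/L as discussed in this article; the classification is a
# simplified illustration, not clinical guidance.
DEFICIENT = 25   # below this, associated with poor bone health
OPTIMAL = 60     # the study's chosen "optimal" level

def classify(level):
    if level < DEFICIENT:
        return "deficient"
    if level < OPTIMAL:
        return "sub-optimal"
    return "optimal"

# About 730 of the ~1,200 participants fell short of the 60 nmol/L cutoff:
print(f"{730 / 1200:.0%} sub-optimal")  # prints "61% sub-optimal"
```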
A consensus among health charities, including Cancer Research UK, holds that levels below 25nmol/L constitute vitamin D deficiency, meaning they are associated with poor bone health. But some researchers consider that higher levels, around 60nmol/L, may be desirable for optimal health effects.
Sara Hiom, director of health information at Cancer Research UK, said: “We must be careful about raising the definition of deficiency or sufficiency to higher levels until we have more results from trials showing that maintaining such levels has clear health benefits and no health risks.
“If you are worried about your vitamin D levels, our advice is to go see your doctor.”
Public release date: 3-Oct-2011
Vitamin D deficiency common in cancer patients
Predicts advanced disease
Miami Beach, Fla., October 2, 2011 – More than three-quarters of cancer patients have insufficient levels of vitamin D (25-hydroxy-vitamin D) and the lowest levels are associated with more advanced cancer, according to a study presented on October 2, 2011, at the 53rd Annual Meeting of the American Society for Radiation Oncology (ASTRO).
“Until recently, studies have not investigated whether vitamin D has an impact on the prognosis or course of cancer. Researchers are just starting to examine how vitamin D may impact specific features of cancer, such as the stage or extent of tumor spread, prognosis, recurrence or relapse of disease, and even sub-types of cancer,” Thomas Churilla, lead author of the study and a medical student at the Commonwealth Medical College, Scranton, Pa., said.
Researchers sought to determine the vitamin D levels of patients at Northeast Radiation Oncology Center in Dunmore, Pa., a community oncology practice, and to see whether vitamin D levels were related to any specific aspects of cancer. The study involved 160 patients with a median age of 64 years and a 1:1 ratio of men to women. The five most common primary diagnoses were breast, prostate, lung, thyroid and colorectal cancer. A total of 77 percent of patients had vitamin D concentrations that were either deficient (less than 20 ng/mL) or sub-optimal (20-30 ng/mL). The median serum vitamin D level was 23.5 ng/mL. Regardless of the age or sex of the patient, vitamin D levels below the median predicted more advanced-stage disease in the patient group.
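Note that this study reports serum vitamin D in ng/mL, while the Leeds study earlier in this issue uses nmol/L. The two scales relate by a factor of roughly 2.5, a standard conversion for 25-hydroxyvitamin D that is not stated in either release:

```python
# Standard unit conversion for serum 25-hydroxyvitamin D:
# 1 ng/mL is approximately 2.5 nmol/L.
NMOL_PER_NG = 2.5

def ng_to_nmol(ng_per_ml):
    return ng_per_ml * NMOL_PER_NG

# The median of 23.5 ng/mL is roughly 59 nmol/L, just under the
# 60 nmol/L "optimal" cutoff used in the Leeds study.
print(ng_to_nmol(23.5))  # 58.75
```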
Patients who were found to be vitamin D deficient were administered replacement therapy, increasing serum D levels by an average of 14.9 ng/mL. Investigators will be analyzing if vitamin D supplementation had an impact on aspects of treatment or survival in the long-term.
“The benefits of vitamin D outside of improving bone health are controversial, yet there are various levels of evidence to support that vitamin D has a role in either the prevention or the prediction of outcome of cancer,” Churilla said. “Further study is needed to continue to understand the relationship between vitamin D and cancer.”
Public release date: 3-Oct-2011
New research shows $6.7 billion spent on unnecessary tests and treatments in one year
Researchers at Mount Sinai School of Medicine have found that $6.7 billion was spent in one year performing unnecessary tests or prescribing unnecessary medications in primary care, with 86 percent of that cost attributed to the prescription of brand-name statins to treat high cholesterol. The findings are published in a research letter in the October 1 Online First issue of Archives of Internal Medicine, one of the JAMA/Archives journals.
Led by Minal Kale, MD, a postdoctoral fellow in the Division of Internal Medicine in the Department of Medicine at Mount Sinai School of Medicine, the research team reviewed findings from a study published in the May 2011 issue of Archives of Internal Medicine, which identified the top five most overused clinical activities in each of three primary care specialties: pediatrics, internal medicine, and family medicine. With this information, they performed a cross-sectional analysis of separate data pulled from the National Ambulatory Medical Care Survey and the National Hospital Ambulatory Medical Care Survey. They identified more than $6.7 billion in excess healthcare spending in the primary care setting in 2009. Eighty-six percent of the unnecessary spending, more than $5.8 billion, resulted from the prescribing of brand-name statins rather than generic versions.
“Our analysis shows astronomical costs associated with prescribing of brand name statins when effective, generic alternatives were available. Efforts to encourage prescribing of generics clearly have not gone far enough,” said Dr. Kale. “Additionally, millions are spent on unnecessary blood work, scans, and antibiotic prescriptions. Significant efforts to reduce this spending are required in order to stem these exorbitant activities.”
The remaining costs were attributable to the following:
•During physical exams, more than half of complete blood work ordered was not needed, resulting in more than $32 million in excess costs.
•Unnecessary bone density scans in younger women accounted for more than $527 million.
•CT scans, MRIs, or X-rays in people presenting with back pain accounted for $175 million in excess healthcare costs.
•Over-prescription of antibiotics for sore throat in children, excluding cases of strep throat or fever, accounted for $116 million in unnecessary costs.
•Other excess costs included needless annual echocardiograms, urine testing, pap tests, and pediatric cough medicine prescriptions.
“We found considerable variability in the frequency of inappropriate care, however our data show that even activities with small individual costs can contribute substantially to overall healthcare costs,” said Dr. Kale. “In light of the current healthcare reform debate, we need more research examining how overuse contributes to healthcare spending. Research might focus on the potential role of reimbursement, defensive medicine practices, or lack of adherence to guidelines.”
The authors note that the analysis is limited to the data provided by the surveys and that they were conservative in their assessments. They conclude that this type of analysis should be extended to medical specialties outside of primary care and that physicians should make efforts in their own practices to evaluate costs and reduce them where necessary in order to achieve affordable, high-quality care.
Public release date: 4-Oct-2011
Natural compound helps reverse diabetes in mice
Researchers at Washington University School of Medicine in St. Louis have restored normal blood sugar metabolism in diabetic mice using a compound the body makes naturally. The finding suggests that it may one day be possible for people to take the compound much like a daily vitamin as a way to treat or even prevent type 2 diabetes.
This naturally occurring compound is called nicotinamide mononucleotide, or NMN, and it plays a vital role in how cells use energy.
“After giving NMN, glucose tolerance goes completely back to normal in female diabetic mice,” says Shin-ichiro Imai, MD, PhD, associate professor of developmental biology. “In males, we see a milder effect compared to females, but we still see an effect. These are really remarkable results. NMN improves diabetic symptoms, at least in mice.”
The research appears online Oct. 4 in Cell Metabolism.
Imai says this discovery holds promise for people because the mechanisms that NMN influences are largely the same in mice and humans.
“But whether this mechanism is equally compromised in human patients with type 2 diabetes is something we have to check,” Imai says. “We have plans to do this in the very near future.”
All cells in the body make NMN in a chain of reactions leading to production of NAD, a vital molecule that harvests energy from nutrients and puts it into a form cells can use. Among other things, NAD activates a protein called SIRT1 that has been shown to promote healthy metabolism throughout the body, from the pancreas to the liver to muscle and fat tissue.
According to the study, aging and eating a high-fat diet reduce production of NMN, slowing the body’s production of NAD and leading to abnormal metabolic conditions such as diabetes. NAD cannot be given to the mice directly because of toxic effects. But after administering NMN, levels of NAD rise and the diabetic mice show dramatically improved responses to glucose. In some cases, they return to normal.
“I’m very excited to see these results because the effect of NMN is much bigger than other known compounds or chemicals,” says first author Jun Yoshino, MD, PhD, postdoctoral research associate. “Plus, the fact that the body naturally makes NMN is promising for translating these findings into humans.”
Imai and his colleagues found that young, healthy mice on a high-fat diet developed diabetes in six months or less. In these mice, they found that NAD levels were reduced. But after administering NMN, levels of NAD increased and the female mice had normal results in glucose tolerance tests — a measure of how well the body moves glucose from the blood to the organs and tissues for use. Glucose tolerance was also improved after male diabetic mice received NMN but did not quite return to normal. The researchers are interested in learning more about these differences between male and female mice.
“We don’t have a clear answer, but we are speculating that sex hormones, such as estrogen, may be important downstream for NAD synthesis,” Yoshino says.
In older mice, they observed that about 15 percent of healthy males fed a normal diet developed diabetes.
“When we injected these older diabetic mice with NMN, they had improved glucose tolerance, even after one injection,” says Kathryn F. Mills, research lab supervisor and an equally contributing first author of the study. “We also injected older healthy mice and found that they weren’t adversely affected. It’s good to know that even if the mice are not diabetic, giving NMN is not going to hurt them.”
Imai says few studies have examined normal mice that naturally develop diabetes as a simple result of aging because the experiments take so long. In an interesting twist, few elderly female mice developed diabetes at all. But after switching to a high-fat diet, older female mice quickly developed severe diabetes.
“Again, when we injected these females with NMN, we came up with a completely normal glucose tolerance curve,” Mills says. “We can also see that the NMN has completely reversed and normalized the levels of cholesterol, triglycerides and free fatty acids.”
Though the mice received NMN by injection in this study, Imai’s group is now conducting a long-term study of diabetic mice that get NMN dissolved in their drinking water. Imai calls this work a first step toward a possible “nutriceutical” that people could take almost like a vitamin to treat or even prevent type 2 diabetes.
“Once we can get a grade of NMN that humans can take, we would really like to launch a pilot human study,” Imai says.
Public release date: 4-Oct-2011
Green tea helps mice keep off extra pounds
Green tea may slow down weight gain and serve as another tool in the fight against obesity, according to Penn State food scientists.
Obese mice that were fed a compound found in green tea along with a high-fat diet gained weight significantly more slowly than a control group of mice that did not receive the green tea supplement, said Joshua Lambert, assistant professor of food science in agricultural sciences.
“In this experiment, we see the rate of body weight gain slows down,” said Lambert.
The researchers, who released their findings in the current online version of Obesity, fed two groups of mice a high-fat diet. Mice fed epigallocatechin-3-gallate (EGCG), a compound found in most green teas, along with the high-fat diet gained weight 45 percent more slowly than the control group of mice eating the same diet without EGCG.
“Our results suggest that if you supplement with EGCG or green tea you gain weight more slowly,” said Lambert.
In addition to lower weight gain, the mice fed the green tea supplement showed a nearly 30 percent increase in fecal lipids, suggesting that the EGCG was limiting fat absorption, according to Lambert.
“There seems to be two prongs to this,” said Lambert. “First, EGCG reduces the ability to absorb fat and, second, it enhances the ability to use fat.”
The green tea did not appear to suppress appetite. Both groups of mice were fed the same amount of high-fat food and could eat at any time.
“There’s no difference in the amount of food the mice are eating,” said Lambert. “The mice are essentially eating a milkshake, except one group is eating a milkshake with green tea.”
A person would need to drink ten cups of green tea each day to match the amount of EGCG used in the study, according to Lambert. However, he said recent studies indicate that just drinking a few cups of green tea may help control weight.
“Human data — and there’s not a lot at this point — shows that tea drinkers who only consume one or more cups a day will see effects on body weight compared to nonconsumers,” said Lambert.
Lambert, who worked with Kimberly Grove and Sudathip Sae-tan, both graduate students in food science, and Mary Kennett, professor of veterinary and biomedical sciences, said that other experiments have shown that lean mice did not gain as much weight when green tea was added to a high-fat diet. However, he said that studying mice that are already overweight is more relevant to humans, because people often consider dietary changes only when they notice problems associated with obesity.
“Most people hit middle age and notice a paunch; then you decide to eat less, exercise and add green tea supplement,” said Lambert.
Public release date: 5-Oct-2011
Zinc’s role in the brain
Research gives insight into 50-year-old mystery — zinc important for learning and memory
Zinc plays a critical role in regulating how neurons communicate with one another, and could affect how memories form and how we learn. The new research, in the current issue of Neuron, was authored by Xiao-an Zhang, now a chemistry professor at the University of Toronto Scarborough (UTSC), and colleagues at MIT and Duke University.
Researchers have been trying to pin down the role of zinc in the brain for more than fifty years, ever since scientists found high concentrations of the chemical in synaptic vesicles, a portion of the neuron that stores neurotransmitters. But it was hard to determine just what zinc’s function was.
In the new work, the researchers designed a chemical called ZX1 that would bind with zinc rapidly after it was released from the vesicles but before it could complete its journey across the synapse. Using the chemical, they were able to observe how neurons behaved when deprived of zinc.
“As a chemist, I’m proud that I can make a contribution to neuroscience,” says Zhang, who helped design the chemical while conducting postdoctoral research in Stephen J. Lippard’s lab at MIT. He was joint first author of the paper, along with Enhui Pan from James O. McNamara’s group at Duke University.
The researchers studied neurons in a brain region called the hippocampus, which is associated with learning and memory formation. They found that removing zinc interfered with a process called long-term potentiation. Long-term potentiation strengthens the connection between two neurons, and seems to be important for memory and learning.
Zhang is currently working on developing new contrast agents that could be used in medical imaging.
Public release date: 5-Oct-2011
New study shows inflammatory food toxins found in high levels in infants
Research also indicates reduction in intake of food toxins improves diabetes in adults
Researchers from Mount Sinai School of Medicine have found high levels of food toxins called Advanced Glycation End products (AGEs) in infants. Excessive food AGEs, through both maternal blood transmission and baby formula, could together significantly increase children’s risk for diseases such as diabetes from a very young age. A second study of AGEs in adults found that cutting back on processed, grilled, and fried foods, which are high in AGEs, may improve insulin resistance in people with diabetes. AGEs — toxic glucose byproducts previously tied to high blood sugar — are found in most heated foods and, in great excess, in commercial infant formulas.
The first report, published in Diabetes Care in December 2010, showed that AGEs can be elevated as early as birth, indicating that infants are highly susceptible to the inflammation associated with insulin resistance and diabetes later in life. Helen Vlassara, MD, Professor and Director of the Division of Experimental Diabetes and Aging, working with Jaime Uribarri, MD, Professor of Medicine, and colleagues at Mount Sinai School of Medicine, looked at 60 women and their infants to see if there was a passive transfer of AGEs from the blood of mothers to their babies. They found that newborn infants, expected to be practically AGE-free, had levels of AGEs in their blood as high as those of their adult mothers.
Within the first year of life, after switching from breast milk onto commercial formulas, the infants’ AGEs had doubled to levels seen in people with diabetes, and many had elevated insulin levels. Formulas that are processed under high heat can contain 100 times more AGEs than human breast milk, delivering a huge AGE surplus to infants, which could be toxic.
“Modern food AGEs can overwhelm the body’s defenses, a worrisome fact especially for young children,” said Dr. Vlassara. “More research is certainly needed, but the findings confirm our studies in genetic animal models of diabetes. Given the rise in the incidence of diabetes in children, safe and low cost AGE-less approaches to children’s diet should be considered by clinicians and families.”
The work led to a second report in Diabetes Care, in July 2011, which demonstrates that a modest cut in foods high in AGEs may improve insulin resistance in adults with diabetes. AGEs were found to be elevated in most grilled, fried, or baked foods. Cutting back on the consumption of heat-processed foods, without reducing fat or carbohydrate consumption, improved insulin levels and overall health in patients who remained insulin resistant despite ongoing treatment. The findings are a dramatic departure from standard clinical recommendations for the management of diabetes.
For four months, 18 overweight people with type 2 diabetes and 18 healthy adults were assigned to an AGE-restricted diet or a standard diet consisting of the same calories and nutrients they ingested before beginning the AGE-restricted diet. An AGE-restricted diet emphasizes poached or stewed foods, such as mashed potatoes instead of fries, stewed chicken instead of grilled chicken, and boiled eggs instead of fried eggs.
The results showed that the subjects with diabetes assigned to the AGE-restricted diet had a 35 percent decrease in blood insulin levels, well beyond that achieved by their previous therapeutic regimen. This was associated with improved markers of inflammation and a restoration of compromised native defenses. This is the first study to show in humans that AGEs promote insulin resistance and possibly diabetes. The study also shows for the first time that restricting the amount of AGEs consumed with food may quickly restore the body’s defenses and reduce insulin resistance.
“This clinical study begins to expose the double role food AGEs play in obesity and in diabetes, a major concern for everyone today, particularly young children. It is especially exciting that a simple intervention such as AGE-restriction or future drugs that block AGE absorption could have a positive effect on these epidemics,” said Dr. Vlassara. “The tenets of the diet could not be simpler; turn down the heat, add water, and eat more at home.”