
Health Technology Research Synopsis

135th Issue Date 10 AUG 2012

Compiled By Ralph Turchiano

http://www.healthresearchreport.me www.vit.bz www.youtube.com/vhfilm

http://www.facebook.com/vitaminandherbstore www.engineeringevil.com

Editor's Top Five:

Dangerous experiment in fetal engineering
Creatine aids women in outmuscling major depression
GW Researcher finds depressive symptoms and suicidal thoughts in former finasteride users
Internal Medicine Physicians Recommend Principles on Role of Governments and Legislation in Regulating Patient-Physician Relationship
Artificial butter flavoring ingredient linked to key Alzheimer’s disease process
In this Issue:

1. New coating evicts biofilms for good
2. Vaccine research shows vigilance needed against evolution of more-virulent malaria
3. Childhood obesity may affect puberty, create problems with reproduction
4. Study finds people have difficulty controlling multiple chronic conditions
5. Weight-loss clinic drop-out rates are a huge barrier to treating obesity
6. A cup of joe may help some Parkinson’s disease symptoms
7. New study finds strong evidence of humans surviving rabies bites without treatment
8. Substance involved in Alzheimer’s can reverse paralysis in mice with multiple sclerosis
9. Sleep affects potency of vaccines
10. Artificial butter flavoring ingredient linked to key Alzheimer’s disease process
11. Global health researchers urge integrating de-worming into HIV care in Africa
12. Plant-based compound slows breast cancer in a mouse model
13. Breast cancer charity under fire for overstating the benefits of screening
14. Note to waitresses: Wearing red can be profitable
15. Dangerous experiment in fetal engineering
16. Strawberry extract protects against UVA rays
17. People With Allergies May Have Lower Risk of Brain Tumors
18. Students with strong hearts and lungs may make better grades, study finds
19. Out of Europe
20. Long-term use of blood pressure meds promoting sun sensitivity may raise lip cancer risk
21. Off-label drug use common, but patients may not know they’re taking them, Mayo finds
22. Creatine aids women in outmuscling major depression
23. Corticosteroids not effective for treating acute sinusitis
24. GW Researcher finds depressive symptoms and suicidal thoughts in former finasteride users
25. Why do infants get sick so often?
26. COI declarations and off-label drug use
27. Thinner diabetics face higher death rate
28. Doctors often don’t disclose all possible risks to patients before treatment
29. Stress makes men appreciate heavier women
30. New Kenyan fossils shed light on early human evolution
31. Tai Chi shown to improve COPD exercise capacity
32. Crossing 5+ time zones more than doubles illness risk for elite athletes
33. Gum disease 4 times as common in rheumatoid arthritis patients
34. CU-Boulder-led team discovers new atmospheric compound tied to climate change, human health
35. Leveraging bacteria in drinking water to benefit consumers
36. Boys appear to be more vulnerable than girls to the insecticide chlorpyrifos
37. Internal Medicine Physicians Recommend Principles on Role of Governments and Legislation in Regulating Patient-Physician Relationship
38. Chronic exposure to staph bacteria may be risk factor for lupus, Mayo study finds
39. Iron, vitamins could affect physical fitness in adolescents
40. Eating grapes may help protect heart health in men with metabolic syndrome, new study suggests

New coating evicts biofilms for good
Slippery technology shown to prevent more than 99 percent of harmful bacterial slime from forming on surfaces

Cambridge, Mass. – July 30, 2012 – Biofilms may no longer have any solid ground upon which to stand.

A team of Harvard scientists has developed a slick way to prevent the troublesome bacterial communities from ever forming on a surface. Biofilms stick to just about everything, from copper pipes to steel ship hulls to glass catheters. The slimy coatings are more than just a nuisance, resulting in decreased energy efficiency, contamination of water and food supplies, and—especially in medical settings—persistent infections. Even cavities in teeth are the unwelcome result of bacterial colonies.

In a study published in the Proceedings of the National Academy of Sciences (PNAS), lead coauthors Joanna Aizenberg, Alexander Epstein, and Tak-Sing Wong coated solid surfaces with an immobilized liquid film to trick the bacteria into thinking they had nowhere to attach and grow.

“People have tried all sorts of things to deter biofilm build-up—textured surfaces, chemical coatings, and antibiotics, for example,” says Aizenberg, Amy Smith Berylson Professor of Materials Science at the Harvard School of Engineering and Applied Sciences (SEAS) and a Core Faculty Member at the Wyss Institute for Biologically Inspired Engineering at Harvard. “In all those cases, the solutions are short-lived at best. The surface treatments wear off, become covered with dirt, or the bacteria even deposit their own coatings on top of the coating intended to prevent them. In the end, bacteria manage to settle and grow on just about any solid surface we can come up with.”

Taking a completely different approach, the researchers used their recently developed technology, dubbed SLIPS (Slippery Liquid-Infused Porous Surfaces), to effectively create a hybrid surface that is smooth and slippery due to the liquid layer immobilized on it.

First described in the September 22, 2011, issue of the journal Nature, the super-slippery surfaces have been shown to repel both water- and oil-based liquids and even prevent ice or frost from forming.

“By creating a liquid-infused structured surface, we deprive bacteria of the static interface they need to get a grip and grow together into biofilms,” says Epstein, a recent Ph.D. graduate who worked in Aizenberg’s lab at the time of the study.

“In essence, we turned a once bacteria-friendly solid surface into a liquid one. As a result, biofilms cannot cling to the material, and even if they do form, they easily ‘slip’ off under mild flow conditions,” adds Wong, a researcher at SEAS and a Croucher Foundation Postdoctoral Fellow at the Wyss Institute.

Aizenberg and her collaborators reported that SLIPS reduced the formation of three of the most notorious disease-causing biofilms—Pseudomonas aeruginosa, Escherichia coli, and Staphylococcus aureus—by 96% over a 7-day period.

The technology works in both a static environment and under flow, or natural conditions, making it ideally suited for coating implanted medical devices that interact with bodily fluids. The coated surfaces can also combat bacterial growth in environments with extreme pH levels, intense ultraviolet light, and high salinity.

SLIPS is also nontoxic, readily scalable, and—most importantly—self-cleaning, needing nothing more than gravity or a gentle flow of liquid to stay unsoiled. As previously demonstrated with a wide variety of liquids and solids, including blood, oil, and ice, everything seems to slip off surfaces treated with the technology.

To date, this may be the first successful test of a nontoxic synthetic surface that can almost completely prevent the formation of biofilms over an extended period of time. The approach may find application in medical, industrial, and consumer products and settings.

In future studies, the researchers aim to better understand the mechanisms involved in preventing biofilms. In particular, they are interested in whether any bacteria transiently attach to the interface and then slip off, if they just float above the surface, or if any individuals can remain loosely attached.

“Biofilms have been amazing at outsmarting us. And even when we can attack them, we often make the situation worse with toxins or chemicals. With some very cool, nature-inspired design tricks, we are excited about the possibility that biofilms may have finally met their match,” concludes Aizenberg.

Vaccine research shows vigilance needed against evolution of more-virulent malaria
Malaria parasites evolving in vaccinated laboratory mice become more virulent, according to research at Penn State University. The mice were injected with a critical component of several candidate human malaria vaccines that now are being evaluated in clinical trials. “Our research shows immunization with this particular type of malaria vaccine can create ecological conditions that favor the evolution of parasites that cause more severe disease in unvaccinated mice,” said Andrew Read, Alumni Professor of Biological Sciences at Penn State.

“We are a long way from being able to assess the likelihood of this process occurring in humans, but our research suggests the need for vigilance. It is possible that more-virulent strains of malaria might evolve if a malaria vaccine goes into widespread use,” Read said. The research, which will be published in the 31 July 2012 issue of the scientific journal PLoS Biology, showed that more-virulent malaria parasites evolved in response to vaccination, but the exact mechanism is still a mystery. It was not due to changes in the part of the parasite targeted by the vaccine.

No malaria vaccine ever has been approved for widespread use. “Effective malaria vaccines are notoriously difficult to develop because the malaria parasite is very complex. Hundreds of different malaria strains exist simultaneously within any local region where the disease is prevalent,” Read said. Most vaccine developers use only small sections of the malaria parasite to produce an antigen molecule that then becomes a key ingredient in a highly purified malaria vaccine. Read’s lab tested the antigen AMA-1, a component of several such vaccines now in various stages of clinical trials.

“Our laboratory experiments followed clues from theoretical studies and earlier experiments that suggested that some malaria vaccines could favor the evolution of more-virulent malaria parasites,” Read said. If candidate vaccines do not completely eliminate all the malaria parasites, the parasites that remain have opportunities to evolve. A mosquito then could transfer the evolved parasite from the vaccinated person into a new host — a process called leaking. “Leaky vaccines create a situation that further fosters parasite evolution,” Read said.

The Penn State study found that parasites causing worse malaria symptoms in unvaccinated mice evolved after “leaking” consecutively through as few as 10 vaccinated mice. “The parasites that are able to survive in the immunized hosts must be stronger after having survived exposure to the vaccine,” Read said. “The vaccine-induced immunity apparently removed the less virulent malaria parasites, but left the more virulent ones.”

The AMA-1 antigen used in the Penn State study triggers the body to make anti-malaria antibodies. These antibodies recognize the AMA-1 antigen on the parasites and disable the malaria infection. The shape of the antigen ensures that the antibodies can bind securely with the malaria parasite — like pieces in a jigsaw puzzle — an important step in producing immunity. Scientists already knew that vaccines become obsolete when evolutionary mutations change the parasite’s antigen structure in such a way that the antibody is not able to lock onto the targeted part of the parasite. But the Penn State study showed the malaria parasite evolved within the vaccinated mice even without any detectable changes in the antibody target on the parasite.

“We were surprised to find that more-virulent strains of malaria evolved even while the gene encoding the key antigen remained unchanged,” said Victoria Barclay, the postdoctoral scholar in Read’s lab who conducted the laboratory experiments and who is the corresponding author of the PLoS Biology paper. “We did not detect any changes in the gene sequence.” The researchers conclude that evolution must have taken place somewhere in the parasite’s genome. Read’s lab now is hunting for the exact locations on the parasite’s DNA where the mutations occurred.

“Generalizing from animal models is notoriously difficult in malaria,” Read said, so the scientists do not yet know if this newly recognized type of evolution could happen in human malaria or with other rapidly evolving diseases, such as the viruses that cause AIDS or cervical cancer. “What we do know is that in Victoria Barclay’s experiments in our lab at Penn State, with our parasites, our mice, and with this particular antigen, the malaria parasites that evolved through vaccinated hosts become more virulent,” he said.

“Vaccines are one of the most fantastically cost-effective health gains we’ve ever had, so there is no question that we should proceed on all fronts to develop a safe and effective vaccine against malaria,” Read said. “At the same time, our research is revealing new reasons to proceed with vigilant caution.” Read suggests that vaccine researchers conducting clinical trials should not only be carefully monitoring for parasite evolution at the vaccine target, but they also should watch for mutations throughout the parasite’s entire genome. “This sort of monitoring also should go on once a new vaccine goes into widespread use,” he said. “It appears that in a world with leaky vaccines, virulent pathogen strains can evolve. Different vaccines or other transmission-blocking measures might be needed to stop the spread of any evolved parasites,” Read said.


Childhood obesity may affect puberty, create problems with reproduction
7-31-12

CORVALLIS, Ore. – A dramatic increase in childhood obesity in recent decades may have impacts that go beyond the usual health concerns – it could be disrupting the timing of puberty and ultimately lead to a diminished ability to reproduce, especially in females.

A body of research suggests that obesity could be related to growing problems with infertility, scientists said in a recent review, in addition to a host of other physical and psycho-social concerns. The analysis was published in Frontiers in Endocrinology.

Human bodies may be scrambling to adjust to a problem that is fairly new. For most of human evolution, poor nutrition or outright starvation was a greater concern than an overabundance of food.

“The issue of so many humans being obese is very recent in evolutionary terms, and since nutritional status is important to reproduction, metabolic syndromes caused by obesity may profoundly affect reproductive capacity,” said Patrick Chappell, an assistant professor of veterinary medicine at Oregon State University and an author of the recent report.

“Either extreme of the spectrum, anorexia or obesity, can be associated with reproduction problems,” he said.

Researchers are still learning more about the overall impact of obesity on the beginning of puberty and effects on the liver, pancreas and other endocrine glands, Chappell said. While humans show natural variations in pubertal progression, the signals that control this timing are unclear.

In general, however, puberty appears to be starting earlier in girls; it is being accelerated.

This may have several effects, scientists have found. One theory involves kisspeptin, a recently characterized neurohormone necessary for reproduction. Normal secretion of this hormone may be disrupted by endocrine signals that fat tissue sends to the brain.

Another possible effect on pubertal timing, and on reproduction in general, is disruption of circadian clocks, which reflect the natural rhythms of night and day. Disrupted sleep-wake cycles can affect the secretion of hormones such as cortisol, testosterone, and insulin, researchers have found.

“Any disruption of circadian clocks throughout the body can cause a number of problems, and major changes in diet and metabolism can affect these cellular clocks,” Chappell said. “Disruption of the clock through diet can even feed into a further disruption of normal metabolism, making the damage worse, as well as affecting sleep and reproduction.”

Molecular mechanisms have only started to be uncovered in the past decade, the report said, and the triggers that control pubertal development are still widely debated. For millennia, many mammals made adjustments to reduce fertility during periods of famine. But it now appears that an excess of fat can also be contributing to infertility rates and reproductive diseases.

Some studies in humans have found correlations between early puberty and the risk of reproductive cancers, adult-onset diabetes, and metabolic syndrome. Early onset puberty has also been associated with increased rates of depression and anxiety in girls, studies have found, as well as increased delinquent behavior, smoking and early sexual experiences in both girls and boys.

Other research has suggested that such problems can persist into adulthood, along with lower quality of life, higher rates of eating disorders, lower academic achievement and higher rates of substance abuse.

Additional research is needed to better understand the effect of these processes on metabolism, hormones and other developmental processes, the review concluded.

Study finds people have difficulty controlling multiple chronic conditions
DENVER, July 31, 2012 – Most people who have diabetes, high blood pressure, and high cholesterol have difficulty managing all three conditions; indeed, success is fleeting even for those who do manage all three, according to a Kaiser Permanente Institute for Health Research study that appears online in the American Heart Association journal Circulation: Cardiovascular Quality and Outcomes.

The study of close to 29,000 individuals enrolled at Kaiser Permanente Colorado and Denver Health found that only 30.3 percent of individuals at Kaiser Permanente and 16.2 percent at Denver Health were able to simultaneously control their diabetes, hypertension, and hyperlipidemia (high cholesterol), as measured by risk factor control guidelines defined by the American Diabetes Association. But among those individuals who achieved simultaneous control of their conditions, few were able to maintain it at either institution.

Specifically, the study found that among those individuals with at least 90 days of follow-up after achieving simultaneous control, 39 percent from Kaiser Permanente and 23 percent from Denver Health subsequently lost and then regained control, while 56 percent and 64 percent, respectively, lost control and never regained it. Only 5 percent at Kaiser Permanente and 13 percent at Denver Health never lost control of their chronic conditions. According to the researchers, understanding the strategies these individuals use to care for their diabetes may provide insight into improving self-care and health outcomes.

The researchers defined risk factor control using the 2002 guidelines from the American Diabetes Association, which were in place for the majority of the study period.

“Diabetes and other chronic conditions associated with it are very difficult but not impossible diseases to manage, for reasons we’re just beginning to understand,” said Emily B. Schroeder, MD, PhD, a research scientist with the Kaiser Permanente Institute for Health Research and the lead author of the study. “This research tells us to look more closely at how certain patients are able to control their multiple conditions, so clinicians can help patients recover and avoid greater risks of developing other ailments, including cardiovascular and kidney disease.”

Multiple chronic conditions pose a significant and increasing burden in the United States according to the U.S. Department of Health and Human Services, with more than 75 million people in America reporting two or more chronic conditions. These individuals must maintain good nutrition and physical activity, manage complex medication regimens, and monitor themselves for achievement of treatment goals.

“This study highlights the fact that having several chronic health problems is common. We can’t treat these conditions in isolation,” said Edward P. Havranek, MD, a physician with Denver Health and co-author on the study. “We need to learn to help patients find ways to manage difficult combinations of conditions through strong relationships with primary care providers, simplified medicine regimens, and training in good diet and exercise.”

This study marks the latest effort by Kaiser Permanente to better understand the impacts of diabetes. Earlier this year, a Kaiser Permanente study published in the Journal of the American Board of Family Medicine found that when patients with diabetes experience interruptions in health insurance coverage, they are less likely to receive the screening tests and vaccines they need to protect their health. The study found this was true even when patients received free or reduced-cost medical care at federally funded safety net clinics.

Weight-loss clinic drop-out rates are a huge barrier to treating obesity
More than 1.7 billion people worldwide may be classified as overweight and need appropriate medical or surgical treatment with the goal of sustainable weight loss. But for weight management programs to be effective, patients must complete them, states a study published in the Canadian Journal of Surgery (CJS) that analyzed drop-out rates and predictors of attrition within a publicly-funded adult weight management program.

Researchers from the Department of Surgery at the University of Alberta and the Centre for the Advancement of Minimally Invasive Surgery at the Royal Alexandra Hospital in Edmonton, Alberta, found that over a six-year period almost half (43%) of the patients of a weight-management clinic funded by Alberta Health Services dropped out of the program before achieving sustainable weight loss.

The program involves 6 months of primary care, including education on strategies for treating obesity, nutritional counselling, smoking cessation, physical activity and mental health assessment to identify untreated conditions, such as depression, that may be barriers to effective weight management. Some participants also undergo bariatric surgery.

In a group of patients who are motivated enough to participate in a program like this, a 43% drop-out rate is surprising. “Identifying the factors that predict attrition may serve as a basis for program improvement and further research,” the authors state.

Among the patients included in the study, the drop-out rate was 54% in the group treated by medical management only and 12% in the group treated surgically. These drop-out rates are similar to those reported in other studies. “We speculate that patients willing to undergo the initial bariatric surgical procedure may be more committed to complete the program,” the authors explain. They suggest that the substantial early weight loss associated with bariatric surgery may serve as additional motivation to continue in the program.

Younger patients and women were also more likely to drop out of the program.

“Further research is needed to clarify why surgical patients have lower attrition rates and how these factors can be applied to proactively decrease the drop-out rates and increase success,” the authors state.

A cup of joe may help some Parkinson’s disease symptoms
MINNEAPOLIS – While drinking caffeine each day does not appear to help improve sleepiness among people with Parkinson’s disease, it may have a benefit in controlling movement, according to new research published in the August 1, 2012, online issue of Neurology®, the medical journal of the American Academy of Neurology.

“Studies have shown that people who use caffeine are less likely to develop Parkinson’s disease, but this is one of the first studies in humans to show that caffeine can help with movement symptoms for people who already have the disease,” said study author Ronald Postuma, MD, MSc, with McGill University in Montreal and the Research Institute of the McGill University Health Center. Postuma is also a member of the American Academy of Neurology.

For the study, 61 people with Parkinson’s disease who showed symptoms of daytime sleepiness and some motor symptoms were given either a placebo pill or a pill with 100 milligrams of caffeine twice a day for three weeks, then 200 milligrams twice a day for another three weeks, doses equivalent to between two and four cups of coffee per day.

After six weeks, the participants who took the caffeine pills averaged a five-point improvement in Parkinson’s severity ratings compared to those who didn’t consume caffeine. “This is a modest improvement, but may be enough to provide benefit to patients. On the other hand, it may not be sufficient to explain the relationship between caffeine non-use and Parkinson’s, since studies of the progression of Parkinson’s symptoms early in the disease suggest that a five-point reduction would delay diagnosis by only six months,” said Postuma.

The caffeine group also averaged a three-point improvement in the speed of movement and amount of stiffness compared to the placebo group. Caffeine did not appear to help improve daytime sleepiness and there were no changes in quality of life, depression or sleep quality in study participants.

“The study is especially interesting since caffeine seems to block a malfunctioning brain signal in Parkinson’s disease and is so safe and inexpensive,” said Michael Schwarzschild, MD, PhD, of Massachusetts General Hospital in Boston, who wrote an accompanying editorial. “Although the results do not suggest that caffeine should be used as a treatment in Parkinson’s disease, they can be taken into consideration when people with Parkinson’s are discussing their caffeine use with their neurologist.” Schwarzschild is also a member of the American Academy of Neurology.

The study authors noted that the length of the study was short and that the effects of caffeine may lessen over time.

New study finds strong evidence of humans surviving rabies bites without treatment
First indication of people naturally protected against rabies found in remote Amazonian communities regularly exposed to vampire bats

Deerfield, IL (August 1, 2012) – Challenging conventional wisdom that rabies infections are 100 percent fatal unless immediately treated, scientists studying remote populations in the Peruvian Amazon at risk of rabies from vampire bats found that 11 percent of those tested showed protection against the disease, with only one person reporting a prior rabies vaccination. Ten percent appear to have survived exposure to the virus without any medical intervention. The findings from investigators at the U.S. Centers for Disease Control and Prevention (CDC) were published today in the August 2012 issue of the American Journal of Tropical Medicine and Hygiene.

“The overwhelming majority of rabies exposures that proceed to infections are fatal. However, our results open the door to the idea that there may be some type of natural resistance or enhanced immune response in certain communities regularly exposed to the disease,” said Amy Gilbert with the CDC’s National Center for Emerging and Zoonotic Infectious Diseases, who is the paper’s lead author. “This means there may be ways to develop effective treatments that can save lives in areas where rabies remains a persistent cause of death.”

Rabies experts estimate the disease kills 55,000 people each year in Africa and Asia alone, and appears to be on the rise in China, the former Soviet Republics, southern Africa, and Central and South America. According to the CDC, in the United States, human deaths from rabies have declined over the past century from 100 annually to an average of two per year thanks to an aggressive campaign to vaccinate domestic animals against the disease.

In general, people who believe they may have been exposed to rabies are advised to immediately seek treatment, which involves post-exposure prophylaxis (PEP) – a series of injections – to prevent the exposure from causing an active infection. These treatments, when administered promptly, are 100 percent successful at preventing disease. Scientists have documented only a small number of individual cases, including one last year in California, in which an exposure to rabies proceeded to infection and the victim survived. Most of those survivors still required intensive medical attention, including one case in Wisconsin in which doctors induced a coma, though this approach has not been successful in most subsequent cases.

This CDC study was conducted in collaboration with the Peruvian Ministry of Health as part of a larger project to better understand bat-human interactions and their relation to rabies and other emerging diseases that may be transmitted by bats. For their research, scientists traveled to two communities (Truenococha and Santa Marta) in a remote section of the Peruvian Amazon where outbreaks of fatal rabies infections caused by bites from vampire bats—the most common “natural reservoir” for the disease in Latin America—have occurred regularly over the last two decades. They interviewed 92 people, 50 of whom reported previous bat bites. Blood samples were taken from 63 individuals, and seven (11 percent) were found to have “rabies virus neutralizing antibodies.”

One of the seven individuals reported receiving a rabies vaccination—which generates antibodies to the rabies virus—but there was no evidence that the other six had received anti-rabies vaccine prior to the blood sampling or had sought any medical attention for a bat bite, suggesting that their antibodies resulted from exposure to the virus itself.

The researchers acknowledged that they could not conclusively determine whether the antibodies were caused by an exposure to the virus that was somehow insufficient to produce disease. But they believe their evidence “suggests that (rabies virus) exposure is not invariably fatal to humans.”

Gilbert said non-fatal exposures may happen more often than some think because “unless people have clinical symptoms of the disease they may not go to the hospital or clinic, particularly where access is limited.”

“We all still agree that nearly everyone who is found to be experiencing clinical symptoms of rabies dies,” Gilbert said. “But we may be missing cases from isolated high-risk areas where people are exposed to rabies virus and, for whatever reason, they don’t develop disease.”

In the Amazon region where the study was conducted—the Province Datem del Maranon in the Loreto Department of northern Peru—vampire bats, which live off of mammalian blood, regularly come out at night and prefer to feed on livestock. But in the absence of those food sources, they are known to seek out a meal from humans. They can use their extremely sharp teeth and the anticoagulant that naturally occurs in their saliva (appropriately referred to as “draculin”) to feed on a sleeping person without awakening them. The rabies virus circulates extensively among vampire bat colonies in the region, and when an infected bat feeds, it passes along the virus to its host.

“This type of thorough and persistent scientific rabies investigation lends continued support to the belief that even the most dangerous of infectious diseases may be amenable to treatment,” said James W. Kazura, MD, a noted infectious disease expert and president of the American Society of Tropical Medicine and Hygiene (ASTMH). “Continued investment of resources is essential for us to protect the health and well-being of innocent people whose lives and livelihoods are needlessly threatened by infectious diseases like rabies.”

Gilbert and her colleagues hope their findings will prompt further studies in remote, at-risk communities to see if the results are replicated. In an editorial accompanying the study, Rodney E. Willoughby, a pediatric disease specialist at Children’s Hospital of Wisconsin, said if it turns out there are distinct populations of people with “complete or relative resistance to rabies,” there could be the potential to use whole genome sequencing to help develop new, life-saving treatments for rabies infections.

“Careful, respectful genetic study of these genetically unique populations may provide information on which pathways in human biochemistry and physiology promote resistance to human rabies,” he wrote. “Equally important, knowing that there is a continuum of disease, even for infectious diseases like rabies, should push us harder to try for cures when confronted by so-called untreatable infectious diseases….”

Gilbert noted that the study was done as part of a larger public health effort to address a series of rabies outbreaks in the Amazon, where some health officials are now considering conducting pre-emptive vaccination campaigns in areas where risk of rabies is high and availability of medical care low. She said that while her study highlights people who appear to have survived an exposure to the virus, the fact remains that rabies outbreaks in small communities in the region have left tragic results.

“These are very small villages and, when they witness ten people dying from what is a horrible disease, it is incredibly traumatic,” Gilbert said. “We want to help raise awareness of the problem and try to develop a more proactive response.”

Substance involved in Alzheimer’s can reverse paralysis in mice with multiple sclerosis
STANFORD, Calif. — A molecule widely assailed as the chief culprit in Alzheimer’s disease unexpectedly reverses paralysis and inflammation in several distinct animal models of a different disorder — multiple sclerosis, Stanford University School of Medicine researchers have found.

This surprising discovery, which will be reported in a study to be published online Aug. 1 as the cover feature in Science Translational Medicine, comes on the heels of the recent failure of a large-scale clinical trial aimed at slowing the progression of Alzheimer’s disease by attempting to clear the much-maligned molecule, known as A-beta, from Alzheimer’s patients’ bloodstreams. While the findings are not necessarily applicable to the study of A-beta’s role in the pathology of that disease, they may point to promising new avenues of treatment for multiple sclerosis.

The short protein snippet, or peptide, called A-beta (or beta-amyloid) is quite possibly the single most despised substance in all of brain research. It comes mainly in two versions differing slightly in their length and biochemical properties. A-beta is the chief component of the amyloid plaques that accumulate in the brains of Alzheimer’s patients and serve as an identifying hallmark of the neurodegenerative disorder.

A-beta deposits also build up during the normal aging process and after brain injury. Concentrations of the peptide, along with those of the precursor protein from which it is carved, are found in multiple-sclerosis lesions as well, said Lawrence Steinman, MD, the new study’s senior author. In a lab dish, A-beta is injurious to many types of cells. And when it is administered directly to the brain, A-beta is highly inflammatory.

Yet little is known about the physiological role A-beta actually plays in Alzheimer’s — or in MS, said Steinman, a professor of neurology and neurological sciences and of pediatrics and a noted multiple-sclerosis researcher. He, first author Jacqueline Grant, PhD, and their colleagues set out to determine that role in the latter disease. (Grant was a graduate student in Steinman’s group when the work was done.)

Multiple sclerosis, an inflammatory autoimmune disease, occurs when immune cells invade the brain and spinal cord and attack the insulating coatings of nerve cells’ long, cable-like extensions called axons. Damage to these coatings, composed largely of a fatty substance called myelin, disrupts the transmission of signals that ordinarily travel long distances down axons to junctions with other nerve cells. This signal disruption can cause blindness, loss of muscle control and difficulties with speech, thought and attention.

Previous research by Steinman, who is also the George A. Zimmerman Professor, and others showed that both A-beta and its precursor protein are found in MS lesions. In fact, the presence of these molecules along an axon’s myelinated coating is an excellent marker of damage there.

Given the peptide’s nefarious reputation, Steinman and his associates figured that A-beta was probably involved in some foul play with respect to MS. To find out, they relied on a mouse model that mimics several features of multiple sclerosis — including the autoimmune attack on myelinated sections of the brain that causes MS.

Steinman had, some years ago, employed just such a mouse model in research that ultimately led to the development of natalizumab (marketed as Tysabri), a highly potent MS drug. That early work proved that dialing down the activation and proliferation of immune cells located outside the central nervous system (which is what natalizumab does) could prevent those cells from infiltrating and damaging nerve cells in the CNS.

Knowing that immunological events outside the brain can have such an effect within it, the Stanford scientists were keen on seeing what would happen when they administered A-beta by injecting it into a mouse’s belly, rather than directly to the brain.

“We figured it would make it worse,” Steinman said.

Surprisingly, the opposite happened. In mice whose immune systems had been “trained” to attack myelin, which typically results in paralysis, A-beta injections delivered before the onset of symptoms prevented or delayed the onset of paralysis. Even when the injections were given after the onset of symptoms, they significantly lessened the severity of, and in some cases reversed, the mice’s paralysis.

Steinman asked Grant to repeat the experiment. She did, and got the same results.

His team then conducted similar experiments using a different mouse model: As before, they primed the mice’s immune cells to attack myelin. But rather than test the effects of A-beta administration, the researchers harvested the immune cells about 10 days later, transferred them by injection to another group of mice that did not receive A-beta and then analyzed this latter group’s response. The results mirrored those of the first set of experiments, proving that A-beta’s moderating influence on the debilitating symptoms of the MS-like syndrome has nothing to do with A-beta’s action within the brain itself, but instead is due to its effect on immune cells before they penetrate the brain.

Sophisticated laboratory tests showed that A-beta countered not only visible symptoms such as paralysis, but also the increase in certain inflammatory molecules that characterizes multiple-sclerosis flare-ups. “This is the first time A-beta has been shown to have anti-inflammatory properties,” said Steinman.

Inspection of the central nervous systems of the mice with the MS-resembling syndrome showed fewer MS-like lesions in the brains and spinal cords of treated mice than in those not given A-beta. There was also no sign of increased Alzheimer’s-like plaques in the A-beta-treated animals. “We weren’t giving the mice Alzheimer’s disease” by injecting A-beta into their bellies, said Grant.

In addition, using an advanced cell-sorting method called flow cytometry, the investigators showed A-beta’s strong effects on the immune system composition outside the brain. The numbers of immune cells called B cells were significantly diminished, while those of two other immune-cell subsets — myeloid cells and memory T-helper cells — increased.

“At this point we wanted to find out what would happen if we tried pushing A-beta levels down instead of up,” Grant said. The researchers conducted a different set of experiments, this time in mice that lacked the gene for A-beta’s precursor protein, so that they could produce neither the precursor nor A-beta. These mice, when treated with myelin-sensitized immune cells to induce the MS-like state, developed exacerbated symptoms and died sooner and more frequently than normal mice that underwent the same regimen.

Lennart Mucke, MD, director of the Gladstone Institute of Neurological Disease in San Francisco and a veteran Alzheimer’s researcher, noted that while A-beta’s toxicity within the brain has been established beyond reasonable doubt, many substances made in the body can have vastly different functions under different circumstances.

“A-beta is made throughout our bodies all of the time. But even though it’s been studied for decades, its normal function remains to be identified,” said Mucke, who is familiar with Steinman’s study but wasn’t involved in it. “Most intriguing, to me, is this peptide’s potential role in modulating immune activity outside the brain.”

The fact that the protection apparently conferred by A-beta in the mouse model of multiple sclerosis doesn’t require its delivery to the brain but, rather, can be attributed to its immune-suppressing effect in the body’s peripheral tissues is likewise intriguing, suggested Steinman.

“There probably is a multiple-sclerosis drug in all this somewhere down the line,” he said.

Sleep affects potency of vaccines
As moms have always known, a good night’s sleep is crucial to good health — and now a new study led by a UCSF researcher shows that poor sleep can reduce the effectiveness of vaccines.

The study is the first performed outside a sleep laboratory to show that sleep duration is directly tied to vaccine immune response, the authors said.

The study, conducted while the UCSF researcher was a doctoral student at the University of Pittsburgh, will appear in the August issue of the journal “SLEEP.”

“With the emergence of our 24-hour lifestyle, longer working hours, and the rise in the use of technology, chronic sleep deprivation has become a way of life for many Americans,” said lead author Aric Prather, PhD, a clinical health psychologist and Robert Wood Johnson Foundation Health & Society Scholar at UCSF and UC Berkeley.

“These findings should help raise awareness in the public health community about the clear connection between sleep and health,” Prather said.

Research has shown that poor sleep can make one susceptible to illnesses such as upper respiratory infections. To explore whether sleep duration, sleep efficiency, and sleep quality – assessed at home and not in a controlled sleep lab — would impact immune processes important in the protection against infection, the researchers investigated the antibody response to hepatitis B vaccinations on adults in good health. Antibodies are manufactured by the immune system to identify and neutralize foreign objects such as viruses.

The study involved 125 people (70 women, 55 men) between the ages of 40 and 60. All were nonsmokers in relatively good health, and all lived in Pennsylvania – the study was conducted at the University of Pittsburgh. Each participant was administered the standard three-dose hepatitis B vaccine; the first and second doses were given a month apart, followed by a booster dose at six months.

Antibody levels were measured prior to the second and third vaccine injection and six months after the final vaccination to determine whether participants had mounted a “clinically protective response.”

All the participants completed sleep diaries detailing their bedtime, wake time and sleep quality, while 88 subjects also wore electronic sleep monitors known as actigraphs.

The researchers found that people who slept fewer than six hours on average per night were far less likely to mount antibody responses to the vaccine: they were 11.5 times more likely to be left unprotected by the vaccine than people who slept more than seven hours on average. Sleep quality did not affect response to vaccinations.

Of the 125 participants, 18 did not receive adequate protection from the vaccine. “Sleeping fewer than six hours conferred a significant risk of being unprotected as compared with sleeping more than seven hours per night,” the scientists wrote.

The researchers stressed that sleep plays an important role in the regulation of the immune system. A lack of sleep, they said, may have detrimental effects on the immune system that are integral to vaccine response.

The National Sleep Foundation recommends seven to nine hours of sleep a night. (For tips on a better night’s sleep, see: http://www.ucsfhealth.org/education/tips_for_a_better_nights_sleep/index.html)

“Based on our findings and existing laboratory evidence, sleep may belong on the list of behavioral risk factors that influence vaccination efficacy,” said Prather, who in September will join the UCSF faculty as an assistant professor in the Department of Psychiatry. “While there is more work to be done in this area, in time physicians and other health care professionals who administer vaccines may want to consider asking their patients about their sleep patterns, since lack of sleep may significantly affect the potency of the vaccination.”

Artificial butter flavoring ingredient linked to key Alzheimer’s disease process

A new study raises concern about chronic exposure of workers in industry to a food flavoring ingredient used to produce the distinctive buttery flavor and aroma of microwave popcorn, margarines, snack foods, candy, baked goods, pet foods and other products. It found evidence that the ingredient, diacetyl (DA), intensifies the damaging effects of an abnormal brain protein linked to Alzheimer’s disease. The study appears in ACS’ journal Chemical Research in Toxicology.

Robert Vince and colleagues Swati More and Ashish Vartak explain that DA has been the focus of much research recently because it is linked to respiratory and other problems in workers at microwave popcorn and food-flavoring factories. DA gives microwave popcorn its distinctive buttery taste and aroma. DA also forms naturally in fermented beverages such as beer, and gives some chardonnay wines a buttery taste. Vince’s team realized that DA has an architecture similar to that of a substance that makes beta-amyloid proteins clump together in the brain — clumping being a hallmark of Alzheimer’s disease. So they tested whether DA could also cause those proteins to clump.

DA did increase the level of beta-amyloid clumping. At real-world occupational exposure levels, DA also enhanced beta-amyloid’s toxic effects on nerve cells growing in the laboratory. Other lab experiments showed that DA easily penetrated the so-called “blood-brain barrier,” which keeps many harmful substances from entering the brain. DA also stopped a protective protein called glyoxalase I from safeguarding nerve cells. “In light of the chronic exposure of industry workers to DA, this study raises the troubling possibility of long-term neurological toxicity mediated by DA,” say the researchers.

A diet high in choline during pregnancy may mean less stress for baby
Most women in the US consume too little choline, an essential nutrient found in eggs

Park Ridge, Ill. (August 1, 2012) – New research from Cornell University indicates that pregnant women who increase choline intake in the third trimester of pregnancy may reduce the risk of the baby developing metabolic and chronic stress-related diseases like high blood pressure and diabetes later in life.(i) The results, published in the latest edition of the Journal of the Federation of American Societies for Experimental Biology, suggest that choline, a nutrient found in high quantities in eggs, may help protect against the effects of a mother’s stress during pregnancy. Previous research indicates high exposure to the stress hormone cortisol during pregnancy, often due to maternal anxiety or depression, may make offspring vulnerable to stress-induced illness and chronic conditions.(ii, iii) This finding adds to the growing body of evidence demonstrating the importance of choline in fetal development.

A Closer Look at the Study

Twenty-four women in the third trimester of pregnancy were randomly assigned to consume either 480 milligrams (mg) of choline per day or 930 mg per day for 12 weeks prior to delivery. Researchers collected maternal and placental blood samples as well as samples of placental tissue. They then compared cortisol levels and genetic differences among all the samples. The researchers observed lower levels of cortisol in cord blood and changes in cortisol-regulating genes in both the placental and fetal tissue among women in the higher choline intake group. “The study findings raise the exciting possibility that a higher maternal choline intake may counter some of the adverse effects of prenatal stress on behavioral, neuroendocrine, and metabolic development in the offspring,” says Marie Caudill, PhD, Cornell University, who is an author of the study and a leading choline researcher.

Choline: A Vital Nutrient

Choline is especially important for pregnant women – it has been shown to play an important role in fetal and infant brain development, affecting the areas of the brain responsible for memory and life-long learning ability. In addition, research shows women with diets low in choline have four times greater risk of having babies with neural tube defects, such as spina bifida.(iv)

Emerging research also shows choline may have additional benefits in other areas, including:

Breast cancer prevention: A study funded by the National Institutes of Health concluded that dietary choline is associated with a 24 percent reduced risk of breast cancer.(v)
Anti-inflammatory: Foods rich in choline may help reduce the risk of inflammation associated with chronic diseases such as cardiovascular disease, bone loss, dementia and Alzheimer’s disease.(vi)
Brain function: Choline also promotes adult brain function by preserving the structure of brain cell membranes and is an essential component of acetylcholine, the neurotransmitter involved in memory function and muscle control.(vii)
The Incredible Excellent Source of Choline

Despite its important role in the body, only one in 10 Americans is meeting the Adequate Intake (AI) guidelines for choline.(viii) Eggs are an excellent source of choline, containing 125 mg per egg. Neva Cochran, registered dietitian and nutrition communications consultant, explains that the nutritional benefits of eggs are not merely limited to choline. “Not only are eggs an excellent source of choline, they contain many other nutrients pregnant women need most, such as high-quality protein, iron and folate—all for just about 15 cents apiece,” says Cochran. In order to get adequate amounts of choline, Cochran suggests the following tips:

Find it in Food: A great way to get your daily dose of choline is to include choline-rich foods in the diet, such as eggs, lean beef, cauliflower and peanuts. Also keep in mind most multivitamins, even prenatal vitamins, provide far less than the Adequate Intake for choline.
Don’t Skip the Yolk: Choline is found exclusively in the egg yolk, not the white. Nearly half of the protein and most of the vitamins and minerals are also contained in the yolk.

Global health researchers urge integrating de-worming into HIV care in Africa
School-based programs not as effective in reaching children with infestations

HIV care centers are an important and highly accessed point of care for HIV-infected children and their families in sub-Saharan Africa, but opportunities to address other health issues are being missed. Proven interventions, including routine deworming of young children, could be effectively integrated into HIV care, according to a newly published article in PLoS by University of Washington researchers.

The article, “Integration of Deworming into HIV Care and Treatment: A Neglected Opportunity,” estimates that millions of HIV-infected individuals in sub-Saharan Africa are also infested with parasitic worms, called helminths. The parasitic infestations have enormous health consequences, including anemia, malnutrition, and impaired cognitive development, and may also increase the progression of HIV.

Helen Gerns, the lead researcher on the paper, said that the World Health Organization (WHO) recommends annual or bi-annual deworming of children in schools as a cost-effective strategy to diminish the consequences of chronic helminth infection. However, many children are not routinely being dewormed through school programs, despite the fact that standard treatment of soil-transmitted helminth infection entails only a single 400 mg dose of albendazole, making routine deworming of children a simple intervention. In addition to missing those children who are unable to attend school during deworming campaigns, school-based deworming misses preschool-aged children entirely.

“Annual deworming of preschool-aged children is safe and highly effective in reducing parasite prevalence and intensity, malnutrition, and risk of stunting, but a formal policy does not yet exist to target this age group,” said the researchers. “Because children are infected and often diagnosed with HIV while very young, preschool-aged children can easily be dewormed in HIV clinics, along with their siblings, to reduce the occurrence of reinfection.”

Deworming HIV-infected children may have additional benefits, including delaying the progression of HIV, reducing other infections, and increasing responsiveness to vaccines.

“Population-level data show that regional variations in vaccine efficacy correlate with variations in the prevalence of enteric [gut] pathogens,” according to the article. The article cites greater rotavirus vaccine efficacy and more successful polio eradication efforts in countries with fewer cases of helminth infection.

“Children who failed to respond to oral poliovirus vaccinations were 25 percent more likely to harbor infections with intestinal parasites than vaccine responders,” according to the article.

“Given that deworming is standard medical care among all children in many settings, we should not miss such an effective, inexpensive and practical opportunity to deworm HIV-infected children,” said co-author Dr. Judd Walson, assistant professor of global health at the University of Washington.

Plant-based compound slows breast cancer in a mouse model
The natural plant compound phenethyl isothiocyanate (PEITC) hinders the development of mammary tumors in a mouse model with similarities to human breast cancer progression, according to a study published August 2 in the Journal of the National Cancer Institute.

Edible plants are gaining ground as chemopreventative agents. PEITC has been shown to be effective as a chemopreventative agent in mice against colon, intestinal, and prostate cancer by inducing apoptosis.

To determine the efficacy of PEITC against mammary tumors in mice, Shivendra V. Singh, Ph.D., of the University of Pittsburgh Cancer Institute and colleagues placed mice on one of two diets for 29 weeks: a control diet or a diet supplemented with PEITC. The researchers performed histopathological assessments and measured the incidence and size of the mammary tumors, along with cell proliferation, apoptosis, and neoangiogenesis in tumor sections.

The researchers found that administering PEITC for 29 weeks was associated with a 56.3% reduction in mammary carcinoma lesions greater than 2 mm. “Although PEITC administration does not confer complete protection against mammary carcinogenesis, mice placed on the PEITC-supplemented diet, compared with mice placed on the control diet, clearly exhibited suppression of carcinoma progression,” the authors write. PEITC was also well tolerated. Since chemoprevention trials are expensive and time-consuming and necessitate years of follow-up, the authors note that “The discovery of biomarker(s) associated with exposure and activity is critical for clinical development of promising cancer chemopreventative agents.” This study was able to identify certain biomarkers that may be useful in future clinical investigations.

The authors also point out certain limitations of their study, namely that results in mice may not carry over to humans, and that both the relevance of the other proteins altered by PEITC and the mechanism by which PEITC induces apoptosis remain unclear.

Breast cancer charity under fire for overstating the benefits of screening
Experts challenge ‘pink ribbon’ creator for misusing statistics to generate false hope

Professors Lisa Schwartz and Steven Woloshin of the Center for Medicine and the Media at The Dartmouth Institute for Health Policy and Clinical Practice argue that last year’s breast cancer awareness month campaign by Susan G Komen for the Cure “overstates the benefit of mammography and ignores harms altogether.”

Their views are published on bmj.com today as part of an occasional series highlighting the exaggerations, distortions, and selective reporting that make some news stories, advertising, and medical journal articles “not so.”

A growing and increasingly accepted body of evidence shows that although screening may reduce a woman’s chance of dying from breast cancer by a small amount, it also causes major harms, say the authors. Yet Komen’s public advertising campaign gives women no sense that screening is a close call.

Instead it states that the key to surviving breast cancer is for women to get screened because “early detection saves lives. The 5-year survival rate for breast cancer when caught early is 98%. When it’s not? 23%.”

This benefit of mammography looks so big that it is hard to imagine why any woman would forgo screening. But the authors explain that comparing survival between screened and unscreened women is “hopelessly biased.”

For example, imagine a group of 100 women who received diagnoses of breast cancer because they felt a breast lump at age 67, all of whom die at age 70. Five-year survival for this group is 0%. Now imagine the same women were screened and received their diagnoses three years earlier, at age 64, but still die at age 70. Five-year survival is now 100%, even though no one lived a second longer.
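
The arithmetic of that thought experiment, known as lead-time bias, is easy to reproduce; here is a minimal sketch (ours, not the authors’):

    def five_year_survival_pct(age_at_diagnosis, age_at_death):
        # A woman counts as a 5-year survivor if she lives at least
        # 5 years past her diagnosis.
        return 100.0 if (age_at_death - age_at_diagnosis) >= 5 else 0.0

    # Same women, same deaths at age 70; only the diagnosis date moves.
    print(five_year_survival_pct(67, 70))  # felt a lump at 67 -> 0.0
    print(five_year_survival_pct(64, 70))  # screen-detected at 64 -> 100.0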

Overdiagnosis (the detection of cancers that will not kill or even cause symptoms during a patient’s lifetime) also distorts survival statistics because the numbers now include people who have a diagnosis of cancer but who, by definition, survive the cancer, the authors add.

“If there were an Oscar for misleading statistics, using survival statistics to judge the benefit of screening would win a lifetime achievement award hands down,” they write.

But that doesn’t stop people from misinterpreting survival statistics. Disturbingly, in a recent survey, the authors found that most US primary care doctors also mistakenly interpret improved survival as evidence that screening saves lives.

Mammography certainly sounds better when stated in terms of improving five year survival – from 23% to 98%, a difference of 75 percentage points, they say. But in terms of its actual benefit, mammography can reduce the chance that a woman in her 50s will die from breast cancer over the next 10 years from 0.53% to 0.46%, a difference of 0.07 percentage points.
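
To make the contrast concrete, here is a quick back-of-envelope check of those figures; the number-needed-to-screen calculation at the end is our illustration, not the authors’:

    # 10-year risk of dying from breast cancer, woman in her 50s.
    risk_unscreened = 0.0053   # 0.53%
    risk_screened = 0.0046     # 0.46%

    arr = risk_unscreened - risk_screened     # absolute risk reduction
    print(f"ARR = {arr:.4%}")                 # 0.0700%, i.e. 0.07 percentage points
    print(f"Number needed to screen = {1 / arr:.0f}")  # ~1429 women over 10 years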

The Komen advertisement also ignores the harms of screening, they add. For every life saved by mammography, around two to 10 women are overdiagnosed. These women cannot benefit from the unnecessary chemotherapy, radiation, or surgery they undergo; they can only be harmed.

“Women need much more than marketing slogans about screening: they need – and deserve – the facts,” conclude the authors. “The Komen advertisement campaign failed to provide the facts. Worse, it undermined decision making by misusing statistics to generate false hope about the benefit of mammography screening. That kind of behaviour is not very charitable.”

Note to waitresses: Wearing red can be profitable
Los Angeles, CA (02 August, 2012) In many restaurants throughout the world, wait staff’s income depends largely on the tips received from customers. According to a new study, male restaurant customers give higher tips to waitresses wearing red. This study was published in a recent issue of Journal of Hospitality and Tourism Research (published by SAGE, on behalf of the International Council on Hotel, Restaurant, and Institutional Education).

In their study of 272 restaurant customers, researchers Nicolas Guéguen and Céline Jacob found not only that male patrons gave higher tips than female patrons in general, but that men gave between 14.6% and 26.1% more to waitresses wearing red, while color had no effect on female patrons’ tipping behavior at all. The researchers explained that previous research has found that red increases the physical and sexual attractiveness of women.

Guéguen and Jacob instructed eleven waitresses in five restaurants to wear the same T-shirt in different colors (black, white, red, blue, green, and yellow) on different days over a six-week period. The waitresses were instructed to act as they normally would with all customers and to record the tip received from each customer.

The authors wrote, “As red color has no negative effect on women customers, it could be in their interest to wear red clothes at work.”

Dangerous experiment in fetal engineering
Risky prenatal use of steroid to try to prevent intersex, tomboys and lesbians

CHICAGO — A new paper just published in the Journal of Bioethical Inquiry uses extensive Freedom of Information Act findings to detail an extremely troubling off-label medical intervention employed in the U.S. on pregnant women to intentionally engineer the development of their fetuses for sex normalization purposes.

The paper is authored by Alice Dreger, professor of clinical medical humanities and bioethics at Northwestern University Feinberg School of Medicine, with co-authors Ellen Feder, associate professor of philosophy and religion at American University, and Anne Tamar-Mattis, executive director of Advocates for Informed Choice.

The pregnant women targeted are at risk of having a child born with congenital adrenal hyperplasia (CAH), an endocrinological condition that can result in female fetuses being born with intersex or more male-typical genitals and brains. Women identified as being at genetic risk are given dexamethasone, a synthetic steroid, off-label, starting as early as week five of the first trimester, to try to “normalize” the development of female, CAH-affected fetuses. Because the drug must be administered before doctors can know whether the fetus is female or CAH-affected, only one in eight of those exposed is the target type of fetus.
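
The one-in-eight figure follows from basic inheritance arithmetic, assuming, as the article implies, that classic CAH is autosomal recessive and both parents are carriers; a minimal sketch:

    # Probability that a given pregnancy carries the "target" fetus:
    p_affected = 1 / 4   # autosomal recessive condition, both parents carriers
    p_female = 1 / 2     # chance the fetus is female
    print(p_affected * p_female)   # 0.125, i.e. 1 in 8 exposed fetuses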

The off-label intervention does not prevent CAH; it aims only at sex normalization. Like Diethylstilbestrol (DES) — which is now known to have caused major fertility problems and fatal cancers among those exposed in utero — dexamethasone is a synthetic steroid. Dexamethasone is known — and in this case intended — to cross the placental barrier and change fetal development. Experts estimate the glucocorticoid dose reaching the fetus is 60 to 100 times what the body would normally experience.

The new report provides clear evidence that:

•For more than 10 years, medical societies repeatedly, but ultimately ineffectually, expressed alarm at use of this off-label intervention outside prospective clinical trials, because it is high risk and because nearly 90 percent of those exposed cannot benefit.

•Mothers offered the intervention have been told it “has been found safe for mother and child,” but there has never been scientific evidence to support that claim.

•The U.S. Food and Drug Administration has indicated it cannot stop advertising of this off-label use as “safe for mother and child” because the advertising is done by a clinician not affiliated with the drug maker.

•A recent report from Sweden in the Journal of Clinical Endocrinology and Metabolism documents a nearly 20 percent “serious adverse event” rate among the children exposed in utero.

•Clinician proponents of the intervention have been interested in whether it can reduce rates of tomboyism, lesbianism and bisexuality, characteristics they have termed “behavioral masculinization.”

•The National Institutes of Health has funded research to see if these attempts to prevent “behavioral masculinization” with prenatal dexamethasone are “successful.”

•The United States’ systems designed to prevent another tragedy like DES and thalidomide — involving de facto experimentation on pregnant women and their fetuses — appear to be broken and ineffectual.

Strawberry extract protects against UVA rays
An experiment has shown that strawberry extract added to skin cell cultures protects against ultraviolet radiation, increasing cell viability and reducing DNA damage. Carried out by a team of Italian and Spanish researchers, the study opens the door to photoprotective creams made from strawberries.

“We have verified the protecting effect of strawberry extract against damage to skin cells caused by UVA rays,” Maurizio Battino, researcher at the Università Politecnica delle Marche in Italy and lead author of the joint Spanish-Italian study, explained to SINC. The results are published in the Journal of Agricultural and Food Chemistry.

The team prepared human skin cell cultures (fibroblasts) and added strawberry extract at different concentrations (0.05, 0.25 and 0.5 mg/ml); control cultures received no extract. The samples were then exposed to a dose of ultraviolet light “equivalent to 90 minutes of midday summer sun in the French Riviera.”

The data confirm that the strawberry extract, especially at a concentration of 0.5 mg/ml, displays photoprotective properties in fibroblasts exposed to UVA radiation: it increases cell survival and viability and decreases DNA damage compared with control cells.

“These aspects are of great importance as they provide protection for cell lines subject to conditions that can provoke cancer and other skin-related inflammatory and degenerative illnesses,” outlines Battino.

The researcher recognises that this is the “first step in determining the beneficial effects of strawberries in our diet or as a possible compound source for ‘food integrators’ or cosmetics for instance.”

The redness of anthocyanins

But what molecules give strawberries their photoprotective properties? The scientists suspect the anthocyanins, pigments that give leaves, flowers and fruits their red colour. Analyses confirmed that the extracts are rich in these substances.

“These compounds have important anti-inflammatory, antioxidant and anti-tumour properties and are capable of modulating enzymatic processes,” explains another of the authors, Sara Tulipani from the University of Barcelona. She adds that “we have not yet found a direct relationship between their presence and photoprotective properties.”

“At the moment the results act as the basis for future studies evaluating the ‘bioavailability’ and ‘bioactivity’ of anthocyanins in the dermis and epidermis layers of the human skin, whether by adding them to formulations for external use or by ingesting the fruit itself,” states Tulipani.

The team, which also includes researchers from the Universities of Salamanca and Granada, had already demonstrated in previous work that strawberries (Fragaria x ananassa) strengthen red blood cells and protect the stomach from the effects of alcohol.

People With Allergies May Have Lower Risk of Brain Tumors
COLUMBUS, Ohio – New research adds to the growing body of evidence suggesting that there’s a link between allergies and reduced risk of a serious type of cancer that starts in the brain. This study suggests the reduced risk is stronger among women than men, although men with certain allergy profiles also have a lower tumor risk.

The study also strengthens scientists’ belief that something about having allergies, or a related factor, lowers the risk for this cancer. Because these tumors, called gliomas, can suppress the immune system to allow their own growth, researchers have never been sure whether allergies reduce cancer risk or whether, before diagnosis, these tumors interfere with the hypersensitive immune response to allergens.

Scientists conducting this study were able to analyze stored blood samples that were taken from patients decades before they were diagnosed with glioma. Men and women whose blood samples contained allergy-related antibodies had an almost 50 percent lower risk of developing glioma 20 years later compared to people without signs of allergies.

“This is our most important finding,” said Judith Schwartzbaum, associate professor of epidemiology at Ohio State University and lead author of the study. “The longer before glioma diagnosis that the effect of allergies is present, the less likely it is that the tumor is suppressing allergies. Seeing this association so long before tumor diagnosis suggests that antibodies or some aspect of allergy is reducing tumor risk.

“It could be that in allergic people, higher levels of circulating antibodies may stimulate the immune system, and that could lower the risk of glioma,” said Schwartzbaum, also an investigator in Ohio State’s Comprehensive Cancer Center. “Absence of allergy is the strongest risk factor identified so far for this brain tumor, and there is still more to understand about how this association works.”

Many previous studies of the link between allergies and brain tumor risk have been based on self-reports of allergy history from patients diagnosed with glioma. No previous studies have had access to blood samples collected longer than 20 years before tumor diagnosis.

The current study also suggested that women whose blood samples tested positive for specific allergy antibodies had at least a 50 percent lower risk for the most serious and common type of these tumors, called glioblastoma. This effect for specific antibodies was not seen in men. However, men who tested positive for both specific antibodies and antibodies of unknown function had a 20 percent lower risk of this tumor than did men who tested negative.

Glioblastomas constitute up to 60 percent of adult tumors starting in the brain in the United States, affecting an estimated 3 in 100,000 people. Patients who undergo surgery, radiation and chemotherapy survive, on average, for about one year, with fewer than a quarter of patients surviving up to two years and fewer than 10 percent surviving up to five years.

The study is published online in the Journal of the National Cancer Institute.

Schwartzbaum and colleagues were granted access to specimens from the Janus Serum Bank in Norway. The bank contains samples collected from citizens during their annual medical evaluations or from volunteer blood donors for the last 40 years. Norway also has registered all new cases of cancer in the country since 1953, and personal identification numbers enable cross-referencing those cases with previously collected blood samples.

The researchers analyzed stored samples from 594 people who were diagnosed with glioma (including 374 diagnosed with glioblastoma) between 1974 and 2007. They matched these samples for date of blood collection, age and sex with 1,177 samples from people who were not diagnosed with glioma for comparison.

The researchers measured the blood samples for levels of IgE, or immunoglobulin E, a class of antibodies produced by white blood cells that mediates immune responses to allergens. Two measures of IgE capture the allergic response: allergen-specific IgE, which recognizes specific components of an allergen, and total IgE, which includes these antibodies along with antibodies of unknown function.

In each sample, the scientists determined whether the serum contained elevated levels of IgE specific to the most common allergens in Norway as well as total IgE. The specific respiratory allergens included dust mites; tree pollen and plants; cat, dog and horse dander; and mold.

The researchers then conducted a statistical analysis to estimate the association between elevated concentrations of allergen-specific IgE and total IgE and the risk of developing glioma.
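
As an illustration of the kind of case-control comparison described, an odds ratio can be computed from a 2x2 table. In the sketch below only the totals (594 cases, 1,177 controls) come from the study; the elevated/normal splits are hypothetical:

    # Hypothetical 2x2 table for elevated total IgE vs. glioma status.
    cases_elevated, cases_normal = 150, 444        # 594 glioma cases (made-up split)
    controls_elevated, controls_normal = 400, 777  # 1,177 controls (made-up split)

    odds_ratio = (cases_elevated / cases_normal) / (controls_elevated / controls_normal)
    print(f"OR = {odds_ratio:.2f}")  # an OR below 1 points to lower risk with elevated IgE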

Among women, testing positive for elevated levels of allergen-specific IgE was associated with a 54 percent decreased risk of glioblastoma compared to women who tested negative for allergen-specific IgE. The researchers did not see this association in men.

However, the relation between total IgE levels and glioma risk was not different for men and women, statistically speaking. For men and women combined, testing positive for elevated total IgE was linked to a 25 percent decreased risk of glioma compared with testing negative for total IgE.

The analysis for effects on glioblastoma risk alone suggested a similar decreased risk for both men and women combined whose samples tested positive for high levels of IgE, but the findings were considered borderline in terms of statistical significance, meaning the association could also be attributed to chance.

“There is definitely a difference in the effect of allergen-specific IgE between men and women. And even results for total IgE suggest there still may be a difference between the sexes. The reason for this difference is unknown,” Schwartzbaum said.

What the study does provide evidence for, however, is the likelihood that the immune systems of people with respiratory allergies could have a protective effect against this type of brain cancer. The ability to investigate this association over four decades between blood sampling and tumor diagnosis gave the researchers better insight into the relationship between allergies and tumor risk, Schwartzbaum said.

For example, a positive test for elevated concentrations of total IgE was associated with a 46 percent decreased risk for developing a glioma 20 years later compared to samples testing negative for elevated IgE, according to the analysis. That decreased risk was only about 25 percent in samples that tested positive for high levels of total IgE taken two to 15 years prior to diagnosis.

“There may be a trend – the closer the samples get to the time of diagnosis, the less help the IgE is in decreasing the risk for glioma. However, if the tumor were suppressing allergy, we would expect to see a bigger difference in risk near the time of diagnosis,” Schwartzbaum said.

Schwartzbaum plans to further analyze the serum samples for concentration of cytokines, which are chemical messengers that promote or suppress inflammation as part of the immune response, to see if these proteins have a role in the relationship between elevated IgE levels and lowered tumor risk.

This work was funded by the National Cancer Institute, the National Institutes of Health and a Research Enhancement and Assistance Program grant from Ohio State’s Comprehensive Cancer Center.

Co-authors include Bo Ding, Anders Ahlbom and Maria Feychting of the Karolinska Institutet in Stockholm, Sweden; Tom Borge Johannesen and Tom Grimsrud of the Cancer Registry of Norway; Liv Osnes of Ulleval University Hospital in Oslo, Norway; and Linda Karavodin of Karavodin Preclinical Consulting in Encinitas, Calif.

Students with strong hearts and lungs may make better grades, study finds
Physically fit boys and girls scored higher on reading and math, research reveals

ORLANDO, Fla. — Having a healthy heart and lungs may be one of the most important factors for middle school students to make good grades in math and reading, according to findings presented at the American Psychological Association’s 120th Annual Convention.

“Cardiorespiratory fitness was the only factor that we consistently found to have an impact on both boys’ and girls’ grades on reading and math tests,” said study co-author Trent A. Petrie, PhD, professor of psychology and director of the Center for Sport Psychology at the University of North Texas. “This provides more evidence that schools need to re-examine any policies that have limited students’ involvement in physical education classes.”

The researchers gathered data at five Texas middle schools from 1,211 students, of whom 54 percent were female, with an average age of about 12. Overall, the group was 57 percent white. Among the boys, the breakdown was 57.2 percent white, 24.2 percent Mexican-American, 9.1 percent African-American, 1.1 percent Asian-American and 1.2 percent American Indian. Among the girls, 58.6 percent were white, 23.4 percent Mexican-American, 9.2 percent African-American, 2.3 percent Asian-American and 0.6 percent American Indian.

While previous studies have found links between being physically fit and improved academic performance, this study also examined several other potential influences, including self-esteem and social support. It also took into account the students’ socioeconomic status and their self-reported academic ability, Petrie said.

In addition to cardiorespiratory fitness, social support was related to better reading scores among boys, according to the study. It defined social support as reliable help from family and friends to solve problems or deal with emotions. For girls, having a larger body mass index was the only factor other than cardiorespiratory fitness that predicted better reading scores. For both boys and girls, cardiorespiratory fitness was the only factor related to performance on the math tests. “The finding that a larger body mass index for girls was related to better performance on the reading exam may seem counterintuitive; however, past studies have found being overweight was not as important for understanding boys’ and girls’ performances on tests as was their level of physical fitness,” Petrie said.

From one to five months before the students were to take annual standardized reading and math tests, they answered questions about their level of physical activity, and how they viewed their academic ability, self-esteem and social support. The school district provided information on the students’ socioeconomic status and reading and math scores at the end of the year.

To determine students’ physical fitness, the researchers worked with physical education teachers to administer a fitness assessment program widely used in U.S. schools. The program includes a variety of tests to assess aerobic capacity, muscular strength, muscular endurance, flexibility and body composition. The assessment provides an objective measure of cardiorespiratory fitness through the Progressive Aerobic Cardiovascular Endurance Run, or PACER, and body composition through measuring BMI, the study said.

“Because this is a longitudinal study, these variables can now be considered risk factors in relation to middle school students’ performance on math and reading examinations,” Petrie said. “And that is essential to developing effective programs to support academic success.”

Out of Europe
Researchers look at the spread of dysentery from Europe to industrializing countries

Researchers have found that a bacterium that emerged centuries ago in Europe is now spreading globally into countries undergoing rapid development and industrialization. Unlike other diarrheal diseases, this one is unlikely to be resolved by providing access to clean water. As developing countries industrialize, infections with dysentery-causing Shigella flexneri are known to decline, a trend associated with improved health and lifestyle and, perhaps most importantly, access to clean water; yet the incidence of another dysentery-causing bacterium, Shigella sonnei, actually increases.

The team pinpointed that S. sonnei was first established in Europe just a few centuries ago, but in the last few decades has spread to the rest of the world. They also found that a key factor in the spread of this pathogen was a rise in multidrug resistance – the ability to survive exposure to a wide array of antibiotics. Because S. sonnei is easily transmitted and has high levels of drug resistance, the researchers suggest that drug treatment and better sanitation alone will not be sufficient for controlling the disease. Vaccine development will be crucial.

Dysentery is a disease primarily associated with developing countries, and more than one million people, mostly young children, are estimated to die from dysentery caused by Shigella each year. Whilst most people have heard of dysentery, few know about the bacterium that causes it, Shigella, because it is relatively understudied and little is known about its population structure or origins. Traditionally, S. flexneri has been the most common form of Shigella to cause dysentery in developing countries, with S. sonnei more prevalent in industrialized countries. Yet this is beginning to change, with S. sonnei becoming increasingly common as developing countries rapidly industrialize.

“Although S. sonnei is a relatively new species of bacterium, during its spread it has diversified into an array of different distinguishable clones or strains found right across the world,” says Dr Kathryn Holt, first author from the University of Melbourne. “This is hard to see using traditional methods, but by sequencing the genomes of over 100 different forms of the bacteria, we were able to get a glimpse into its past and really start to understand how it is evolving and moving around the world.”

“We compared the S. sonnei family tree and geographical locations of the different strains to determine when and where this bacterium first emerged and why it has become such a problem in industrialized countries with increasing access to clean water. Traditionally we associate dysentery with contaminated water and lack of industrialization.”

To investigate why the bacterium was spreading so effectively, the team looked at the S. sonnei’s genetic evolution and found that only a few types of genes were selectively evolving over time, particularly those involved with drug resistance. This suggests that a major driver in the spread of this bacterium was its apparent ability to become resistant to drug treatment.

“Since S. sonnei originated, we found there have been three, independent, yet closely related lineages that have spread. The two most recent lineages have been continually evolving to become increasingly resistant to antimicrobials,” says Dr Stephen Baker, a senior author from the Oxford University Clinical Research Unit in Vietnam. “Our data is consistent with antibiotic resistance as being a main driver of the spread and persistence of S. sonnei around the world, stressing that antibiotics are not a long-term solution for the elimination of this global health problem.”

Despite the fact that S. sonnei and S. flexneri are closely related, they have very different surface antigens, or coats, that interact with the human immune system. S. sonnei has only one type of outer coat, while S. flexneri has many, all of which look very different from that of S. sonnei. It has been speculated for some time that S. sonnei acquired its outer coat from another bacterium commonly found in contaminated water, Plesiomonas shigelloides.

Both S. sonnei and P. shigelloides have an identical outer coat. It is believed that when a person is exposed to contaminated water containing P. shigelloides, there is an immune cross reaction and the body builds a natural immunity against S. sonnei. This theory may explain why the incidence of S. sonnei increases following economic development and improvements to water quality, and is consistent with the patterns of global spread described in the current report.

“One of the Millennium Development Goals is to improve drinking water and reduce water borne diseases, an undeniably important aim,” says Professor Nicholas Thomson, lead author from the Wellcome Trust Sanger Institute. “This may have the unforeseen result of increasing the incidence of S. sonnei dysentery in transitional countries.

“Our research emphasises the importance of a vaccine against Shigella. The combination of increased incidence and antibiotic resistance of S. sonnei, means that a vaccine will be increasingly important for the long-term control and prevention of dysentery.”

Long-term use of blood pressure meds promoting sun sensitivity may raise lip cancer risk
OAKLAND, Calif., August 6, 2012 – Long-term use of commonly used blood pressure medications that increase sensitivity to sunlight is associated with an increased risk of lip cancer in non-Hispanic whites, according to a Kaiser Permanente study that appears in the current online issue of Archives of Internal Medicine.

Funded by the National Cancer Institute, the study found that photosensitizing antihypertensive drugs such as nifedipine and hydrochlorothiazide were associated with cancer of the epithelial cells known as squamous cells—which are the main part of the outermost layer of the lips and skin.

Researchers compared 712 patients in Northern California with lip cancer to 22,904 people in a control group and found that the risk of squamous cell lip cancer was higher for those with long-term use of photosensitizing blood pressure medications.

“Lip cancer remains rare and an increased risk of developing it is generally outweighed by the benefits of these blood pressure drugs and other photosensitizing medications,” said Gary Friedman, MD, an emeritus researcher at the Kaiser Permanente Northern California Division of Research and lead author of the study. “Physicians prescribing photosensitizing drugs should ascertain whether patients are at high risk of lip cancer by virtue of fair skin and long-term sun exposure and discuss lip protection with them. Although not yet confirmed by clinical trials, likely preventive measures are simple: a hat with sufficiently wide brim to shade the lips and lip sunscreens.”

The risk of lip cancer appeared to increase with increasing duration of use of these drugs and was not explained by a history of cigarette smoking, also a known risk factor for lip cancer, according to investigators.

Photosensitizing drugs are believed to absorb energy from ultraviolet and/or visible light, causing the release of electrons. This leads to generation of reactive oxygen intermediates and free radicals which damage DNA and other components of skin and lip cells and produce an inflammatory response, Friedman said.

Researchers ascertained prescriptions dispensed and cancer occurrence from August 1994 to February 2008. They identified 712 patients with lip cancer and 22,904 controls in the susceptible group of non-Hispanic whites. Researchers determined participants’ use, at least two years before diagnosis or the control index date, of the commonly prescribed diuretics hydrochlorothiazide (HCTZ) and HCTZ combined with triamterene (HCTZ/TR); the angiotensin-converting enzyme inhibitor lisinopril; the calcium channel blocker nifedipine; and the beta-adrenergic blocker atenolol, the only non-photosensitizer studied. Non-photosensitizing atenolol, when used alone, was not associated with increased risk. Findings for lisinopril were not as clear-cut as those for HCTZ, HCTZ/TR and nifedipine.

Researchers analyzed use of each drug both exclusively and regardless of use of others and focused on duration of use. The analysis controlled for smoking.

Researchers were not able to include basal cell and squamous cell cancers of the skin in this study because these diagnoses have not been recorded in their cancer registry. Also, researchers were not able to adjust for sun exposure, the most important lip cancer risk factor, along with relative lack of pigmentation of the lips. Risk of developing melanoma was not associated with these drugs. This form of skin cancer has been more strongly associated with intermittent sun exposures, especially those producing sunburn, than with chronic sun exposure, so timing of use of photosensitizing drugs could be an important consideration, explain the researchers.

Off-label drug use common, but patients may not know they’re taking them, Mayo finds
Aug. 6, 2012

ROCHESTER, Minn. — Many people have probably heard of off-label drug use, but they may not know when that applies to prescriptions they are taking, a Mayo Clinic analysis found. Off-label drug use occurs when a physician prescribes medication to treat a condition before that use has been approved by the Food and Drug Administration. In a newly published article in Mayo Clinic Proceedings, researchers pose and answer 10 questions about off-label drug use.

“Since the Food and Drug Administration does not regulate the practice of medicine, off-label drug use has become very common,” says lead author Christopher Wittich, M.D., internal medicine physician at Mayo Clinic. “Health care providers and patients should educate themselves about off-label drugs to weigh the risks and benefits before a physician prescribes one or a patient takes one.”

Some highlights from the article:

•Off-label drug use is common. Within a group of commonly used medications, roughly 1 in 5 prescriptions were for an off-label use, a 2006 report found. Another study found that about 79 percent of children discharged from pediatric hospitals were taking at least one off-label medication.

•Patients may not know when drugs they have been prescribed are being used off-label. No court decision has required that physicians must disclose, through informed consent, the off-label use of a drug, the authors say. The FDA makes clear that it doesn’t regulate the practice of medicine and that the federal Food, Drug, and Cosmetic Act of 1938 doesn’t make physicians liable for off-label drug use, they note.

•Off-label drug use can become the predominant treatment for a condition. For example, some antidepressants are not approved by the FDA as a treatment for neuropathic pain, yet some drugs in this class are considered a first-line treatment option.

•Examples of widely practiced off-label drug use include morphine, used extensively to treat pain in hospitalized pediatric patients. Many inhaled bronchodilators, antimicrobials, anticonvulsants, and proton pump inhibitors also are used in children without formal FDA approval.

•Obtaining new FDA approval for a medication can be costly and time-consuming. Adding indications for an already approved medication requires a supplemental drug application, and even if the application is eventually approved, the resulting revenue may not offset the expense and effort of obtaining approval.

•Makers of generic medications may not have the funding needed to pursue FDA-approval studies. For these financial reasons, drug proprietors may never seek FDA approval for a new drug indication.

•Pharmaceutical manufacturers are not allowed to promote off-label uses of medications. However, they can respond to unsolicited questions from health care providers and distribute peer-reviewed publications about off-label use. Just this year, GlaxoSmithKline agreed to pay a record $3 billion to settle a Justice Department case involving alleged off-label marketing, and Merck Sharp & Dohme was fined $322 million over its alleged promotion of the painkiller Vioxx for an off-label use.

Creatine aids women in outmuscling major depression
Muscle-building supplement vastly improves response time, quality of recovery

(SALT LAKE CITY)—Women battling stubborn major depression may have a surprising new ally in their fight—the muscle-building dietary supplement creatine.

In a new proof-of-concept study, researchers from three South Korean universities and the University of Utah report that women with major depressive disorder (MDD) who augmented their daily antidepressant with 5 grams of creatine responded twice as fast and experienced remission of the illness at twice the rate of women who took the antidepressant alone. The study, published Aug. 3, 2012, in the American Journal of Psychiatry online, suggests that taking creatine under a doctor’s supervision could provide a relatively inexpensive way for women who haven’t responded well to SSRI (selective serotonin reuptake inhibitor) antidepressants to improve their treatment outcomes.

“If we can get people to feel better more quickly, they’re more likely to stay with treatment and, ultimately, have better outcomes,” says Perry F. Renshaw, M.D., Ph.D., M.B.A, USTAR professor of psychiatry at the U of U medical school and senior author on the study.

If these initial results are borne out by further, larger trials, the benefits of taking creatine could directly affect many Utahns. The incidence of depression in Utah is estimated to be 25 percent higher than in the rest of the nation, meaning the state has an even larger proportion of people with the disease. This also brings a huge economic cost to both the state and individuals.

According to numbers recently compiled at the U of U, the state of Utah paid an estimated $214 million in depression-related Medicaid and disability insurance in 2008. Add the costs of inpatient and outpatient treatment, medication, and lost productivity in the workplace, and the total price of depression in Utah reached $1.3 billion in 2008, according to the U estimate. With those large numbers, any treatment that improves outcomes not only could ease the life of thousands of Utah women but also would save millions of dollars.

“There has been a misunderstanding of how crippling and common this disease is in Utah,” says Renshaw, who’s also medical director of the Mental Illness Research, Education and Clinical Center at the Salt Lake City Veterans Affairs Health Care System. “It begs that we understand it better than we do.”

Creatine is a compound made in the human liver, kidneys, and pancreas; it is also found in meat and fish. Inside the body it is converted into phosphocreatine and stored in muscle. During high-intensity exercise, phosphocreatine donates its phosphate group to regenerate ATP, an important energy source for cells. For this reason, creatine has become a popular supplement among bodybuilders and athletes who are trying to add muscle mass or improve athletic ability.

How creatine works against depression is not precisely known, but Renshaw and his colleagues suggest that the pro-energetic effect of creatine supplementation, including the making of more phosphocreatine, may contribute to the earlier and greater response to antidepressants.

The eight-week study included 52 South Korean women, ages 19-65, with major depressive disorder. All the women took the antidepressant Lexapro (escitalopram) during the trial. Twenty-five of the women received creatine with the Lexapro and 27 were given a placebo. Neither the study participants nor the researchers knew who received creatine or placebo. Eight women in the creatine group and five in the placebo group did not finish the trial, leaving a total of 39 participants.

Participants were interviewed at the start of the trial to establish baselines for their depression, and then were checked at two, four, and eight weeks to see how they’d responded to Lexapro plus creatine or Lexapro and a placebo. The researchers used three measures to check the severity of depression, with the primary outcomes being measured by the Hamilton Depression Rating Scale (HDRS), a widely accepted test.

The group that received creatine showed significantly higher improvement rates on the HDRS at two and four weeks (32 percent and 68 percent) compared to the placebo group (3.7 percent and 29 percent). At the end of eight weeks, half of those in the creatine group showed no signs of depression compared with one-quarter in the placebo group. There were no significant adverse side effects associated with creatine.

Antidepressants typically don’t start to work until four to six weeks into treatment. But research shows that the sooner an antidepressant begins to work, the better the treatment outcome, which is why Renshaw and his colleagues are excited about the results of this first study. “Getting people to feel better faster is the Holy Grail of treating depression,” he says.

Study co-author Tae-Suk Kim, M.D., Ph.D., associate professor of psychiatry at the Catholic University of Korea College of Medicine and visiting associate professor of psychiatry at the U of U, already is recommending creatine for some of his female depression patients.

In prior studies, creatine had been shown to be effective only in female rats. But that shouldn’t rule out testing the supplement in men as well, according to Renshaw.

U of U researchers expect soon to begin another trial to test creatine in adolescent and college-age females who have not responded to SSRI medications. Principal investigator Douglas G. Kondo, M.D., assistant professor of psychiatry, says he is looking for 40 females between the ages of 13 and 21. Recruitment of participants will begin as soon as the U of U Institutional Review Board approves the study, which is expected in early July.

After the initial eight weeks of treatment, study participants will be offered a six-month extension of close supervision and monitoring by the research team and a board-certified child, adolescent, and adult psychiatrist at no charge.

Corticosteroids not effective for treating acute sinusitis
Clinical trial shows no clinical benefit

Corticosteroids, frequently prescribed to alleviate acute sinusitis, show no clinical benefit in treating the condition, according to a randomized controlled trial published in CMAJ (Canadian Medical Association Journal): http://www.cmaj.ca/site/press/cmaj.120430.pdf.

The common cold is the main cause of acute sinusitis, which is characterized by inflammation of the nasal cavities, blocked nasal passages and sometimes headaches and facial pain. Allergies and bacteria can also cause the condition, which is uncomfortable and difficult to treat. Antibiotics are a common treatment, despite the fact that the cause is often viral and will not respond to antibiotics. Corticosteroids are increasingly being used to alleviate symptoms, although the evidence for efficacy is inconclusive.

To determine the effectiveness of oral corticosteroids for acute rhinosinusitis (sinusitis), researchers from the Netherlands conducted a randomized, double-blind controlled trial involving 174 adults with clinically diagnosed acute rhinosinusitis. Eighty-eight patients were randomized to receive 30 mg/d of prednisolone for a week, and the remaining 86 received placebo. In the prednisolone group, 55 of 88 patients (62.5%) reported that their facial pain or pressure had resolved by day 7, versus 48 of 86 (55.8%) in the placebo group. Although there was a slight reduction of facial pain in the prednisolone group, the difference was neither statistically nor clinically significant. Other patient-relevant outcomes showed similar results.
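
Those proportions can be checked with a standard test for comparing two proportions. A minimal sketch using SciPy (our check; the trial’s own analysis may have differed) reproduces the non-significant result:

    from scipy.stats import chi2_contingency

    # Facial pain/pressure resolved by day 7: 55/88 prednisolone vs 48/86 placebo.
    table = [[55, 88 - 55],   # prednisolone: resolved, not resolved
             [48, 86 - 48]]   # placebo:      resolved, not resolved
    chi2, p_value, dof, expected = chi2_contingency(table)
    print(f"p = {p_value:.2f}")   # well above 0.05, so not statistically significant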

“We found no clinically relevant effect of systemic corticosteroid monotherapy among patients with clinically diagnosed, uncomplicated acute rhinosinusitis,” writes Dr. Roderick Venekamp, University Medical Centre Utrecht, Utrecht, the Netherlands, with coauthors.

“There is no rationale for the use of corticosteroids in the broad population of patients with clinically diagnosed acute rhinosinusitis,” write the authors. “Future studies should focus on identifying subgroups of patients who may benefit from intranasal or systemic corticosteroid treatment.”

GW Researcher finds depressive symptoms and suicidal thoughts in former finasteride users
WASHINGTON — (Aug 7, 2012) New research, to be published in the Journal of Clinical Psychiatry, finds that men who developed persistent sexual side effects while on finasteride (Propecia), a drug commonly used for male pattern hair loss, have a high prevalence of depressive symptoms and suicidal thoughts. The study, titled “Depressive Symptoms and Suicidal Thoughts Among Former Users of Finasteride With Persistent Sexual Side Effects,” was authored by Michael S. Irwig, M.D., an assistant professor of medicine in the Division of Endocrinology at the George Washington University School of Medicine and Health Sciences.

For the study, Dr. Irwig administered standardized interviews to 61 men who were former users of finasteride with persistent sexual side effects for more than three months. The interview gathered demographic information, medical and psychiatric histories, and information on medication use, sexual function, and alcohol consumption. All of the former finasteride users were otherwise healthy men with no baseline sexual dysfunction, medical conditions, psychiatric conditions or use of oral prescription medications. Dr. Irwig also conducted interviews with a control group of 29 men who had male pattern hair loss but who had never used finasteride and who denied any history of psychiatric conditions or use of psychiatric medications. Both groups self-administered the Beck Depression Inventory II (BDI-II), a widely used, validated instrument that measures the severity of depression in adults.

According to the total scores from the BDI-II, most former finasteride users exhibited some degree of depressive symptoms: 11% had mild symptoms; 28% had moderate symptoms; and 36% had severe symptoms. In addition, 44% reported suicidal thoughts. In the control group, 10% had mild depressive symptoms, with no cases of moderate or severe symptoms, and 3% reported suicidal thoughts.

“The potentially life-threatening side effects associated with finasteride should prompt clinicians to have serious discussions with their patients. The preliminary findings of this study warrant further research,” said Dr. Irwig.

Why do infants get sick so often?
University of Michigan Health System researchers reveal cell signaling prevents the growth of essential immune cells

Newborns and infants are more prone to viral infections. A U-M study shows this sensitivity is due in part to age-dependent suppression of natural killer cell production.

Researchers at the University of Michigan Health System are helping to answer a question that worries many parents: why do infants seem to get sick so often?

It has long been believed that, like walking and talking, the ability to fight viral infections is something children develop as they get older. But the U-M study suggests the machinery for fighting infection is present early on and is actively held in check.

The scientists found that key cell signals inhibit the growth of essential immune cells early in life. Blocking this signaling could improve an infant’s response to infection, according to the study, published online ahead of print in Nature Immunology.

“What happens at early age is that natural killer cells, like many other immune cells, do not complete their functional maturation until adulthood,” says study senior author Yasmina Laouar, Ph.D., assistant professor in the U-M Department of Microbiology and Immunology.

“During this time we are left with an immature immune system that cannot protect us against infections, the reason why newborns and infants are more prone to infection,” she says.

There is a large gap in understanding infant immunity, specifically why natural killer cell responses are deficient. The study by U-M immunologists demonstrates that a signaling protein called transforming growth factor beta (TGF-β) can explain why.

The study showed the production of natural killer cells is controlled by TGF-β, which is produced in the bone marrow. In infant mice, the maturation of natural killer cells progressed faster in the absence of TGF-β signaling.

By adulthood, mice had 10 times more mature natural killer cells if TGF-β signaling was blocked.

“Our overall goal was to determine the factors that constrain the production and maturation of natural killer cells early in life,” says Laouar. “To our surprise, we discovered that natural killer cells can complete maturation as early as 10 days of age if TGF-β signaling is blocked.”

The authors say it is tempting to propose functional inactivation of TGF-β signaling as a strategy to reverse the deficit of natural killer cells early in life, but additional testing will be required.

Thinner diabetics face higher death rate
New-onset diabetics with normal BMI have higher mortality rate than heavier diabetics

CHICAGO — American adults of a normal weight with new-onset diabetes die at a higher rate than overweight/obese adults with the same disease, according to a new Northwestern Medicine study.

The study, to be published in the Aug. 7 issue of JAMA, found that normal-weight participants experienced both significantly higher total and non-cardiovascular mortality than overweight/obese participants.

Normal-weight adults with type 2 diabetes have been understudied because those who typically develop the disease are overweight or obese. In this study, about 10 percent of those with new-onset diabetes were at a normal weight at the time of ascertainment.

Being overweight is a risk factor for developing this disease, but other risk factors such as family history, ethnicity and age may play a role.

“It could be that this is a very unique subset of the population who are at a particularly high risk for mortality and diabetes, and it is possible that genetics is a factor with these individuals,” said Mercedes R. Carnethon, associate professor of preventive medicine at Northwestern University Feinberg School of Medicine and first author of the study.

Older adults and nonwhite participants are more likely to experience normal-weight diabetes, according to the study.

“Many times physicians don’t expect that normal-weight people have diabetes when it is quite possible that they do and could be at a high risk of mortality, particularly if they are older adults or members of a minority group,” Carnethon said. “If you are of a normal weight and have new-onset diabetes, talk to your doctor about controlling your health risks, including cardiovascular risk factors.”

Researchers analyzed data from five cohort studies and identified 2,625 U.S. men and women over the age of 40 who were determined to have diabetes at the start of the studies. Some of these individuals already knew they were diabetic, and others found out through their participation in the studies.

Diabetes determination was based on a fasting glucose of 126 mg/dL or greater or newly initiated diabetes medication with concurrent measurements of body mass index (BMI). A participant of normal weight had a BMI of 18.5 to 24.99, while overweight/obese participants had a BMI of 25 or greater.
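
A minimal sketch of those classification rules as stated above (the function names are ours, for illustration):

    def new_onset_diabetes(fasting_glucose_mg_dl, newly_on_diabetes_meds=False):
        # Study criterion: fasting glucose >= 126 mg/dL or newly initiated medication.
        return fasting_glucose_mg_dl >= 126 or newly_on_diabetes_meds

    def weight_category(bmi):
        # Study categories: normal weight 18.5-24.99; overweight/obese >= 25.
        if bmi >= 25:
            return "overweight/obese"
        if bmi >= 18.5:
            return "normal weight"
        return "underweight (not studied)"

    print(new_onset_diabetes(130))   # True
    print(weight_category(22.0))     # normal weight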

With the aging and diversification of the population, cases of normal weight diabetes likely will be on the rise, Carnethon said. Future studies should focus on factors such as fat distribution and genetic types in normal-weight people with diabetes, she said.

COI declarations and off-label drug use
Conflict-of-interest statements made by physicians and scientists in their medical journal articles, after they had allegedly been paid by pharmaceutical manufacturers as part of off-label marketing programs, are often inadequate, according to a study by international researchers published in this week’s PLOS Medicine. The findings highlight the deficiencies of relying on author candidness and the weaknesses of some journals’ practices for ensuring proper disclosure. Off-label marketing is the promotion by a manufacturer of a drug for use in a condition or age group, or in a dose or form of administration, that has not been specifically approved by a drug regulatory body; it is illegal in the United States.

In an analysis led by Aaron Kesselheim from Brigham and Women’s Hospital in Boston, the authors from the US and Australia found that overall, only one in seven authors reported by whistleblowers to be involved in off-label marketing activities fully disclosed their conflict of interest in published articles.

The authors reached these conclusions by examining 26 whistleblower complaints of illegal off-label promotion and identifying the 91 doctors and scientists recorded as being involved in this practice. The researchers found that 39 (43%) of these 91 experts had authored 404 publications related to the drugs at issue in the complaints, but only 62 (15%) of these articles had adequate disclosures. Among the articles without adequate disclosures, 43% (148) had no disclosure at all, 4% had statements denying any conflicts of interest, 40% had disclosures that did not mention the drug manufacturer, and 13% had disclosures that mentioned the manufacturer but inadequately conveyed the nature of the relationship between author and drug manufacturer reported in the complaint.
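
The reported percentages can be reconstructed from the counts given; a quick check (ours), with rounding matching the article:

    total_articles = 404
    adequate = 62
    inadequate = total_articles - adequate       # 342 articles
    print(f"{adequate / total_articles:.0%}")    # 15% adequately disclosed
    print(f"{148 / inadequate:.0%}")             # 43% of the rest had no disclosure at all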

The authors argue that such failures in relation to off-label marketing schemes are especially troubling because off-label use is an area of clinical practice in which opinion is likely to be divided about appropriate care and there is no guidance from regulatory authorities. Therefore, high-profile “opinion leaders” may exert considerable influence on prescribing practices through their publications. The authors argue: “More adequate disclosure of financial ties in these situations would give readers an opportunity to weigh the potential for bias.”

The authors conclude: “Our findings suggest that approaches to controlling the effects of conflicts of interest that rely on author candidness and variable policing by journals have fallen short of the mark. Readers are left with little choice but to be sceptical.”

Doctors often don’t disclose all possible risks to patients before treatment
Most informed consent disputes involve disagreements about who said what and when, not stand-offs over whether a particular risk ought to have been disclosed. But doctors may “routinely underestimate the importance of a small set of risks that vex patients,” according to international experts writing in this week’s PLOS Medicine.

Increasingly, doctors are expected to advise and empower patients to make rational choices by sharing information that may affect treatment decisions, including risks of adverse outcomes. However, authors from Australia and the US led by David Studdert from the University of Melbourne argue that doctors, especially surgeons, are often unsure which clinical risks they should disclose and discuss with patients before treatment.

To understand more about the clinical circumstances in which disputes arise between doctors and patients in this area, the authors analyzed 481 malpractice claims and patient complaints from Australia involving allegations of deficiencies in the process of obtaining informed consent.

The authors found that 45 (9%) of the cases studied were disputed duty cases—that is, they involved head-to-head disagreements over whether a particular risk ought to have been disclosed before treatment. Two-thirds of these disputed duty cases involved surgical procedures, and the majority (38/45) of them related to five specific outcomes that had quality of life implications for patients, including chronic pain and the need for re-operation.

The authors found that the most common justifications doctors gave for not telling patients about particular risks before treatment were that they considered such risks too rare to warrant discussion or the specific risk was covered by a more general risk that was discussed.

However, nine in ten of the disputes studied centered on factual disagreements—arguments over who said what, and when. The authors say: “Documenting consent discussions in the lead-up to surgical procedures is particularly important, as most informed consent claims and complaints involved factual disagreements over the disclosure of operative risks.”

The authors say: “Our findings suggest that doctors may systematically underestimate the premium patients place on understanding certain risks in advance of treatment.”

They conclude: “Improved understanding of these situations helps to spotlight gaps between what patients want to hear and what doctors perceive patients want—or should want—to hear. It may also be useful information for doctors eager to avoid medico-legal disputes.”

Stress makes men appreciate heavier women
Increased stress in men is associated with a preference for heavier women, according to research published Aug. 8 in the open access journal PLOS ONE.

The researchers, led by Viren Swami of the University of Westminster in London, compared how stressed versus non-stressed men responded to pictures of female bodies varying from emaciated to obese. They found that the stressed group gave significantly higher ratings to the normal weight and overweight figures than the non-stressed group did, and that the stressed group generally had a broader range of figures they found attractive than the non-stressed group did. These results, the authors write, are consistent with the idea that people idealize mature morphological traits like heavier body size when they experience an environmental threat such as stress.

New Kenyan fossils shed light on early human evolution
NAIROBI, KENYA – Exciting new fossils discovered east of Lake Turkana confirm that there were two additional species of our genus – Homo – living alongside our direct human ancestral species, Homo erectus, almost two million years ago. The finds, announced in the prestigious scientific journal Nature on August 9th, include a face, a remarkably complete lower jaw, and part of a second lower jaw. They were uncovered between 2007 and 2009 by the Koobi Fora Research Project (KFRP), led by Meave and Louise Leakey. KFRP’s fieldwork was facilitated by the Turkana Basin Institute (TBI), and supported by the National Geographic Society, which has funded the KFRP since 1968.

Four decades ago, the KFRP discovered the enigmatic fossil known as KNM-ER 1470 (or “1470” for short). This skull, readily distinguished by its large brain size and long flat face, ignited a longstanding debate about just how many different species of early Homo lived alongside Homo erectus during the Pleistocene epoch. 1470’s unusual morphology was attributed by some scientists to sexual differences and natural degrees of variation within a single species, whereas others interpreted the fossil as evidence of a separate species.

This decades-old dilemma has endured for two reasons. First, comparisons with other fossils have been limited because 1470’s remains do not include its teeth or lower jaw. Second, no other fossil skull has mirrored 1470’s flat and long face, leaving in doubt just how representative these characteristics are. The new fossils address both issues.

“For the past 40 years we have looked long and hard in the vast expanse of sediments around Lake Turkana for fossils that confirm the unique features of 1470’s face and show us what its teeth and lower jaw would have looked like,” says Meave Leakey, co-leader of the KFRP and a National Geographic Explorer-in-Residence. “At last we have some answers.”

“Combined, the three new fossils give a much clearer picture of what 1470 looked like,” says Fred Spoor, leader of the scientific analyses. “As a result, it is now clear that two species of early Homo lived alongside Homo erectus. The new fossils will greatly help in unraveling how our branch of human evolution first emerged and flourished almost two million years ago.”

Found within a radius of just over 10 km from 1470’s location, the three new fossils are dated between 1.78 million and 1.95 million years old. The face KNM-ER 62000, discovered by field crew member Elgite Lokorimudang in 2008, is very similar to that of 1470, showing that the latter is not a single “odd one out” individual. Moreover, the face’s well-preserved upper jaw has almost all of its cheek teeth still in place, which for the first time makes it possible to infer the type of lower jaw that would have fitted 1470. A particularly good match can be found in the other two new fossils, the lower jaw KNM-ER 60000, found by Cyprian Nyete in 2009, and part of another lower jaw, KNM-ER 62003, found by Robert Moru in 2007. KNM-ER 60000 stands out as the most complete lower jaw of an early member of the genus Homo yet discovered.

The team working on the new finds included Christopher Kiarie (TBI), who carried out the laboratory preparation of the fossils, Craig Feibel (Rutgers University), who studied the age of the fossils, and Susan Antón (New York University), Christopher Dean (UCL, University College London), Meave and Louise Leakey (TBI, Kenya; and Stony Brook University, New York) and Fred Spoor (Max Planck Institute for Evolutionary Anthropology, Leipzig and UCL), who analysed the fossils. The National Geographic Society funded the fieldwork, the Leakey Foundation funded geological studies, and the Max Planck Society supported laboratory work.

Tai Chi shown to improve COPD exercise capacity
Tai Chi can be used as an effective form of exercise therapy for people with chronic obstructive pulmonary disease (COPD), according to new findings.

The research, published online ahead of print in the European Respiratory Journal, suggests that this form of exercise can improve exercise capacity and quality of life in people with COPD and may be as beneficial as pulmonary rehabilitation.

It is well known that moderate forms of exercise can help COPD patients to improve their exercise tolerance, symptoms of breathlessness and their overall quality of life. This new study aimed to investigate whether Sun-style Tai chi could be used as an effective form of exercise therapy.

This form of Tai Chi (Sun-style) has been shown to help people with chronic conditions such as arthritis; its less demanding movements make the martial art accessible to people of all ages.

Researchers from the Concord Repatriation General Hospital and the University of Sydney, Sydney, Australia, worked with 42 people with COPD. Half the group attended Tai Chi lessons twice a week, as well as performing Tai Chi at home, whereas the other half followed their usual medical management which did not include exercise.

Researchers tested the exercise capacity of all participants via a walking test and also asked all participants to complete the Chronic Respiratory Disease Questionnaire, which gives an indication of how the disease affects their quality of life. The exercise intensity of Tai Chi was measured in those participants who completed the Tai Chi training to assess whether it met the training requirements suggested for COPD patients.

Compared to the group completing the usual medical management, participants completing the Tai Chi exercise training could walk significantly longer in the walking test. They also had an increased score on the questionnaire, indicating a better quality of life.

The results also showed that the intensity of the Tai Chi was moderate, which met the recommendations for exercise training for people with COPD.

Lead author, Regina Wai Man Leung from the Concord Repatriation General Hospital, said: “With increasing numbers of people being diagnosed with COPD, it is important to provide different options for exercise that can be tailored to suit each individual. The results from this small sample provide compelling evidence that Tai Chi is an effective training programme for patients with COPD, and could be considered as an alternative to the usual exercise training programmes that are available in pulmonary rehabilitation.”

Crossing 5+ time zones more than doubles illness risk for elite athletes
Risk only apparent for outward-bound journeys, so unlikely to be due to air travel itself, say authors

Elite athletes who cross more than five time zones to compete are around two to three times as likely to get ill as when they compete on their home turf, suggests research published online in the British Journal of Sports Medicine.

The researchers tracked the daily health of 259 elite rugby players competing in the 2010 Super 14 Rugby Tournament.

In this annual tournament, 14 teams from Australia, South Africa and New Zealand compete over 16 weeks (February to May) at venues in all three countries, and in time zones varying from 2 to 11 hours’ difference from their own.

Games are played weekly to a high intensity international standard, accompanied by three to five weekly training sessions over the 16 week period.

The 8 team physicians were asked to complete a daily log of any illness that required medical attention for each member of their squad.

The rate of illness was calculated per 1000 player days; the total number of player days across all the teams, based on squad size multiplied by days of play, was 22,676.

Throughout the 16 weeks of the tournament, 469 illnesses were reported in 187 of the players (just over 72%), giving an overall incidence of just under 21 per 1000 player days.
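
The arithmetic behind these figures is simple enough to check. Here is a minimal Python sketch using the numbers reported above (the variable names are ours, purely for illustration):

    # Illness incidence per 1000 player days, using the figures reported above.
    illnesses = 469        # illnesses logged over the 16-week tournament
    player_days = 22676    # squad size x days of play, summed across teams
    players_ill = 187      # players reporting at least one illness
    players_total = 259    # players tracked

    incidence = illnesses / player_days * 1000   # ~20.7, "just under 21"
    share_ill = players_ill / players_total      # ~0.72, "just over 72%"
    print(f"{incidence:.1f} per 1000 player days; {share_ill:.1%} of players ill")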

But the rate varied considerably, depending on where the matches were played.

For matches played on home turf before international travel, the incidence was 15.4/1000 player days.

But this rose to 32.6/1000 player days for matches played in locations five or more time zones away from home, irrespective of the direction of travel.

For matches played on return back home after international travel, the incidence fell back to 10.6/1000 player days.

Almost one in three of all illnesses reported were respiratory conditions (just under 31%), followed by gut problems (27.5%) and skin and soft tissue conditions (22.5%). Infections accounted for most of the reported illnesses.

There was little difference in the number of infections reported for each of the months, although there was a slight fall in incidence during April.

It has been suggested that air travel might explain the higher risk of illness, but if that were the case, infection rates would also be higher after returning home, say the authors—at least for respiratory infections.

“The results from our study indicate that the illness risk is not directly related to the travel itself, but rather the arrival and location of the team at a distant destination,” write the authors.

They suggest that various stressors could be involved, including changes in pollution, temperature, allergens, humidity, altitude, as well as different food, germs, and culture.

Gum disease 4 times as common in rheumatoid arthritis patients
Might be potential trigger and/or help sustain inflammation suggest researchers

Gum disease is not only four times as common among patients with the autoimmune disease rheumatoid arthritis as it is among their healthy peers, but it also tends to be more severe, indicates a small study published online in the Annals of the Rheumatic Diseases.

The researchers base their findings on 91 adults with confirmed rheumatoid arthritis (RA) and a comparison group of 93 healthy people, matched for age and sex.

All participants were non-smokers, as smoking is a known risk factor for rheumatoid arthritis. Smoking is also strongly associated with the production of anti-citrullinated protein antibodies (ACPAs), which indicate an immune reaction to a person’s own proteins and often predate the development of rheumatoid arthritis by several years.

None of the participants had been treated with disease-modifying antirheumatic drugs (DMARDs).

Disease activity was quantified using a specific score and by measuring levels of inflammatory markers. The extent of gum disease was assessed by asking participants about their symptoms.

These included swollen and bleeding gums, sensitive teeth, loose teeth, and a history of tooth loss caused by gum disease. How far the gum had receded from the surface of the tooth, known as pocketing, was also measured using a probe.

Almost two thirds (just under 65%) of patients with rheumatoid arthritis had evidence of gum disease, compared with just over one in four (28%) of their healthy peers.

Overall, patients with rheumatoid arthritis were four times as likely to have gum disease, and their gum disease also tended to be more severe.
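
At first glance the fourfold figure may seem at odds with prevalences of 65% and 28%, whose simple ratio is closer to 2.3. The numbers are consistent if the fourfold figure is an odds ratio, as is usual in studies of this design; the release does not say so explicitly, so treat this as our reading. A quick Python check:

    # Checking whether "four times as likely" matches an odds ratio computed
    # from the reported prevalences (our assumption, not stated in the release).
    p_ra, p_ctrl = 0.65, 0.28   # gum disease prevalence: RA patients vs controls

    odds_ratio = (p_ra / (1 - p_ra)) / (p_ctrl / (1 - p_ctrl))   # ~4.8
    risk_ratio = p_ra / p_ctrl                                   # ~2.3
    print(f"odds ratio ~{odds_ratio:.1f}, risk ratio ~{risk_ratio:.1f}")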

The depth of pocketing was also significantly greater among those with rheumatoid arthritis and especially among those who tested positive for ACPA, compared with those in the healthy comparison group.

Those who tested positive for ACPA had had their rheumatoid arthritis for longer, had higher levels of disease activity, and higher levels of inflammatory markers than those who tested negative.

Porphyromonas gingivalis is one of the main bacteria behind gum disease, and it is also the only organism known to produce an enzyme capable of generating ACPA in gum tissue.

Even those with rheumatoid arthritis but without serious gum disease, as measured by pocketing, had symptoms such as bleeding and swollen gums and sensitive teeth; tooth loss, however, was seen only in those with serious gum disease.

But the authors suggest that mild gum disease in patients with rheumatoid arthritis may become more serious over time, particularly in those who test positive for ACPA: published research indicates that ACPA levels increase the longer a person has had rheumatoid arthritis.

“[Gum disease] is more common and severe in rheumatoid arthritis patients than in healthy controls…and could be a potential environmental trigger in the [development] and also in the maintenance of systemic inflammation in [the disease],” they conclude.

CU-Boulder-led team discovers new atmospheric compound tied to climate change, human health
New chemical pathway for the formation of sulfuric acid a big surprise, say researchers

An international research team led by the University of Colorado Boulder and the University of Helsinki has discovered a surprising new chemical compound in Earth’s atmosphere that reacts with sulfur dioxide to form sulfuric acid, which is known to have significant impacts on climate and health.

The new compound, a type of carbonyl oxide, is formed from the reaction of ozone with alkenes, which are a family of hydrocarbons with both natural and man-made sources, said Roy “Lee” Mauldin III, a research associate in CU-Boulder’s atmospheric and oceanic sciences department and lead study author. The study charts a previously unknown chemical pathway for the formation of sulfuric acid, which can result both in increased acid rain and cloud formation as well as negative respiratory effects on humans.

“We have discovered a new and important, atmospherically relevant oxidant,” said Mauldin. “Sulfuric acid plays an essential role in Earth’s atmosphere, from the ecological impacts of acid precipitation to the formation of new aerosol particles, which have significant climatic and health effects. Our findings demonstrate a newly observed connection between the biosphere and atmospheric chemistry.”

A paper on the subject is being published in the Aug. 9 issue of Nature.

Typically, the formation of sulfuric acid in the atmosphere occurs via the reaction between sulfur dioxide and the hydroxyl radical OH, which consists of a hydrogen atom and an oxygen atom and carries an unpaired electron that makes it highly reactive, Mauldin said. The trigger for the reactions that produce sulfuric acid is sunlight, which acts as a “match” to ignite the chemical process, he said.
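
In outline, the conventional sunlight-driven route and the newly reported ozone-alkene route can be sketched as follows (a simplified scheme based on standard atmospheric chemistry; the paper’s detailed mechanism is not reproduced in the release):

    Conventional route (requires sunlight to generate OH):
      SO2 + OH             ->  HOSO2
      HOSO2 + O2           ->  SO3 + HO2
      SO3 + H2O            ->  H2SO4

    Newly reported route (no sunlight needed):
      O3 + alkene          ->  carbonyl oxide (Criegee intermediate) + carbonyl
      carbonyl oxide + SO2 ->  SO3 + carbonyl
      SO3 + H2O            ->  H2SO4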

But Mauldin and his colleagues had suspicions that there were other processes at work when they began detecting sulfuric acid at night, particularly in forests in Finland — where much of the research took place — when the sun wasn’t present to catalyze the reaction. “There were a number of instances when we detected sulfuric acid and wondered where it was coming from,” he said.

In the laboratory, Mauldin and his colleagues combined ozone — which is ubiquitous in the atmosphere — with sulfur dioxide and various alkenes in a gas-analyzing instrument known as a mass spectrometer hooked up with a “flow tube” used to add gases. “Suddenly we saw huge amounts of sulfuric acid being formed,” he said.

Because the researchers wanted to be sure the hydroxyl radical OH was not reacting with the sulfur dioxide to make sulfuric acid, they added in an OH “scavenger” compound to remove any traces of it. Later, one of the research team members held up freshly broken tree branches to the flow tube, exposing hydrocarbons known as isoprene and alpha-pinene — types of alkenes commonly found in trees and which are responsible for the fresh pine tree scent.

“It was such a simple little test,” said Mauldin. “But the sulfuric acid levels went through the roof. It was something we knew that nobody had ever seen before.”

Mauldin said the new chemical pathway for sulfuric acid formation is of interest to climate change researchers because the vast majority of sulfur dioxide is produced by fossil fuel combustion at power plants. “With emissions of sulfur dioxide, the precursor of sulfuric acid, expected to rise globally in the future, this new pathway will affect the atmospheric sulfur cycle,” he said.

According to the U.S. Environmental Protection Agency, more than 90 percent of sulfur dioxide emissions are from fossil fuel combustion at power plants and other industrial facilities. Other sulfur sources include volcanoes and even ocean phytoplankton. It has long been known that when sulfur dioxide reacts with OH, it produces sulfuric acid that can form acid rain, shown to be harmful to terrestrial and aquatic life on Earth.

Airborne sulfuric acid particles — which form in a wide variety of sizes — play the main role in the formation of clouds, which can have a cooling effect on the atmosphere, he said. Smaller particles near the planet’s surface have been shown to cause respiratory problems in humans.

Mauldin said the newly discovered oxidant might help explain recent studies that have shown large parts of the southeastern United States might have cooled slightly over the past century. Particulates from sulfuric acid over the forests there may be forming more clouds than normal, cooling the region by reflecting sunlight back to space.

Leveraging bacteria in drinking water to benefit consumers
Contrary to popular belief, purified drinking water from home faucets contains millions to hundreds of millions of widely differing bacteria per gallon, and scientists have discovered a plausible way to manipulate those populations of mostly beneficial microbes to potentially benefit consumers. Their study appears in ACS’ journal Environmental Science & Technology.

Lutgarde Raskin and colleagues Ameet Pinto and Chuanwu Xi explain that municipal water treatment plants typically try to minimize the growth of microbes in the huge filters that remove small particles and substances that can serve as nutrients for bacterial growth. These facilities also add chlorine or other disinfectants to kill bacteria and prevent them from thriving in water distribution pipes. Nevertheless, it’s not possible to totally eliminate bacteria with current technology, making it important to determine how the filter and other water treatment steps impact the types and amounts of bacteria that remain. That’s why the researchers set out to do this in a study at a treatment plant in Ann Arbor, Mich.

Their research provides suggestions on how to change which bacteria wind up in the drinking water. The scientists found that certain types of bacteria attach to the filters, where they form biofilms from which small clumps can break off and make it into the drinking water supply. The water’s pH was a strong factor in determining which bacteria made it into the drinking water. Measures as simple as varying the water pH or changing how the filters are cleaned could help water treatment plant workers shift the balance toward bacteria that are beneficial for humans by making it harder for harmful bacteria to compete.

Boys appear to be more vulnerable than girls to the insecticide chlorpyrifos
Lower IQs seen in boys exposed in the womb to comparable amounts of the chemical

A new study is the first to find a difference between how boys and girls respond to prenatal exposure to the insecticide chlorpyrifos. Researchers at the Columbia Center for Children’s Environmental Health (CCCEH) at the Mailman School of Public Health found that, at age 7, boys had greater difficulty with working memory, a key component of IQ, than girls with similar exposures. On the plus side, having nurturing parents improved working memory, especially in boys, although it did not lessen the negative cognitive effects of exposure to the chemical.

Results are published online in the journal Neurotoxicology and Teratology.

In 2011, research led by Virginia Rauh, ScD, Co-Deputy Director of CCCEH, established a connection between prenatal exposure to chlorpyrifos and deficits in working memory and IQ at age 7. Earlier this year, a follow-up study showed evidence in MRI scans that even low to moderate levels of exposure during pregnancy may lead to long-term, potentially irreversible changes in the brain. The latest study, led by Megan Horton, PhD, explored the impact of sex differences and the home environment on these health outcomes.

Dr. Horton and colleagues looked at a subset of 335 mother-child pairs enrolled in the ongoing inner-city study of environmental exposures, including measures of prenatal chlorpyrifos in umbilical cord blood. When the children reached age 3, the researchers measured the home environment using the Home Observation for Measurement of the Environment (HOME) criteria, including two main categories: 1) environmental stimulation, defined as the availability of intellectually stimulating materials in the home and the mother’s encouragement of learning; and 2) parental nurturance, defined as attentiveness, displays of physical affection, encouragement of delayed gratification, limit setting, and the ability of the mother to control her negative reactions. The researchers tested IQ at age 7.

While home environment and sex had no moderating effect on IQ deficits related to chlorpyrifos exposure, the researchers uncovered two intriguing findings related to sex differences, albeit of borderline statistical strength: first, that chlorpyrifos exposure had a greater adverse cognitive impact in boys as compared to girls, lowering working memory scores by an average of three points more in boys than girls (96.5 vs. 99.8); and second, that parental nurturing was associated with better working memory, particularly in boys.

“There’s something about boys that makes them a little more susceptible to both bad exposures and good exposures,” says Dr. Horton. “One possible explanation for the greater sensitivity to chlorpyrifos is that the insecticide acts as an endocrine disruptor to suppress sex-specific hormones. In a study of rats, exposure to the chemical reduced testosterone, which plays a critical role in the development of the male brain.”

Going forward, Dr. Horton will look at how sex and the home environment may influence the effects of prenatal exposure to other environmental toxicants, such as those found in air pollution. “I expect this information will be useful in efforts to develop new interventions to protect children from the potentially negative consequences of early exposure to harmful chemicals,” says Dr. Horton.

The insecticide chlorpyrifos was widely used in homes until 2001 when the U.S. Environmental Protection Agency restricted indoor residential use, permitting continued commercial and agricultural applications. Since that time, a drop in residential levels of chlorpyrifos has been documented by Robin Whyatt, DrPH, Co-Deputy Director of CCCEH. The chemical continues to be present in the environment through its widespread use in agriculture (food and feed crops), wood treatments, and public spaces such as golf courses, some parks, and highway medians. People near these sources can be exposed by inhaling the chemical, which drifts on the wind. Low-level exposure can also occur by eating fruits and vegetables that have been sprayed with chlorpyrifos. Although the chemical is degraded rapidly by water and sunlight outdoors, it has been detected by the Columbia researchers in many urban residences several years after the ban went into effect. Many developing countries continue to use chlorpyrifos in the home setting.

Internal Medicine Physicians Recommend Principles on Role of Governments and Legislation in Regulating Patient-Physician Relationship
American College of Physicians paper expresses concern about laws that cross traditional boundaries and intrude into the realm of medical professionalism

August 8, 2012

(Washington) – The American College of Physicians (ACP) today released a paper, Statement of Principles on the Role of Governments in Regulating the Patient-Physician Relationship, which recommends principles for the role of federal and state governments in health care and the patient-physician relationship.

“The physician’s first and primary duty is to put the patient first,” David L. Bronson, MD, FACP, president of ACP, said. “To accomplish this duty, physicians and the medical profession have been granted by government a privileged position in society.”

Dr. Bronson noted, though, that “some recent laws and proposed legislation appear to inappropriately infringe on clinical medical practice and patient-physician relationships, crossing traditional boundaries and intruding into the realm of medical professionalism.”

Pointing to examples in ACP’s paper, he expressed concern about laws that interfere, or have the potential to interfere, with appropriate clinical practice by:

- prohibiting physicians from discussing with or asking their patients about risk factors that may affect their health or the health of their families, as recommended by evidence-based guidelines of care;
- requiring physicians to discuss specific practices that in the physician’s best clinical judgment are not individualized to the patient;
- requiring physicians to provide diagnostic tests or medical interventions that are not supported by evidence or clinical relevance; or
- limiting information that physicians can disclose to patients.
The paper, produced by ACP’s Health and Public Policy Committee with input from ACP’s Ethics, Professionalism and Human Rights Committee, offers a framework for evaluating laws and regulations affecting the patient-physician relationship, rather than taking a position on the specific issues cited by lawmakers to impose particular restrictions or mandates.

ACP’s paper states that:

- “Physicians should not be prohibited by law or regulation from discussing with or asking their patients about risk factors, or disclosing information to the patient, which may affect their health, the health of their families, sexual partners, and others who may be in contact with the patient.”
- “Laws and regulations should not mandate the content of what physicians may or may not say to patients or mandate the provision or withholding of information or care that, in the physician’s clinical judgment and based on clinical evidence and the norms of the profession, are not necessary or appropriate for a particular patient at the time of a patient encounter.”
ACP recommends seven questions that should be asked about any proposed law to impose restrictions on the patient-physician relationship:

1. Is the content and information or care consistent with the best available medical evidence on clinical effectiveness and appropriateness and professional standards of care?
2. Is the proposed law or regulation necessary to achieve public health objectives that directly affect the health of the individual patient, as well as population health, as supported by scientific evidence, and if so, are there no other reasonable ways to achieve the same objectives?
3. Could the presumed basis for a governmental role be better addressed through advisory clinical guidelines developed by professional societies?
4. Does the content and information or care allow for flexibility based on individual patient circumstances and on the most appropriate time, setting and means of delivering such information or care?
5. Is the proposed law or regulation required to achieve a public policy goal – such as protecting public health or encouraging access to needed medical care – without preventing physicians from addressing the healthcare needs of individual patients during specific clinical encounters based on the patient’s own circumstances, and with minimal interference to patient-physician relationships?
6. Does the content and information to be provided facilitate shared decision-making between patients and their physicians, based on the best medical evidence, the physician’s knowledge and clinical judgment, and patient values (beliefs and preferences), or would it undermine shared decision-making by specifying content that is forced upon patients and physicians without regard to the best medical evidence, the physician’s clinical judgment and the patient’s wishes?
7. Is there a process for appeal to accommodate individual patients’ circumstances?
Asking such questions of any proposed law before a decision is made on its adoption, Dr. Bronson concluded, will give legislators appropriate guidance and help them avoid enacting ill-considered laws that “can cause grave damage to the patient-physician relationship and medical professionalism and undermine the quality of care.”

Chronic exposure to staph bacteria may be risk factor for lupus, Mayo study finds
ROCHESTER, Minn. — Chronic exposure to even small amounts of staph bacteria could be a risk factor for the chronic inflammatory disease lupus, Mayo Clinic research shows. Staph, short for Staphylococcus aureus, is a germ commonly found on the skin or in the nose, sometimes causing infections. In the Mayo study, mice were exposed to low doses of a protein found in staph and developed a lupus-like disease, with kidney disease and autoantibodies like those found in the blood of lupus patients.

The findings are published online this month in The Journal of Immunology. The next step is to study lupus patients to see if the staph protein in question plays a similar role in humans, says co-author Vaidehi Chowdhary, M.D., a Mayo Clinic rheumatologist.

“We think this protein could be an important clue to what may cause or exacerbate lupus in certain genetically predisposed patients,” Dr. Chowdhary says. “Our hope is to confirm these findings in lupus patients and hopefully prevent flares.”

Another key question is whether treating at-risk people to eradicate staph could prevent lupus from developing in the first place. Lupus occurs when the immune system attacks tissues and joints. It may affect virtually any part of the body and can be tough to diagnose because it often mimics other ailments. There is no cure, only treatment to control symptoms. Lupus is more commonly diagnosed in women, African-Americans, Hispanics, Asians and people aged 15 to 40.

The cause is often unknown; it appears people genetically predisposed to lupus may develop it when something in the environment triggers it, such as infections, certain drugs or even sunlight.

Physicians do not really know what causes lupus, so the discovery of the staph protein’s possible role is exciting, Dr. Chowdhary says.

In the mice studied, a staph protein known as staphylococcal enterotoxin B, or SEB, activated autoreactive T and B lymphocytes, types of white blood cell, leading to an inflammatory illness mirroring lupus. Research in people has shown that carrying staph bacteria is linked to autoimmune diseases such as psoriasis, Kawasaki disease and granulomatosis with polyangiitis.

Iron, vitamins could affect physical fitness in adolescents
Bethesda, Md. (Aug. 8, 2012)—Adolescence is an important time not only for growing but for acquiring healthy habits that will last a lifetime, such as choosing foods rich in vitamins and minerals, and adopting a regular exercise regimen. Unfortunately, several studies have shown that adolescents’ intake of important nutrients, as well as their performance on standard physical fitness tests, has fallen in recent years. Because nutrition and fitness are intertwined—for example, iron forms part of hemoglobin, which carries oxygen to muscles, and antioxidants such as vitamin C aid in rebuilding damage after intense training—these two findings could be related. In a new study, researchers have found that adolescents’ blood levels of various micronutrients are correlated with how well they performed in certain physical fitness tests. Though these results don’t prove causality, they suggest a new relationship between different measures of adolescent health.

The article is entitled “Iron and Vitamin Status Biomarkers and its Association with Physical Fitness in Adolescents. The HELENA Study.” and is online at http://bit.ly/Q2j6lJ. It appears in the online edition of the Journal of Applied Physiology, a publication of the American Physiological Society.

Methodology

Researcher Luis Gracia-Marco of the University of Zaragoza, Spain, and his colleagues relied on data from a larger, long-term research project known as the Healthy Lifestyle in Europe by Nutrition in Adolescents Cross-Sectional Study, or HELENA-CSS. Part of this study, which involved thousands of volunteers between the ages of 12.5 and 17.5 in cities scattered across Europe, gathered nutrition and physical fitness data. Blood samples taken from one third of the volunteers (n=1089) were tested for a variety of iron and vitamin biomarkers: hemoglobin (indicative of iron status), soluble transferrin receptor, serum ferritin, retinol, vitamin C, beta-carotene, alpha-tocopherol, vitamin B6, cobalamin, holo-transcobalamin, plasma folate, RBC folate and vitamin D. The volunteers’ physical fitness was assessed with a standing long jump test, which measures lower-body muscular strength, and a 20-meter shuttle run test, which estimates cardiorespiratory fitness via maximal oxygen consumption (VO2max). When looking for correlations between the biomarker levels and physical fitness, the researchers took into account the adolescents’ age, time of year, latitude of the city they lived in, body mass index, age of menarche in females, and amount of regular physical activity (measured with accelerometers).
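
Adjusted associations of this kind are usually estimated with a regression model. As a purely illustrative sketch in Python (the data file and column names are hypothetical; the release does not describe the authors’ actual statistical models):

    # Illustrative only: association between one biomarker (hemoglobin) and
    # cardiorespiratory fitness (VO2max), adjusted for the confounders the
    # authors list. File and column names are hypothetical.
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("helena_subset.csv")   # hypothetical extract of the data

    model = smf.ols(
        "vo2max ~ hemoglobin + age + season + latitude + bmi + activity",
        data=df,
    ).fit()
    print(model.summary())   # coefficient on hemoglobin = adjusted association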

Results

The researchers found that blood levels of certain micronutrients were closely linked to the volunteers’ performance on the physical fitness tests. For cardiorespiratory fitness, concentrations of hemoglobin, retinol and vitamin C in males, and of beta-carotene and vitamin D in females, were associated with VO2max. For muscular fitness, concentrations of hemoglobin, beta-carotene, retinol and alpha-tocopherol in males, and of beta-carotene and vitamin D in females, were associated with better performance on the standing long jump test.

Importance of the Findings

The authors note that research connecting micronutrients such as the ones they measured with physical fitness has been limited and controversial in any population. This is especially true for adolescents, a group that is often difficult to gather data on. This new study, they say, is one of the first to find associations between micronutrients and physical fitness in this age group, and it has the strength of controlling the results for a comprehensive set of relevant confounders. Even so, they caution that more research needs to be done.

“The associations between physical fitness and iron or vitamin status observed in this cross-sectional study in adolescents should be followed up by a study specifically designed to evaluate causal relationships,” the authors write.

Eating grapes may help protect heart health in men with metabolic syndrome, new study suggests
Grapes reduced blood pressure, improved blood flow and reduced inflammation

Fresno, Calif. – Consuming grapes may help protect heart health in people with metabolic syndrome, according to new research published in the Journal of Nutrition. Researchers observed a reduction in key risk factors for heart disease in men with metabolic syndrome: reduced blood pressure, improved blood flow and reduced inflammation. Natural components found in grapes, known as polyphenols, are thought to be responsible for these beneficial effects.

The randomized, placebo-controlled, crossover study, led by principal investigator Dr. Maria Luz Fernandez and Jacqueline Barona, a PhD student in Dr. Fernandez’ lab at the Department of Nutritional Sciences of the University of Connecticut, recruited men between 30 and 70 years of age with metabolic syndrome. The study is believed to be the first to look at the impact of grapes on metabolic syndrome.

Metabolic syndrome is a cluster of conditions that occur together: increased blood pressure, a high blood sugar level, excess body fat around the waist, low HDL (the “good” cholesterol) and increased blood triglycerides. Together these significantly increase the risk of heart disease, stroke and diabetes. Metabolic syndrome is a major public health concern and is on the rise in the U.S.

In this study, participants were randomly assigned to consume grapes, in the form of a freeze-dried whole grape powder, or a placebo powder, for four weeks. Then, following a 3-week “washout” period where neither grapes nor placebo were consumed, individuals were allocated to the alternate treatment. This powerful study design allowed investigators to compare the response of each individual to consumption of both the placebo and grapes.
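
Analytically, a crossover design of this kind calls for within-subject (paired) comparisons, since each man’s grape-period measurements are compared with his own placebo-period measurements. A minimal Python sketch, with made-up numbers purely for illustration:

    # Paired comparison for a crossover design: each participant's systolic
    # blood pressure after the grape period vs after the placebo period.
    # The values below are invented placeholders, not study data.
    from scipy.stats import ttest_rel

    bp_grape   = [128, 131, 125, 136, 122, 130]   # mmHg, grape powder
    bp_placebo = [134, 135, 129, 141, 127, 133]   # mmHg, placebo

    t_stat, p_value = ttest_rel(bp_grape, bp_placebo)
    print(f"paired t = {t_stat:.2f}, p = {p_value:.4f}")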

The study results showed that for each of the study’s subjects, grape consumption resulted in significant decreases in blood pressure, improved blood flow (greater vasodilation), and decreases in a compound associated with inflammation.

“These results suggest that consuming grapes can improve important risk factors associated with heart disease, in a population that is already at higher risk,” said Fernandez. “This further supports the accumulating evidence that grapes can positively influence heart health, and extends it to men with metabolic syndrome.”

These reports are compiled with appreciation for all the doctors, scientists, and other medical researchers who sacrificed their time and effort in order to give people the ability to empower themselves, without base aspirations for fame or fortune. Just honorable people, doing honorable things.
