Health Technology Research Synopsis
116th Issue, Date: 25NOV2011
Compiled By Ralph Turchiano
Editors Top Five:
1. Milk thistle stops lung cancer in mice
2. Garlic oil component may form treatment to protect heart
3. On Track to Getting Even Fatter
4. 3 p.m. slump? Why a sugar rush may not be the answer
5. Today’s teens will die younger of heart disease
In this Issue:
1. Polio Still a Threat to Public Health
2. Tear drops may rival blood drops in testing blood sugar in diabetes
3. Do plants perform best with family or strangers? Researchers consider social interactions
4. Testing of seafood imported into the US is inadequate
5. Severe alcoholic hepatitis: An effective combination of 2 treatments
6. Diet and supplements: What’s good and bad for kidney disease patients
7. Low vitamin C levels may raise heart failure patients’ risk
8. Future obesity may be predicted at 3.5 years of age
9. Parkinson’s disease risk greater in those exposed to trichloroethylene
10. High childhood IQ linked to subsequent illicit drug use
11. Contraceptive pill associated with increased prostate cancer risk worldwide
12. Suggested link between radon and skin cancer
13. ‘Stomach flu’ may be linked to food allergies
14. Researchers confirm new cancer-causing virus
15. Is a stranger genetically wired to be trustworthy? You’ll know in 20 seconds
16. Niacin Does Not Reduce Heart Attack, Stroke Risk in Stable Cardiovascular Patients Whose Cholesterol is Well-Controlled to Treatment Guidelines. (READ)
17. Milk thistle stops lung cancer in mice
18. Illegal drug use is associated with abnormal weight in teens
19. Garlic oil component may form treatment to protect heart
20. 3 p.m. slump? Why a sugar rush may not be the answer
21. On Track to Getting Even Fatter
22. Probiotic protects intestine from radiation injury
23. Today’s teens will die younger of heart disease
Public release date: Monday, November 7, 2011
Polio Still a Threat to Public Health
Live virus used in polio vaccine can evolve and infect, warns TAU researcher
Health professionals and researchers across the globe believe they are on the verge of eradicating polio, a devastating virus which can lead to paralysis and death. Despite successful eradication in most countries, there are still four countries where the virus is considered endemic — and many more in which the virus still lurks.
Dr. Lester Shulman of Tel Aviv University’s Sackler Faculty of Medicine and the Israeli Ministry of Health has spent years tracking isolated cases of live poliovirus infections, often discovered in countries that are supposedly polio-free. When the live-virus version of the vaccine, called Oral Polio Vaccine (OPV), evolves, he says, it can act like wild poliovirus and continue the threat of contagion.
Medical professionals widely believe that after the wild virus is eradicated, resources dedicated to polio immunization can be redirected. But this isn’t so, he says. He recommends that public health agencies take a three-pronged approach: Vaccination policies to maintain “herd immunity” (a 95 percent immunization rate for polio) should be maintained to prevent the spread of wild and evolved vaccine strains of the virus; environmental surveillance of sewage systems should continue; and a switch to Inactivated Polio Vaccine (IPV) instead of OPV should be implemented.
Dr. Shulman’s research was recently published in PLoS ONE. He has also been invited as an informal expert to the World Health Organization’s annual meeting on polio this fall.
A decade-long chase
While the eradication of polio is seemingly within reach, this is not the time to relax, Dr. Shulman warns. Most countries only investigate the possibility of poliovirus outbreaks when paralytic cases appear in the human population. But this doesn’t take into account a potential problem posed by the live virus vaccine. Over time, the vaccine can mutate, and even a 1 percent genomic change in the virus permits the virus to behave like a wild poliovirus. If a population isn’t sufficiently immunized, this spells trouble.
Israel is among the few countries that practice environmental surveillance for polio, a program it began in 1989. Checking designated sites along sewage systems every month for evidence of the virus allows for early detection before there are paralytic cases. For the past decade, the researchers have been trying to trace the origin of the strain that infected two individuals in Central Israel. They tracked the strains to the sewage system, and have been working to pinpoint the origin. Fortunately, because Israel maintains herd immunity for the disease, the wider population has not been threatened.
Dr. Shulman says that in the lab, each strain of the virus can be identified from its genomic structure and traced back to the region from which it originated. “From the sequence of the genome, you can match it with known sequences reported by labs throughout the world,” he explains. For example, he and his colleagues traced a wild poliovirus discovered in sewage from the Gaza District to a village in Egypt.
New hope for curing persistent infections
Convinced by the efficacy of Israel’s environmental surveillance program, many other countries are starting to develop tracking programs of their own. As a result, they are finding evidence of vaccine-derived polio cases in humans. Paradoxically, Dr. Shulman sees a beacon of hope in these discoveries. As labs across the world report more cases, researchers gain a better understanding of how polioviruses establish persistent infections and can then develop effective measures to eliminate them.
The researchers are now working to develop compounds that can effectively fight these rare cases of persistent poliovirus infection. So far, they have seen promising results, noting that the mutant strains have not become resistant to the drugs under investigation. But for now, Dr. Shulman recommends that health authorities continue immunization using inactivated vaccines (IPV) to keep their populations safe.
Public release date: 9-Nov-2011
Tear drops may rival blood drops in testing blood sugar in diabetes
Scientists are reporting development and successful laboratory testing of an electrochemical sensor device that has the potential to measure blood sugar levels from tears instead of blood — an advance that could save the world’s 350 million diabetes patients the discomfort of pricking their fingers for droplets of blood used in traditional blood sugar tests. Their report appears in ACS’ journal Analytical Chemistry.
Mark Meyerhoff and colleagues explain that about 5 percent of the world’s population (and about 26 million people in the U.S. alone) have diabetes. The disease is a fast-growing public health problem because of a sharp global increase in obesity, which makes people susceptible to developing type 2 diabetes. People with diabetes must monitor their blood glucose levels several times a day to make sure they are within a safe range. Current handheld glucose meters require a drop of blood, which patients draw by pricking their fingers with a small pin or lancet. However, some patients regard that pinprick as painful enough to discourage regular testing. That’s why Meyerhoff’s team is working to develop a new, pain-free device that can use tear glucose levels as an accurate reflection of blood sugar levels.
Tests of their approach in laboratory rabbits, used as surrogates for humans in such experiments, showed that levels of glucose in tears track the amounts of glucose in the blood. “Thus, it may be possible to measure tear glucose levels multiple times per day to monitor blood glucose changes without the potential pain from the repeated invasive blood drawing method,” say the researchers.
Public release date: 9-Nov-2011
Do plants perform best with family or strangers? Researchers consider social interactions
In the fight for survival, plants are capable of complex social behaviours and may exhibit altruism towards family members, but aggressively compete with strangers.
A growing body of work suggests plants recognize and respond to the presence and identity of their neighbours. But can plants cooperate with their relatives? While some studies have shown that siblings perform best — suggesting altruism towards relatives — other studies have shown that when less related plants grow together the group can actually outperform siblings. This implies the group benefits from its diversity by dividing precious resources effectively and competing less.
A team from McMaster University suggests plants can benefit from both altruism and biodiversity but when these processes occur at the same time, it is difficult to predict the outcome.
“The greatest challenge for understanding plant social interactions is we can’t interpret plant behaviours as easily as we do those of animals,” explains Susan Dudley, an associate professor in the Department of Biology at McMaster. “Though we have shown plants change traits in the presence of relatives, we need to determine if this is cooperation. Linking the plant behaviours with their benefits is challenging when multiple processes co-occur.”
Dudley and a team of researchers disentangle the sometimes contradictory research in the latest edition of the Proceedings of the Royal Society B, describing how the identity and presence of neighbours affect many processes acting on plant populations.
The problem, she says, is that plant social interactions are treated as a black box, with researchers only looking at the output, or the fitness of the plant, in sibling competition. But they need to investigate the mechanisms inside the box — by describing how traits of individuals affect fitness — to understand how the output is reached and which mechanisms are occurring to get there.
“Simply put, social environment matters to plants. If we first acknowledge that kin cooperation and resource partitioning are co-occurring, we can begin to address some very important questions,” says Amanda File, a graduate student in the Department of Biology at McMaster.
“Among these questions is whether there is a link between kin recognition and plant performance, whether plant kin recognition can improve crop yield and how kin recognition shapes communities and ecosystems” says Guillermo Murphy, a graduate student in the Department of Biology at McMaster.
Public release date: 9-Nov-2011
Testing of seafood imported into the US is inadequate
Finfish, shrimp, and seafood products are some of the most widely traded foods and about 85 percent of seafood consumed in the U.S. is imported. A new study by researchers from the Johns Hopkins Center for a Livable Future at the Bloomberg School of Public Health shows that testing of imported seafood by the U.S. Food and Drug Administration (FDA) is inadequate for confirming its safety or identifying risks. The findings, published this month in Environmental Science and Technology, highlight deficiencies in inspection programs for imported seafood across four of the world’s largest importing bodies and show which types of aquatic animals, and from which countries, are most often failing inspection. The study identified a lack of inspection in the U.S. compared to its peers: only 2 percent of all seafood imported into the U.S. is tested for contamination, while the European Union, Japan and Canada inspect as much as 50 percent, 18 percent, and 15 percent, respectively, of certain imported seafood products. When testing in the U.S. does occur, residues of drugs used in aquaculture, or “fish farms,” are sometimes found; above certain concentrations, these drugs are harmful to humans.
David Love, PhD, lead author of the study, and colleagues at the Johns Hopkins Center for a Livable Future, acquired data on seafood inspection programs from governmental websites and from direct queries to governmental bodies. They analyzed the number of violations of drug residue standards as a function of species of aquatic animal, exporting country, drug type, import volume and concentration of residue.
Their findings indicate there is an insufficient body of data for evaluating the health risks associated with drug residues in U.S. seafood imports. “Data made accessible to the public by the FDA precludes estimation of exposures to veterinary drugs incurred by the U.S. population,” said Keeve Nachman, PhD, a study co-author. Researchers encountered a lack of transparency in U.S. testing protocol and policy. One example of the FDA’s opacity is that its public records do not specify when fish pass inspection or whether testing was performed on random samples or targeted samples; these distinctions are critical to accurate assessment of the prevalence of the drug residues.
Love and colleagues’ results showed that the FDA tests for 13 types of drug residues, in contrast to inspection agencies in Europe and Japan that test for 34 and 27 drugs, respectively. This discrepancy suggests that seafood producers can use many drugs for which the U.S. does not screen. The drug residues detected suggest that veterinary drugs continue to be used in aquaculture in developing countries, which can lead to adverse health consequences, including the development of antibiotic-resistant bacteria on fish farms and their spread through seafood products.
Imports to the U.S., E.U., Canada and Japan with the highest frequency of drug violations were shrimp or prawns, eel, crabs, catfish or pangasius, tilapia and salmon. Vietnam, China, Thailand, Indonesia, Taiwan, India, and Malaysia were identified as the exporters to the U.S., E.U., Canada and Japan with the most drug violations.
According to Love, “Consumers should be familiar with the country-of-origin and whether the animal was wild-caught or farm-raised.” Love adds, “Fortunately, this information has been listed on all raw or lightly processed seafood products in grocery stores since 2005, following the Country of Origin Labeling (COOL) law.”
“Imported seafood may carry risks in terms of food safety because the FDA does not have the resources to proactively and regularly inspect foreign facilities, and it relies on product testing as a last resort,” said Love. To minimize the risks of seafood imports and to raise U.S. testing standards to match those of other countries, the authors recommend that the FDA budget be expanded to allow for more exhaustive testing and hiring of more inspectors.
Public release date: 10-Nov-2011
Severe alcoholic hepatitis: An effective combination of 2 treatments
Acute alcoholic hepatitis is one of the most serious forms of alcoholic liver disease, affecting individuals with chronic excessive alcohol consumption, which generally equates to more than 50 grams of alcohol per day (roughly five drinks) over a period of more than three to six months. The disease is characterized by liver failure (hepatic insufficiency) and acute jaundice (icterus), which may induce a coma through liver failure (hepatic encephalopathy), with a death rate of 40-45% within the first six months. Conventional treatment involves stopping alcohol consumption (alcohol abstinence) and administering cortisone (corticosteroid therapy over a one-month period) to fight the highly inflammatory nature of the disease. Despite this treatment, 30-35% of patients with acute alcoholic hepatitis still die within six months, since the cortisone-based treatment remains insufficient.
Given the frequent deficiency in the antioxidant capacity of the alcoholic liver (described above), combined with the chronic inflammation associated with the disease, the aim of the study coordinated by Professor Nguyen-Khas (Amiens University Hospital Centre and the Inserm research team) was to combine an antioxidant treatment with the conventional anti-inflammatory treatment. The results show an improved survival rate for patients suffering from acute alcoholic hepatitis who were given both medication types, with significantly fewer deaths within one month of treatment compared to the group who received the cortisone-only treatment. Tolerance of the treatment was good. The new, very low-cost medicine (N-acetylcysteine) is a molecule that has long been used to treat hepatitis caused by drugs such as paracetamol, and as a mucolytic to thin bronchial secretions.
These results improve the prognosis for severe alcoholic hepatitis patients.
Public release date: 11-Nov-2011
Diet and supplements: What’s good and bad for kidney disease patients
Pomegranate juice helps manage blood pressure; Many kidney disease patients take potentially harmful supplements
Pomegranate juice lowers kidney disease patients’ cholesterol, blood pressure, and the need for blood pressure medications.
More than 15% of kidney disease patients take herbs or dietary supplements that the National Kidney Foundation says may be harmful to their health.
Two studies presented during the American Society of Nephrology’s Annual Kidney Week provide new information on dietary benefits and dangers in kidney disease patients.
Lilach Shema, PhD (Western Galilee Medical Center in Israel) and colleagues investigated the long-term effects of drinking pomegranate juice on heart disease risk factors — such as high cholesterol and blood pressure — in kidney disease patients. Pomegranate juice is rich in antioxidants and has been touted as having a variety of health benefits.
The researchers randomized 101 dialysis patients to receive about three-and-a-half ounces of pomegranate juice or placebo, three times a week. After one year, the number of blood pressure drugs patients took decreased in 22% of patients drinking pomegranate juice compared to 7.7% in the placebo group, while an increase was documented in 12.2% of patients drinking pomegranate juice compared to 34.6% in the placebo group. Patients who drank pomegranate juice also had healthier blood pressure and cholesterol levels and less plaque build-up in their arteries. These results suggest that drinking pomegranate juice might decrease the high rates of illness and death among kidney disease patients.
Another team led by Vanessa Grubbs, MD (University of California, San Francisco) looked at the use of dietary supplements among patients with chronic kidney disease (CKD). The National Kidney Foundation identifies 39 herbs that may be harmful to CKD patients, but no one knows how many of these patients take them.
Using data from the 1999 to 2008 annual National Health and Nutrition Examination Survey (NHANES), a program of studies designed to assess the health and nutritional status of adults and children in the United States, the investigators examined the reported use of dietary supplements in the past 30 days among 21,169 adults. While an estimated 52.4% of participants reported taking any dietary supplement, 15.3% reported taking at least one potentially harmful supplement. Use of supplements was not statistically different by CKD severity.
Many CKD patients use potentially harmful supplements, possibly because they are unaware of the risks. “Although people tend to think of dietary supplements as healthy, many contain ingredients that can actually be harmful to the kidneys,” said Dr. Grubbs. Healthcare providers, too, may be unaware that some supplements are potentially harmful and that patients with CKD are taking them. Further research and education are warranted.
Ralph’s Note – I really don’t know where this article is going. Pomegranate Juice is great, and imaginary supplements bad?
Public release date: 13-Nov-2011
Low vitamin C levels may raise heart failure patients’ risk
Low levels of vitamin C were associated with higher levels of high sensitivity C-Reactive protein (hsCRP) and shorter intervals without major cardiac issues or death for heart failure patients, in research presented at the American Heart Association’s Scientific Sessions 2011.
Compared to those with high vitamin C intake from food, heart failure patients in the study who had low vitamin C intake were 2.4 times more likely to have higher levels of hsCRP, a marker for inflammation and a risk factor for heart disease.
The study is the first to demonstrate that low vitamin C intake is associated with worse outcomes for heart failure patients.
Study participants with low vitamin C intake and hsCRP over 3 milligrams per liter (mg/L) were also nearly twice as likely to die from cardiovascular disease within one year of follow-up.
“We found that adequate intake of vitamin C was associated with longer survival in patients with heart failure,” said Eun Kyeung Song, Ph.D., R.N., lead author of the study and assistant professor at the Department of Nursing, College of Medicine, in the University of Ulsan in Korea.
The average age among the 212 patients in the study was 61, and about one-third were women. Approximately 45 percent of the participants had moderate to severe heart failure.
Participants completed a four-day food diary verified by a registered dietitian, and a software program calculated their vitamin C intake. Blood tests measured hsCRP.
Researchers divided participants into one group with levels over 3 mg/L of hsCRP and another with lower levels. Patients were followed for one year to determine the length of time to their first visit to the emergency department due to cardiac problems or death.
Researchers found that 82 patients (39 percent) had inadequate vitamin C intake, according to criteria set by the Institute of Medicine. These criteria allowed the researchers to estimate the likelihood that a patient’s diet was habitually deficient in vitamin C based on a four-day food diary. After one year of follow-up, 61 patients (29 percent) had cardiac events, which included an emergency department visit or hospitalization due to cardiac problems, or cardiac death.
The researchers found that 98 patients (46 percent) had hsCRP over 3 mg/L, according to Song.
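The proportions reported for this cohort follow directly from the patient counts given above; a quick arithmetic check:

```python
# Verify the reported proportions from the counts given in the article.
total = 212  # patients in the study

counts = {
    "inadequate vitamin C intake": 82,  # reported as 39 percent
    "cardiac events at one year": 61,   # reported as 29 percent
    "hsCRP over 3 mg/L": 98,            # reported as 46 percent
}

for label, n in counts.items():
    print(f"{label}: {n}/{total} = {100 * n / total:.0f}%")
```

Each computed percentage matches the figure reported in the study summary.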
Inflammatory pathways in heart failure patients may be why vitamin C deficiency contributed to poor health outcomes, the data suggests.
“Increased levels of high-sensitivity C-reactive protein means a worsening of heart failure,” Song said. “An adequate level of vitamin C is associated with lower levels of high-sensitivity C-reactive protein. This results in a longer cardiac event-free survival in patients.”
The use of diuretics may also play a role because vitamin C is water soluble and diuretics increase the amount of water excreted from the kidneys, said Terry Lennie, Ph.D., R.N., study author and associate dean of Ph.D. studies in the College of Nursing at the University of Kentucky in Lexington, Kentucky.
“Diet is the best source of vitamin C,” Lennie said. “Eating the recommended five servings of fruits and vegetables a day provides an adequate amount.”
More randomized controlled trials and longitudinal prospective studies are needed to determine the impact of other micronutrients on survival or rehospitalization, Song said.
Public release date: 13-Nov-2011
Future obesity may be predicted at 3.5 years of age
Children of mothers who smoked and were overweight during pregnancy are most at risk
Researchers can predict which children are most likely to become obese by examining their mothers’ behaviour around their birth, according to a recent University of Montreal study published in the Archives of Pediatric and Adolescent Medicine. “Although behaviour is extremely hard to change and is also influenced by a complex tangle of influencing factors in the environment, I hope these findings will help improve the social and medical services we offer to mothers and infants,” said lead author Laura Pryor, a PhD candidate at the university’s Department of Social and Preventive Medicine. The findings come as the province of Quebec, like other societies, grapples with a surge in childhood obesity over the last generation.
Pryor and the study team, led by Sylvana Côté, analyzed data drawn from the Quebec Longitudinal Study of Child Development that ran from 1998 to 2006. Quebec is fortunate in that it is able to offer scientists this kind of data, enabling them to look at how a situation evolves over time. Scientists studying this kind of phenomenon in other areas must often rely on cross-sectional studies that are based on data collected at a specific time for a specific purpose. The team focused on 1,957 children whose height and weight measurements had been taken yearly, from the age of five months to eight years old, and recorded in a database. This information enabled the team to look at the development of the children’s body mass index (BMI). BMI is calculated as weight in kilograms divided by height in meters squared. The researchers identified three trajectory groups: children with low but stable BMI, children with moderate BMI, and children whose BMI was elevated and rising, called high-rising BMI.
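As a concrete illustration of the BMI formula just described (the example values below are hypothetical, not drawn from the study):

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight in kilograms divided by height in meters squared."""
    return weight_kg / height_m ** 2

# Hypothetical adult example: 70 kg at 1.75 m tall
print(round(bmi(70.0, 1.75), 1))  # 22.9
```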
“We discovered the trajectories of all three groups were similar until the children were about two and a half,” Pryor said. “Around that point the BMIs of the high-rising group of children began to take off. By the time these children moved into middle childhood, more than 50 per cent of them were obese according to international criteria.” Researchers found two factors that may explain this: the mothers’ weight around the time they gave birth and whether the mothers smoked. A child with a mother who was overweight or who smoked during pregnancy was significantly more likely to be in the high-rising group. These two factors were found to be much more important than the other criteria that were studied, such as the child’s birth weight.
The risk factors identified here represent increased probabilities of becoming overweight, not direct causes. More research will be required to determine how these early-life factors and others are correlated with childhood obesity. “Our research adds to the growing evidence that the perinatal environment has an important influence on later obesity,” Pryor said. “This points to the need for early interventions with at-risk families in order to prevent the development of childhood weight problems and the intergenerational transmission of ill health. I would like to conduct further studies to find out what happens to these kids once they reach adolescence, and I hope that my research will help in the development of strategies to combat this serious public health issue.”
Public release date: 14-Nov-2011
Parkinson’s disease risk greater in those exposed to trichloroethylene
Symptoms of disease may appear 10 to 40 years following exposure
A novel study in twins found that exposure to trichloroethylene (TCE) — a hazardous organic contaminant found in soil, groundwater, and air — is significantly associated with an increased risk of Parkinson’s disease (PD). The possibility of developing this neurodegenerative disease is also linked to perchloroethylene (PERC) and carbon tetrachloride (CCl4) exposure, according to the study appearing today in Annals of Neurology, a journal published by Wiley-Blackwell on behalf of the American Neurological Association and Child Neurology Society.
The National Institute of Neurological Disorders and Stroke (NINDS) estimates that as many as 500,000 Americans have PD and more than 50,000 new cases are diagnosed annually. While there is much debate regarding cause of PD, studies suggest that genetic and environmental factors likely trigger the disease which is characterized by symptoms such as limb tremors, slowed movement, muscle stiffness, and speech impairment. Several studies have reported that exposure to solvents may increase risk of PD, but research assessing specific agents is limited.
The current epidemiological study, led by Drs. Samuel Goldman and Caroline Tanner with The Parkinson’s Institute in Sunnyvale, California, investigated exposure to TCE, PERC and CCl4 and risk of developing PD. The team interviewed 99 twin pairs from the National Academy of Sciences/National Research Council World War II Veteran Twins Cohort in which one twin had PD and one didn’t, inquiring about lifetime occupations and hobbies. Lifetime exposures to six specific solvents previously linked to PD in medical literature — n-hexane, xylene, toluene, CCl4, TCE and PERC — were inferred for each job or hobby.
The findings are the first to report a significant association between TCE exposure and PD: a more than 6-fold increased risk. Researchers also found that exposure to PERC and CCl4 tended toward a significantly increased risk of developing the disease. “Our study confirms that common environmental contaminants may increase the risk of developing PD, which has considerable public health implications,” commented Dr. Goldman.
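In a discordant-twin design like this one, risk estimates are typically computed as a matched-pair odds ratio: only pairs where exactly one twin was exposed contribute. The sketch below uses hypothetical pair counts (not the study’s data) purely to show the arithmetic behind a roughly 6-fold figure:

```python
def matched_pair_odds_ratio(case_exposed_only: int, control_exposed_only: int) -> float:
    """Odds ratio for a matched case-control (e.g., twin-pair) design.

    case_exposed_only: discordant pairs where the affected twin was exposed
                       and the unaffected twin was not.
    control_exposed_only: discordant pairs where only the unaffected twin
                          was exposed.
    Concordant pairs (both or neither exposed) drop out of the estimate.
    """
    return case_exposed_only / control_exposed_only

# Hypothetical counts chosen only to illustrate a roughly 6-fold risk:
print(matched_pair_odds_ratio(12, 2))  # 6.0
```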
TCE, PERC and CCl4 have been used extensively worldwide, with TCE noted as a common agent in dry-cleaning solutions, adhesives, paints, and carpet cleaners. Despite the Food and Drug Administration (FDA) banning the use of TCE as a general anesthetic, skin disinfectant, and coffee decaffeinating agent in 1977, it is still widely used today as a degreasing agent. In the U.S., millions of pounds of TCE are still released into the environment each year and it is the most common organic contaminant found in ground water, detected in up to 30% of drinking water supplies in the country.
While this study focused on occupational exposures, the solvents investigated are pervasive in the environment. The authors suggest that replication of well-characterized exposures in other populations is necessary. Dr. Goldman concluded, “Our findings, as well as prior case reports, suggest a lag time of up to 40 years between TCE exposure and onset of PD, providing a critical window of opportunity to potentially slow the disease process before clinical symptoms appear.”
In a release issued on September 28, 2011 the Environmental Protection Agency (EPA) announced that TCE is carcinogenic to humans.
Public release date: 14-Nov-2011
High childhood IQ linked to subsequent illicit drug use
A high childhood IQ may be linked to subsequent illegal drug use, particularly among women, suggests research published online in the Journal of Epidemiology and Community Health.
The authors base their findings on data from just under 8,000 people in the 1970 British Cohort Study, a large ongoing population-based study, which looks at lifetime drug use, socioeconomic factors, and educational attainment.
The IQ scores of the participants were measured at the ages of 5 and 10 years, using a validated scale, and information was gathered on self reported levels of psychological distress and drug use at the age of 16, and again at the age of 30 (drug use only).
Drug use included cannabis; cocaine; uppers (speed and wiz); downers (blues, tanks, barbiturates); LSD (acid); and heroin.
By the age of 30, around one in three men (35.4%) and one in six women (15.9%) had used cannabis, while 8.6% of men and 3.6% of women had used cocaine, in the previous 12 months.
A similar pattern of use was found for the other drugs, with overall drug use twice as common among men as among women.
When intelligence was factored in, the analysis showed that men with high IQ scores at the age of 5 were, 25 years later, around 50% more likely than those with low scores to have used amphetamines, ecstasy, and several other illicit drugs.
The link was even stronger among women, who were more than twice as likely to have used cannabis and cocaine as those with low IQ scores.
The same associations emerged between a high IQ score at the age of 10 and subsequent use of cannabis, ecstasy, amphetamines, multiple drug use and cocaine, although this last association was only evident at the age of 30.
The findings held true, irrespective of anxiety/depression during adolescence, parental social class, and lifetime household income.
“Although most studies have suggested that higher child or adolescent IQ prompts the adoption of a healthy lifestyle as an adult, other studies have linked higher childhood IQ scores to excess alcohol intake and alcohol dependency in adulthood,” write the authors.
Although it is not yet clear exactly why there should be a link between high IQ and illicit drug use, the authors point to previous research showing that highly intelligent people are open to experiences and keen on novelty and stimulation.
Other research has also shown that brainy children are often easily bored and suffer at the hands of their peers for being different, “either of which could conceivably increase vulnerability to using drugs as an avoidant coping strategy,” explain the authors.
Public release date: 14-Nov-2011
Contraceptive pill associated with increased prostate cancer risk worldwide
Oral contraceptive use is associated with prostate cancer: An ecological study
Use of the contraceptive pill is associated with an increased risk of prostate cancer around the globe, finds research published in BMJ Open.
Prostate cancer is the most common male cancer in the developed world and the use of the contraceptive pill has soared over the past 40 years, say the authors.
The research team used data from the International Agency for Research on Cancer (IARC) and the United Nations World Contraceptive Use report to pinpoint rates of prostate cancer and associated deaths and the proportion of women using common methods of contraception for 2007.
They then analysed the data for individual nations and continents worldwide to see if there was any link between use of the contraceptive pill and illness and death caused by prostate cancer.
Their calculations showed that use of intrauterine devices, condoms, or other vaginal barriers was not associated with an increased risk of prostate cancer.
But use of the contraceptive pill in the population as a whole was significantly associated with both the number of new cases of, and deaths from, prostate cancer in individual countries around the world, the analysis showed. These findings were not affected by a nation’s wealth.
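As a rough sketch of the kind of country-level (ecological) analysis described here, one could compute a correlation between national pill-use rates and prostate cancer incidence. The per-country figures below are invented for illustration only; the actual study used IARC and UN data for 2007:

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-country figures: percent of women using the pill,
# and prostate cancer incidence per 100,000 men (invented numbers).
pill_use  = [5, 10, 15, 20, 25, 30]
incidence = [20, 35, 40, 55, 60, 80]
print(round(pearson(pill_use, incidence), 2))  # strong positive correlation in this toy data
```

As the article stresses, such a correlation across whole populations cannot establish cause and effect; it only flags an association worth investigating.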
The authors emphasise that their research is speculative and designed to prompt further consideration of the issues. As such, their analysis does not confirm cause and effect, and therefore definitive conclusions cannot be drawn, as yet.
But they refer to several recent studies which have suggested that oestrogen exposure may boost the risk of prostate cancer.
Excess oestrogen exposure is known to cause cancer, and it is thought that widespread use of the Pill might raise environmental levels of endocrine disruptive compounds (EDCs) – which include by-products of oral contraceptive metabolism.
These don’t break down easily, so can be passed into the urine and end up in the drinking water supply or the food chain, exposing the general population, say the authors.
“Temporal increases in the incidence of certain cancers (breast, endometrial, thyroid, testis and prostate) in hormonally sensitive tissues in many parts of the industrialised world are often cited as evidence that widespread exposure of the general population to EDCs has had adverse impacts on human health,” they write.
Public release date: 14-Nov-2011
Suggested link between radon and skin cancer
A new study published this week suggests that a link may exist between radon exposure and non-melanoma skin cancer
Researchers from the European Centre for Environment & Human Health (part of the Peninsula College of Medicine & Dentistry) have detected a connection following analysis of data on radon exposure and skin cancer cases from across southwest England. The study, which looked at small geographical areas across Devon and Cornwall, builds upon a similar study conducted 15 years ago.
Radon is a naturally occurring, radioactive gas found in soil and bedrock common in parts of the southwest. It has been recognised as a minor contributor to cases of lung cancer, but so far there has been no firm evidence to suggest it has wider health implications.
Whilst both radon levels and skin cancer incidence in the southwest are amongst the highest in the UK, the study found no association between household radon levels and either malignant melanoma or basal cell carcinoma, the most common form of skin cancer. However, a link was found between areas with high radon concentrations and a particular type of non-melanoma skin cancer called squamous cell carcinoma.
The analysis took account of the way population characteristics, exposure to sunshine and proximity to the coast vary across the region. However, the researchers highlighted people’s exposure to ultraviolet (UV) radiation from the sun as a particularly difficult factor to account for, especially as this represents an important risk factor for developing skin cancer.
Despite the limitations of the study, the researchers feel it is an important area needing further investigation. Lead author Dr Ben Wheeler said:
“We know that naturally occurring radon is a contributing factor to a small proportion of lung cancers, but there is limited evidence of other health implications. These findings suggest that the issue of radon and skin cancer deserves a much closer look and we’re planning to develop a more detailed study capable of detecting a direct relationship, if one actually exists”.
Public release date: 14-Nov-2011
‘Stomach flu’ may be linked to food allergies
Researchers at the Medical College of Wisconsin have found a possible link between norovirus, a virus that causes “stomach flu” in humans, and food allergies. The findings are published in The Open Immunology Journal, Volume 4, 2011.
Mitchell H. Grayson, M.D., associate professor of pediatrics, medicine, microbiology and molecular genetics at the Medical College, and a pediatric allergist practicing at Children’s Hospital of Wisconsin, is the corresponding author of the paper.
The researchers took mice infected with norovirus and fed them egg protein. They then examined the mice for signs of an immunoglobulin E, or IgE, response against the food protein; an IgE response is what leads to an allergic reaction. The team has previously shown an IgE response to an inhaled protein during a respiratory infection in another mouse model, which suggests early respiratory infections in children could lead to allergic diseases like asthma later in childhood. Likewise, an IgE response to a gastrointestinal virus could signify a likelihood of developing a food allergy after the viral infection.
Six million children in the United States have food allergies, and the Centers for Disease Control reports an 18 percent increase in the prevalence of food allergies from 1997 to 2007. Every three minutes, a food allergy sends a child to the emergency room.
“Food allergies are a dangerous, costly health issue not only in the United States, but worldwide,” said Dr. Grayson. “This study provides additional support for the idea that allergic disease may be related to an antiviral immune response, and further studies are planned to continue exploring the exact series of events that connect the antiviral response with allergic diseases.”
Public release date: 14-Nov-2011
Researchers confirm new cancer-causing virus
Common cytomegalovirus has central role in salivary gland cancer and possibly other malignancies
An important new study from the Laboratory for Developmental Genetics at USC has confirmed cytomegalovirus (CMV) as a cause of the most common salivary gland cancers. CMV joins a group of fewer than 10 identified oncoviruses — cancer-causing viruses — including HPV.
The findings, published online in the journal Experimental and Molecular Pathology over the weekend, are the latest in a series of studies by USC researchers that together demonstrate CMV’s role as an oncovirus, a virus that can either trigger cancer in healthy cells or exploit mutant cell weaknesses to enhance tumor formation.
Lead author Michael Melnick, professor of developmental genetics in the Ostrow School of Dentistry of USC and Co-Director of the Laboratory for Developmental Genetics, said the conclusion that CMV is an oncovirus came after rigorous study of both human salivary gland tumors and salivary glands of postnatal mice.
CMV’s classification as an oncovirus has important implications for human health. The virus, which has an extremely high prevalence in humans, can cause severe illness and death in patients with compromised immune systems and can cause birth defects if a woman is exposed to CMV for the first time while pregnant. It may also be connected to other cancers besides salivary gland cancer, Melnick added.
“CMV is incredibly common; most of us likely carry it because of our exposure to it,” he said. “In healthy patients with normal immune systems, it becomes dormant and resides inactive in the salivary glands. No one knows what reactivates it.”
This study illustrates not only that the CMV in the tumors is active but also that the amount of virus-created proteins found is positively correlated with the severity of the cancer, Melnick said.
Previous work with mice satisfied other important criteria needed to link CMV to cancer. After salivary glands obtained from newborn mice were exposed to purified CMV, cancer developed. In addition, efforts to stop the cancer’s progression identified how the virus was acting upon the cells to spark the disease.
Thus, the team not only uncovered the connection between CMV and mucoepidermoid carcinoma, the most common type of salivary gland cancer, but also identified a specific molecular signaling pathway, the same in humans and mice, that the virus exploits to create tumors.
“Typically, this pathway is only active during embryonic growth and development,” Melnick said, “but when CMV turns it back on, the resulting growth is a malignant tumor that supports production of more and more of the virus.”
The study was conducted by Melnick with Ostrow School of Dentistry of USC colleagues Tina Jaskoll, professor of developmental genetics and co-director of the Laboratory for Developmental Genetics; Parish Sedghizadeh, director of the USC Center for Biofilms and associate professor of diagnostic sciences; and Carl Allen at The Ohio State University.
Jaskoll said salivary gland cancers can be particularly problematic because they often go undiagnosed until they reach a late stage. And since the affected area is near the face, surgical treatment can be quite extensive and seriously detrimental to a patient’s quality of life.
However, with the new information about CMV’s connection to cancer comes hope for new prevention and treatment methods, perhaps akin to the development of measures to mitigate human papilloma virus (HPV) after its connection to cervical cancer was established. Jaskoll added that the mouse salivary gland model created to connect CMV to cancer might also be used to design more effective treatments.
“This could allow us to have more rational design of drugs used to treat these tumors,” she said.
Melnick said that in the not too distant future, he expects much more information about viruses and their connections to cancer and other health issues seemingly unrelated to viral infection to emerge.
“This should be a most fruitful area of investigation for a long time to come,” he said. “This is just the tip of the iceberg with viruses.”
Public release date: 14-Nov-2011
Is a stranger genetically wired to be trustworthy? You’ll know in 20 seconds
By Yasmin Anwar, Media Relations | November 14, 2011
There’s definitely something to be said for first impressions. New research from the University of California, Berkeley, suggests it can take just 20 seconds to detect whether a stranger is genetically inclined to being trustworthy, kind or compassionate.
The findings reinforce that healthy humans are wired to recognize strangers who may help them out in a tough situation. They also pave the way for genetic therapies for people who are not innately sympathetic, researchers said.
“It’s remarkable that complete strangers could pick up on who’s trustworthy, kind or compassionate in 20 seconds when all they saw was a person sitting in a chair listening to someone talk,” said Aleksandr Kogan, lead author of the study and a postdoctoral student at the University of Toronto at Mississauga.
Two dozen couples participated in the UC Berkeley study, and each provided DNA samples. Researchers then documented the couples as they talked about times when they had suffered. Video was recorded only of the partners as they took turns listening.
A separate group of observers who did not know the couples were shown 20-second video clips of the listeners and asked to rate which seemed most trustworthy, kind and compassionate, based on their facial expressions and body language.
The listeners who got the highest ratings for empathy, it turned out, possess a particular variation of the oxytocin receptor gene known as the GG genotype.
“People can’t see genes, so there has to be something going on that is signaling these genetic differences to the strangers,” Kogan said. “What we found is that the people who had two copies of the G version displayed more trustworthy behaviors – more head nods, more eye contact, more smiling, more open body posture. And it was these behaviors that signaled kindness to the strangers.”
The study, which builds on previous UC Berkeley research on the human genetic predisposition to empathy, is published in the Nov. 14 online issue of the journal Proceedings of the National Academy of Sciences. An earlier UC Berkeley study looked at three combinations of gene variations of the oxytocin receptors AA, AG and GG.
It found that the people who were most empathetic – in that they were able to accurately interpret others’ emotions – had two copies of the “G allele.” In contrast, members of the AA and AG allele groups were found to be less capable of putting themselves in the shoes of others and more likely to get stressed out in difficult situations.
Widely known as the “cuddle” or “love” hormone, oxytocin is secreted into the bloodstream and the brain, where it promotes social interaction, bonding and romantic love, among other functions.
Kogan pointed out that having the AA or AG instead of the GG genotype does not mark a person as unsympathetic.
“What ultimately makes us kind and cooperative is a mixture of numerous genetic and non-genetic factors. No one gene is doing the trick. Instead, each of these many forces is a thread pulling a person in one direction or another, and the oxytocin receptor gene is one of these threads,” Kogan said.
His coauthors are UC Berkeley psychologist Dacher Keltner; Laura Saslow, a postdoctoral student at UCSF; Emily Impett, an assistant professor of psychology at the University of Toronto; Christopher Oveis, an assistant professor at UC San Diego; and Sarina Saturn, an assistant professor of psychology at Oregon State University.
Public release date: 14-Nov-2011
Niacin Does Not Reduce Heart Attack, Stroke Risk in Stable, Cardiovascular Patients Whose Cholesterol is Well-Controlled to Treatment Guidelines. (READ)
Release Date: November 15, 2011
ORLANDO, Fla. — In patients whose bad cholesterol is very well-controlled by statins over a long time period, the addition of high-dose, extended-release niacin did not reduce the risk of cardiovascular events, including heart attack and stroke.
That is the finding being reported today (Nov. 15) at the American Heart Association annual meeting by the study’s co-principal investigator, University at Buffalo professor of medicine William E. Boden, MD; the results are also being published as the lead article in today’s New England Journal of Medicine.
Boden was co-principal investigator, along with Jeffrey Probstfield, MD, professor of cardiology and medicine at the University of Washington. Both researchers led the Atherothrombosis Intervention in Metabolic Syndrome with Low HDL/High Triglycerides: Impact on Global Health (AIM-HIGH) study.
The trial’s purpose was to find out if — in the setting of well-treated LDL (“bad” cholesterol levels) and low HDL (“good” cholesterol levels)/elevated triglycerides — there was an incremental benefit of adding extended-release niacin. Unexpectedly, Boden explains, most patients enrolled in the trial met existing guideline recommendations for LDL and non-HDL levels, and therefore would not have been considered candidates for further lipid-modifying therapy.
Many patients with stable heart and vascular disease are still at high risk for cardiac death, heart attack or stroke even after their LDL cholesterol has reached ideal levels — between 40 and 80 mg/dL on statin therapy. It is believed that this increased residual risk occurs because they have too little HDL cholesterol along with high levels of triglycerides.
In the AIM-HIGH study, 1,718 patients received a high-dose (1,500 to 2,000 mg per day) of extended-release niacin, while 1,696 patients received a placebo.
After two years, HDL and triglyceride levels improved in the niacin group, with a 25 percent increase in good cholesterol, a 29 percent drop in triglycerides and a further decrease in bad cholesterol of approximately 12 percent. By contrast, in the placebo group, there was minimal change, with a 10 percent increase in good cholesterol and an eight percent drop in triglycerides.
The trial found that adding high-dose, extended-release niacin to statin treatment in these well-controlled patients with cardiovascular disease and low HDL did not further reduce the risk of cardiovascular events, including heart attack and stroke.
Because of the lack of benefit, the National Heart, Lung and Blood Institute, upon the recommendation of its Data Safety Monitoring Committee, decided to stop the trial 18 months before its planned completion.
“If you are a patient with stable cardiovascular disease who has achieved and maintained very low levels of LDL cholesterol on a statin for a long time period, these research findings indicate the addition of high-dose niacin does not improve your risk for future events, and is not needed,” explains Boden.
He cautions, however, that these results do not apply to the majority of patients seen in routine clinical settings, where more than 80 percent are unable to lower their cholesterol levels to the degree seen in AIM-HIGH.
“The AIM-HIGH trial was designed to study extended-release niacin or Niaspan, in a specific, narrowly defined patient population,” says Boden. “That is why the results of AIM-HIGH cannot be extrapolated to apply to a broader patient population, especially higher-risk patients admitted for heart attack or acute coronary syndrome, for example, or those whose LDL, or non-HDL levels, are not as well-controlled as those in AIM-HIGH, where prior studies have shown benefit.
“The more relevant observation is that, in this modern era of statin therapy, we’ve made profound progress in controlling LDL,” Boden continues. “However, based on these results, physicians should not assume that boosting HDL levels with Niaspan is without merit.”
Ralph’s Note – So, they stopped the study after 18 months even though there were no negative outcomes, and even though the patients’ cholesterol profiles had dramatically improved. Sounds more like someone was upset with the results and wanted to end it early. So much for double-blind crossover placebo studies.
Public release date: 15-Nov-2011
Milk thistle stops lung cancer in mice
November 15, 2011 By Garth Sundem
Tissue with wound-like conditions allows tumors to grow and spread. In mouse lung cancer cells, treatment with silibinin, a major component of milk thistle, removed the molecular billboards that signal these wound-like conditions and so stopped the spread of these lung cancers, according to a recent study published in the journal Molecular Carcinogenesis.
Though the natural extract has been used for more than 2,000 years, mostly to treat disorders of the liver and gallbladder, this is one of the first carefully controlled and reported studies to find benefit.
Here is how it works:
Basically, in a cell there can be a chain of signals, one leading to the next, and eventually to an end product. So if you would like to eliminate an end product, you can look to break a link in the signaling chain that leads to it. The end products COX2 and iNOS are enzymes involved in the inflammatory response to perceived wounds – both can aid tumor growth. Far upstream in the signaling chain that leads to these unwanted enzymes are STAT1 and STAT3, transcription factors that bind DNA and keep the signal cascade going, eventually driving the production of harmful COX2 and iNOS.
Stop STAT1 and STAT3 and you break the chain that leads to COX2 and iNOS – and the growth of lung tumors along with them.
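This break-the-chain logic can be pictured as reachability in a small directed graph. The sketch below is an illustrative abstraction only: the node names mirror the text, but the topology is a simplification, not the actual biochemistry.

```python
# Model the signaling cascade as a directed graph and check whether the
# end products remain reachable once STAT1/STAT3 are inhibited.
SIGNALING = {
    "upstream_signal": ["STAT1", "STAT3"],
    "STAT1": ["COX2", "iNOS"],
    "STAT3": ["COX2", "iNOS"],
}

def reachable(graph, start, target, blocked=frozenset()):
    """Depth-first search that skips inhibited (blocked) nodes."""
    stack, seen = [start], set()
    while stack:
        node = stack.pop()
        if node in blocked or node in seen:
            continue
        if node == target:
            return True
        seen.add(node)
        stack.extend(graph.get(node, []))
    return False

print(reachable(SIGNALING, "upstream_signal", "COX2"))                      # True
print(reachable(SIGNALING, "upstream_signal", "COX2", {"STAT1", "STAT3"}))  # False
```

Blocking the STAT1/STAT3 "link" leaves no path from the upstream signal to COX2 or iNOS, which is the effect attributed to silibinin here.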
“This relatively nontoxic substance – a derivative of milk thistle, called silibinin – was able to inhibit the upstream signals that lead to the expression of COX2 and iNOS,” says Alpna Tyagi, PhD, of the University of Colorado Skaggs School of Pharmacy. Tyagi works in the lab of University of Colorado Cancer Center investigator Rajesh Agarwal, PhD.
In addition, Tyagi and collaborators compared the effects of silibinin to drugs currently in clinical trials for lung cancer. Would drugs that target other signaling pathways – other linked chains – similarly cut into the production of COX2 and iNOS?
It turned out that inhibiting the chains of JAK1/2 and MEK in combination and also inhibiting the signaling pathways of EGFR and NF-kB in combination blocked the ability of STAT1 and STAT3 to trap the energy they needed to eventually signal COX2 and iNOS production.
Compared to these multi-million dollar drugs, naturally-occurring silibinin blocked not only the expression of COX2 and iNOS, but also the migration of existing lung cancer cells.
“What we showed is that STAT1 and STAT3 may be promising therapeutic targets in the treatment of lung cancer, no matter how you target them,” Tyagi says. “And also that naturally-derived products like silibinin may be as effective as today’s best treatments.”
This work was supported by NCI RO1 grant CA113876.
Public release date: 16-Nov-2011
Illegal drug use is associated with abnormal weight in teens
A survey of more than 33,000 Italian high school students reveals that illegal drug use is 20 to 40% higher among both underweight and overweight teens than among their normal-weight peers.
Further analysis showed that the relationship between these two factors was largely mediated by psychosocial factors such as self-esteem, parents’ educational level, and friendships.
Based on these results, the authors conclude that abnormal weight and substance abuse are not directly related in a cause-effect relationship, but instead are likely both due to common underlying social factors and dissatisfaction.
“Eating disorders have largely increased during the last decades, and obesity is a major public-health problem, especially since the phenomenon is spreading among children. Thus we believe that the results are important to better define targeted interventions”, says Dr. Sabrina Molinaro.
Public release date: 16-Nov-2011
Garlic oil component may form treatment to protect heart
A component of garlic oil may help release protective compounds to the heart after heart attack, during cardiac surgery, or as a treatment for heart failure.
At low concentrations, hydrogen sulfide gas has been found to protect the heart from damage. However, this unstable and volatile compound has been difficult to deliver as therapy.
Now researchers at Emory University School of Medicine have turned to diallyl trisulfide, a garlic oil component, as a way to deliver the benefits of hydrogen sulfide to the heart. Their findings suggest that doctors could use diallyl trisulfide in many of the situations where researchers have proposed using hydrogen sulfide.
The data are being presented Wednesday, Nov. 16 at the American Heart Association (AHA) Scientific Sessions conference in Orlando.
“We are now performing studies with orally active drugs that release hydrogen sulfide,” says David Lefer, PhD, professor of surgery at Emory University School of Medicine and director of the Cardiothoracic Surgery Research Laboratory at Emory University Hospital, Midtown. “This could avoid the need to inject sulfide-delivery drugs outside of an emergency situation.”
Working with Lefer, postdoctoral fellow Benjamin Predmore blocked the coronary arteries of mice for 45 minutes, simulating a heart attack, and gave them diallyl trisulfide just before blood flow was restored. The compound reduced the proportion of damaged heart tissue in the area at risk by 61 percent, compared with untreated animals.
“Interruption of oxygen and blood flow damages mitochondria, and loss of mitochondrial integrity can lead to cell death,” he says. “We see that diallyl trisulfide can temporarily turn down the function of mitochondria, preserving them and lowering the production of reactive oxygen species.”
Additional data on diallyl trisulfide in a mouse model of heart failure are being presented by a member of Lefer’s team, postdoctoral fellow Kazuhisa Kondo, on Wednesday at 11:30 a.m.
Transverse aortic constriction results in enlargement of the heart and is a model of heart failure. Kondo found that diallyl trisulfide, given twice daily after aortic constriction, could reduce heart enlargement.
Also at the meeting, Lefer’s team is presenting additional data on mice deficient in the enzyme that generates hydrogen sulfide.
Public release date: 16-Nov-2011
3 p.m. slump? Why a sugar rush may not be the answer
Protein, not sugar, stimulates cells keeping us thin and awake, new study suggests
A new study has found that protein and not sugar activates the cells responsible for keeping us awake and burning calories. The research, published in the 17 November issue of the scientific journal Neuron, has implications for understanding obesity and sleep disorders.
Wakefulness and energy expenditure rely on “orexin cells”, which secrete a stimulant called orexin/hypocretin in the brain. Reduced activity in these unique cells results in narcolepsy and has been linked to weight gain.
Scientists at the University of Cambridge compared actions of different nutrients on orexin cells. They found that amino acids – nutrients found in proteins such as egg whites – stimulate orexin neurons much more than other nutrients.
“Sleep patterns, health, and body weight are intertwined. Shift work, as well as poor diet, can lead to obesity,” said lead researcher Dr Denis Burdakov of the Department of Pharmacology and Institute of Metabolic Science. “Electrical impulses emitted by orexin cells stimulate wakefulness and tell the body to burn calories. We wondered whether dietary nutrients alter those impulses.”
To explore this, the scientists highlighted the orexin cells (which are scarce and difficult to find) with genetically targeted fluorescence in mouse brains. They then introduced different nutrients, such as amino acid mixtures similar to egg whites, while tracking orexin cell impulses.
They discovered that amino acids stimulate orexin cells. Previous work by the group found that glucose blocks orexin cells (which was cited as a reason for after-meal sleepiness), and so the researchers also looked at interactions between sugar and protein. They found that amino acids stop glucose from blocking orexin cells (in other words, protein negated the effects of sugar on the cells).
These findings may shed light on previously unexplained observations showing that protein meals can make people feel less calm and more alert than carbohydrate meals.
“What is exciting is to have a rational way to ‘tune’ select brain cells to be more or less active by deciding what food to eat,” Dr Burdakov said. “Not all brain cells are simply turned on by all nutrients; dietary composition is critical.
“To combat obesity and insomnia in today’s society, we need more information on how diet affects sleep and appetite cells. For now, research suggests that if you have a choice between jam on toast, or egg whites on toast, go for the latter! Even though the two may contain the same number of calories, having a bit of protein will tell the body to burn more calories out of those consumed.”
Public release date: 16-Nov-2011
On Track to Getting Even Fatter
By 2020 majority of adults in America will be overweight, suffer from diabetic conditions
CHICAGO — In 2020, the vast majority of adults in America will be overweight or obese and more than half will suffer from diabetes or pre-diabetic conditions, according to projections presented by Northwestern Medicine researchers at the American Heart Association (AHA) Scientific Sessions Wednesday, Nov. 16, in Orlando.
The AHA has set a target to help Americans improve their overall heart health by 20 percent by 2020. However, if current trends continue, Americans can expect only a modest improvement of six percent in overall cardiovascular health by 2020.
The implications of not increasing heart health by 20 percent by 2020 could be grave. Declining rates of sickness and death from cardiovascular disease may stall, and related health care costs, already projected to reach $1.1 trillion per year by 2030, could rise even further. That’s according to study author Mark Huffman, M.D., assistant professor in preventive medicine and medicine-cardiology at Northwestern University Feinberg School of Medicine and a cardiologist at Northwestern Memorial Hospital.
The study, which is representative of all Americans, is based on patterns found in the National Health and Nutrition Examination Surveys (NHANES) from 1988 to 2008. The projected numbers on weight and diabetes, based on previous trends, follow.
In 2020, 83 percent of men and 72 percent of women will be overweight or obese.
Currently, 72 percent of men and 63 percent of women are overweight or obese (people who are overweight have a body mass index, or BMI, of 25 to 29 kg/m²; people who are obese have a BMI of 30 kg/m² or greater).
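The BMI cutoffs quoted above translate into a simple calculation; a minimal sketch using those thresholds:

```python
def bmi_category(weight_kg, height_m):
    """Classify weight status using the BMI cutoffs quoted in the text."""
    bmi = weight_kg / height_m ** 2
    if bmi >= 30:
        return "obese"
    if bmi >= 25:
        return "overweight"
    return "normal or underweight"

# Example: 85 kg at 1.75 m gives a BMI of about 27.8
print(bmi_category(85, 1.75))  # overweight
```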
In 2020, 77 percent of men and 53 percent of women will have dysglycemia (either diabetes or pre-diabetes). Currently, 62 percent of men and 43 percent of women have dysglycemia.
“To increase overall heart health by 20 percent, American adults would need to rapidly reverse these unhealthy trends — starting today,” Huffman said. “In concert with individual choices, public health policies can be and should be effective tools to reduce smoking, increase access to healthy foods, and increase physical activity in daily life.”
More people would need to improve health behaviors related to diet, physical activity, body weight and smoking and health factors, related to glucose, cholesterol and blood pressure.
“We’ve been dealing with the obesity trend for the past three decades, but the impact we project on blood sugar is a true shock,” said Donald Lloyd-Jones, M.D., chair and associate professor of preventive medicine at the Feinberg School of Medicine, a physician at Northwestern Memorial Hospital and senior author of the study. “Those are some really scary numbers. When blood sugar goes up like that all of the complications of diabetes come into play.”
Less than five percent of Americans currently are considered to have ideal cardiovascular health. The modest six percent improvement in cardiovascular health that is projected for 2020 means better cholesterol and blood pressure numbers for Americans and fewer smokers. Improvements in treatment and control of cholesterol and blood pressure with medication and declines in smoking would partially account for this small boost, but they wouldn’t be enough to offset the weight and diabetes problems Americans face, Huffman said. Projected improvements in diet and physical activity also contribute to the six percent projection, but the absolute increase in Americans who consume ideal diets will remain less than two percent by 2020, if current trends continue.
“Since the 1960s cardiovascular disease death rates have substantially decreased, but if the weight and dysglycemia trends continue to increase, we are in danger of seeing a reversal of those gains,” Huffman said.
Achieving a healthy weight through diet and physical activity is the best way most Americans can improve their cardiovascular health, but, as Huffman stressed, smoking is the number one cause of preventable death. Yet one in five Americans still smokes.
The National Heart, Lung, and Blood Institute funded this study.
Public release date: 16-Nov-2011
Probiotic protects intestine from radiation injury
Scientists at Washington University School of Medicine in St. Louis have shown that taking a probiotic before radiation therapy can protect the intestine from damage — at least in mice.
The new study suggests that taking a probiotic also may help cancer patients avoid intestinal injury, a common problem in those receiving radiation therapy for abdominal cancers. The research is published online in the journal Gut.
Radiation therapy often is used to treat prostate, cervical, bladder, endometrial and other abdominal cancers. But the therapy can kill both cancer cells and healthy ones, leading to severe bouts of diarrhea if the lining of the intestine gets damaged.
“For many patients, this means radiation therapy must be discontinued, or the radiation dose reduced, while the intestine heals,” says senior investigator William F. Stenson, MD, the Dr. Nicholas V. Costrini Professor in Gastroenterology & Inflammatory Bowel Disease at Washington University. “Probiotics may provide a way to protect the lining of the small intestine from some of that damage.”
Stenson has been searching for ways to repair and protect healthy tissue from radiation. This study showed that the probiotic bacterium Lactobacillus rhamnosus GG (LGG), among other Lactobacillus probiotic strains, protected the lining of the small intestine in mice receiving radiation.
“The lining of the intestine is only one cell-layer thick,” Stenson says. “This layer of epithelial cells separates the rest of the body from what’s inside the intestine. If the epithelium breaks down as the result of radiation, the bacteria that normally reside in the intestine can be released, travel through the body and cause serious problems such as sepsis.”
The researchers found that the probiotic was effective only if given to mice before radiation exposure. If the mice received the probiotic after damage to the intestinal lining had occurred, the LGG treatment could not repair it in this model.
Because the probiotic protected intestinal cells in mice exposed to radiation, the investigators believe it may be time to study probiotic use in patients receiving radiation therapy for abdominal cancers.
“In earlier human studies, patients usually took a probiotic after diarrhea developed when the cells in the intestine already were injured,” says first author Matthew A. Ciorba, MD, assistant professor of medicine in the Division of Gastroenterology. “Our study suggests we should give the probiotic prior to the onset of symptoms, or even before the initiation of radiation because, at least in this scenario, the key function of the probiotic seems to be preventing damage, rather than facilitating repair.”
The investigators sought to evaluate LGG’s protective effects in a way that would leave little doubt about whether it was preventing injury, and if so, how it was protecting the cells that line the intestine.
“Some human studies have looked at the possibility that probiotics might reduce diarrhea, but most of those studies have not been quite as rigorous as we would like, and the mechanism by which the probiotics might work has not been addressed,” Stenson says.
Previously, Stenson and his colleagues demonstrated that a molecular pathway involving prostaglandins and cyclooxygenase-2 (COX-2), key components in inflammation, could protect cells in the small intestine by preventing the programmed cell death, or apoptosis, that occurs in response to radiation.
They gave measured doses of LGG to mice, directly delivering the live bacteria to the stomach. They found it protected only mice that could make COX-2. In mutant mice unable to manufacture COX-2, the radiation destroyed epithelial cells in the intestine, just as it did in mice that didn’t receive the probiotic.
“In the large intestine, or colon, cells that make COX-2 migrate to sites of injury and assist in repair,” Ciorba says. “In this study, we evaluated that response in the small intestine, and we found that COX-2-expressing cells could migrate from the lining to the area of the intestine, called the crypt, where new epithelial cells are made, and we believe this mechanism is key to the protective effect we observed.”
If human studies are launched, Ciorba says one bit of encouraging news is that the doses of probiotic given to mice were not exceptionally large, and their intestines were protected. So people wouldn’t need mega-doses of the probiotic to get protection.
“The bacteria we use is similar to what’s found in yogurt or in commercially available probiotics,” he says. “So theoretically, there shouldn’t be risk associated with this preventative treatment strategy any more than there would be in a patient with abdominal cancer eating yogurt.”
In addition, he notes, future research is focused on isolating the particular radio-protective factor produced by the probiotic. When that is identified, a therapeutic could be developed to harness the probiotic benefit without using the live bacteria.
Public release date: 16-Nov-2011
Today’s teens will die younger of heart disease
High blood sugar, obesity, poor diet, smoking, little exercise make adolescents unhealthiest in US history
CHICAGO — A new study that takes a complete snapshot of adolescent cardiovascular health in the United States reveals a dismal picture of teens who are likely to die of heart disease at a younger age than adults do today, Northwestern Medicine researchers report.
“We are all born with ideal cardiovascular health, but right now we are looking at the loss of that health in youth,” said Donald Lloyd-Jones, M.D., chair and associate professor of preventive medicine at Northwestern University Feinberg School of Medicine and a physician at Northwestern Memorial Hospital. “Their future is bleak.”
Lloyd-Jones is the senior investigator of the study presented Nov. 16 at the American Heart Association Scientific Sessions in Orlando.
The effect of this worsening teen health is already being seen in young adults. For the first time, there is an increase in cardiovascular mortality rates in younger adults ages 35 to 44, particularly in women, Lloyd-Jones said.
The alarming health profiles of 5,547 children and adolescents, ages 12 to 19, reveal many have high blood sugar levels, are obese or overweight, have a lousy diet, don’t get enough physical activity and even smoke, the new study reports. These youth are a representative sample of 33.1 million U.S. children and adolescents from the 2003 to 2008 National Health and Nutrition Examination Surveys.
“Cardiovascular disease is a lifelong process,” Lloyd-Jones said. “The plaques that kill us in our 40s and 50s start to form in adolescence and young adulthood. These risk factors really matter.”
“After four decades of declining deaths from heart disease, we are starting to lose the battle again,” Lloyd-Jones added.
The American Heart Association (AHA) defines ideal cardiovascular health as having optimum levels of seven well-established cardiovascular risk factors, noted lead study author Christina Shay, who did the research while she was a postdoctoral fellow in preventive medicine at Northwestern’s Feinberg School. Shay now is an assistant professor of epidemiology at the University of Oklahoma Health Sciences Center.
“What was most alarming about the findings of this study is that zero children or adolescents surveyed met the criteria for ideal cardiovascular health,” Shay said. “These data indicate ideal cardiovascular health is being lost as early as, if not earlier than, the teenage years.”
The study used measurements from the AHA’s 2020 Strategic Impact Goals for monitoring cardiovascular health in adolescents and children. Among the findings:
POOR DIET
All the 12-to-19-year-olds had terrible diets, which, surprisingly, were even worse than those of adults, Lloyd-Jones said. None of their diets met all five criteria for being healthy. Their diets were high in sodium and sugar-sweetened beverages and didn’t include enough fruits, vegetables, fiber or lean protein.
“They are eating too much pizza and not enough whole foods prepared inside the home, which is why their sodium is so high and fruit and vegetable content is so low,” Lloyd-Jones said.
HIGH BLOOD SUGAR
More than 30 percent of boys and more than 40 percent of girls have elevated blood sugar, putting them at high risk for developing type 2 diabetes.
OVERWEIGHT OR OBESE
Thirty-five percent of boys and girls are overweight or obese. “These are startling rates of overweight and obesity, and we know it worsens with age,” Lloyd-Jones said. “They are off to a bad start.”
LOW PHYSICAL ACTIVITY
Approximately 38 percent of girls had an ideal physical activity level compared to 52 percent of boys.
CHOLESTEROL
Girls’ cholesterol levels were worse than boys’. Only 65 percent of girls met the ideal level compared to 73 percent of boys.
SMOKING
Almost 25 percent of teens had smoked in the month before being surveyed.
BLOOD PRESSURE
Most boys and girls (92.9 percent and 93.4 percent, respectively) had an ideal level of blood pressure.
The problem won’t be easy to fix. “We are much more sedentary and get less physical activity in our daily lives,” Lloyd-Jones said. “We eat more processed food, and we get less sleep. It’s a cultural phenomenon, and the many pressures on our health are moving in a bad direction. This is a big societal problem we must address.”
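The AHA definition of ideal cardiovascular health requires meeting all seven metrics at once, which is why zero surveyed teens qualified. A rough illustrative sketch follows; the cut points are approximations of the AHA 2020 goals (untreated values assumed), and the function and parameter names are hypothetical, not from the study:

```python
def ideal_cardiovascular_health(smoker, bmi, activity_min_per_week,
                                diet_goals_met, total_chol_mg_dl,
                                systolic_bp, diastolic_bp,
                                fasting_glucose_mg_dl):
    """Return True only if all seven metrics are at ideal levels.

    Approximate cut points: nonsmoker, BMI below 25, at least 150
    minutes of moderate activity per week, 4-5 of 5 dietary goals met,
    total cholesterol below 200 mg/dL, blood pressure below 120/80,
    and fasting glucose below 100 mg/dL.
    """
    return (
        not smoker
        and bmi < 25
        and activity_min_per_week >= 150
        and diet_goals_met >= 4
        and total_chol_mg_dl < 200
        and systolic_bp < 120 and diastolic_bp < 80
        and fasting_glucose_mg_dl < 100
    )

# A single elevated metric (here, BMI of 27) disqualifies the whole profile.
print(ideal_cardiovascular_health(False, 27, 180, 5, 180, 110, 70, 90))
```

Because every metric must pass, even one common problem, such as the 35 percent overweight-or-obese rate reported above, is enough to put ideal health out of reach for that teen.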
These reports are compiled with appreciation for all the doctors, scientists, and other medical researchers who sacrificed their time and effort in order to give people the ability to empower themselves, without any aspiration for fame or fortune. Just honorable people doing honorable things.