Editor's Top Five:
1. Zinc lozenges may shorten common cold duration
2. Drug shown to improve sight for patients with inherited blindness
3. Are cancers newly evolved species?
4. Increased risk of Parkinson's disease in methamphetamine users, CAMH study finds
5. New study shows Transcendental Meditation improves brain
In this Issue:
1. Soy/milk protein dietary supplements linked to lower blood pressure
2. Vegetarian diet may protect against common bowel disorder
3. Does food act physiologically like a ‘drug of choice’ for some?
4. Patients who use anti-depressants are more likely to suffer relapse, researcher finds
5. Seaweed as a rich new source of heart-healthy food ingredients
6. U of M researchers discover gene required to maintain male sex throughout life
7. Breastfeeding may prevent asthma
8. Hospital bacteria outbreak linked to nasal spray
9. Drug shown to improve sight for patients with inherited blindness
10. New study finds cancer-causing mineral in US road gravel
11. Increased risk of Parkinson’s disease in methamphetamine users, CAMH study finds
12. Concern over intensive treatment for patients with Type 2 diabetes
13. Zinc lozenges may shorten common cold duration
14. New study shows Transcendental Meditation improves brain
15. Are cancers newly evolved species?
16. Popular mammography tool not effective for finding invasive breast cancer
17. Yoga boosts stress-busting hormone, reduces pain: York U study
18. Fructose consumption increases risk factors for heart disease
19. Grapes protect against ultraviolet radiation
Public release date: 18-Jul-2011
Soy/milk protein dietary supplements linked to lower blood pressure
Milk and soy protein supplements were associated with lower systolic blood pressure compared to refined carbohydrate dietary supplements, in a study reported in Circulation: Journal of the American Heart Association.
The study’s results suggest that partly replacing refined carbohydrates with foods or drinks high in soy or milk protein may help prevent and treat high blood pressure, said Jiang He, M.D., Ph.D., lead researcher of the study.
The randomized, controlled clinical trial is the first to document that milk protein lowers blood pressure for people with pre-hypertension and stage-1 high blood pressure.
Study participants who took a milk protein supplement had systolic blood pressure 2.3 millimeters of mercury (mmHg) lower than when they took a refined carbohydrate supplement.
Participants who took a soy protein supplement had systolic blood pressure 2.0 mmHg lower than when they took the refined carbohydrate supplement.
Systolic blood pressure is the top number in a blood pressure reading and gauges the pressure when the heart contracts. Refined carbohydrate supplements were not linked to a change in systolic blood pressure.
The 352 adults in the study were at increased risk of high blood pressure or had mild cases of the condition.
Previous studies have shown that a diet rich in low-fat dairy products reduces blood pressure. Almost 75 million Americans have high blood pressure, a “silent killer” that can cause heart attacks, heart failure, strokes, kidney damage and other potentially fatal events.
“Some previous observational research on eating carbohydrates inconsistently suggested that a high carbohydrate diet might help reduce blood pressure,” said He, an epidemiologist at Tulane University School of Public Health and Tropical Medicine in New Orleans, La. “In contrast, our clinical trial directly compares soy protein with milk protein on blood pressure, and shows they both lower blood pressure better than carbohydrates.”
Participants were age 22 or older, with systolic blood pressure ranging from 120 to 159 mmHg and diastolic blood pressure from 80 to 95 mmHg. Each was randomly assigned to take 40 grams of soy protein, milk protein or a refined carbohydrate supplement every day, for eight weeks each. The supplements used were formulated in a way that allowed researchers to compare the effects of soy protein, milk protein, and refined complex carbohydrate on blood pressure without changing sodium, potassium, and calcium.
Each eight-week phase was followed by a three-week washout period when study participants did not take supplements. They took the three supplements as identical powder supplements dissolved in liquid.
Blood pressure readings were taken three times at each of two clinical visits before and two clinical visits after each eight-week phase, yielding a net blood pressure change for each supplement period. The study results showed no decrease in diastolic blood pressure.
“The systolic blood pressure differences we found are small for the individual, but they are important at the population level,” He said.
Based on previous research, a 2 mmHg decrease in systolic blood pressure could lead to 6 percent fewer
stroke-related deaths, a 4 percent lower rate of heart disease deaths and a 3 percent reduction in overall deaths among Americans.
Long-term studies would be needed to make specific recommendations for dietary changes, He said.
Public release date: 19-Jul-2011
Vegetarian diet may protect against common bowel disorder
Research: Diet and risk of diverticular disease in Oxford cohort of European Prospective Investigation into Cancer and Nutrition (EPIC): prospective study of British vegetarians and non-vegetarians
Vegetarians are a third less likely to develop a common bowel disorder (diverticular disease) than their meat-eating counterparts, finds a new study published on bmj.com today.
Diverticular disease has been termed a “disease of western civilisation” because of the higher numbers of cases in countries like the UK and the US compared with parts of Africa. The condition affects the large bowel or colon and is thought to be caused by not consuming enough fibre. Typical symptoms include painful abdominal cramps, bloating, wind, constipation and diarrhoea.
Previous research has suggested that a low fibre diet could lead to diverticular disease, and that vegetarians may have a lower risk compared with meat eaters, but there is little evidence to substantiate this.
So Dr Francesca Crowe and her team from the Cancer Epidemiology Unit at the University of Oxford set out to examine the link between a vegetarian diet and intake of dietary fibre with the risk of diverticular disease.
Their findings are based on 47,033 generally health-conscious British adults who were taking part in the European Prospective Investigation into Cancer and Nutrition (EPIC)-Oxford study. Of those recruited, 15,459 reported consuming a vegetarian diet.
After an average follow-up time of 11.6 years, there were 812 cases of diverticular disease (806 admissions to hospital and six deaths). After adjusting for factors such as smoking, alcohol and body mass index (BMI), vegetarians had a lower risk of diverticular disease compared with meat eaters.
Furthermore, participants with a relatively high intake of dietary fibre (around 25g a day) had a lower risk of being admitted to hospital with or dying from diverticular disease compared with those who consumed less than 14g of fibre a day.
Consuming a vegetarian diet and a high intake of dietary fibre are both associated with a lower risk of diverticular disease, say the authors. The 2000-1 UK National Diet and Nutrition Survey showed that 72% of men and 87% of women were not meeting the recommended average intake for dietary fibre of 18 g per day, so the proportion of cases of diverticular disease in the general population attributable to a low-fibre diet could be considerable, they add.
These findings lend support to the public health recommendations that encourage the consumption of foods high in fibre such as wholemeal breads, wholegrain cereals, fruits and vegetables, they conclude.
In an accompanying editorial, researchers from Nottingham University Hospital discuss the implications for the health of the population and the individual.
Based on these findings, David Humes and Joe West estimate that “about 71 meat eaters would have to
become vegetarians to prevent one diagnosis of diverticular disease.”
They add: “Overall the opportunity for preventing the occurrence of diverticular disease and other conditions, such as colorectal cancer, probably lies in the modification of diet, at either a population or an individual level.” However, they stress that “far more evidence is needed before dietary recommendations can be made to the general public.”
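The editorial's figure of about 71 can be reproduced with the standard number-needed-to-treat arithmetic. A minimal sketch in Python; the 4.2% baseline cumulative risk is an illustrative assumption chosen so the arithmetic matches the editorial's estimate, not a figure reported in the paper:

```python
# Number needed to treat (NNT): how many meat eaters would need to become
# vegetarians to prevent one diagnosis of diverticular disease.
# NNT = 1 / absolute risk reduction.

def nnt(baseline_risk: float, relative_risk: float) -> float:
    """NNT given the baseline (meat-eater) risk and the vegetarians' relative risk."""
    absolute_risk_reduction = baseline_risk * (1.0 - relative_risk)
    return 1.0 / absolute_risk_reduction

# Vegetarians were "a third less likely" to develop the disease, i.e. RR ~ 2/3.
# The 4.2% baseline risk is an illustrative assumption, not the paper's figure.
print(round(nnt(baseline_risk=0.042, relative_risk=2 / 3)))  # -> 71
```

The same formula explains why the NNT is sensitive to the baseline rate: halving the baseline risk would double the number of meat eaters who would need to switch diets to prevent one case.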
Public release date: 19-Jul-2011
Does food act physiologically like a ‘drug of choice’ for some?
Variety is considered the “spice of life,” but does today’s unprecedented level of dietary variety help explain skyrocketing rates of obesity? Some researchers think it might.
According to ASN Spokesperson Shelley McGuire, PhD: "We've known for years that foods, even eating itself, can trigger release of various brain chemicals, some of which are also involved in what happens with drug addiction and withdrawal. And, as can happen with substance abusers, tolerance or 'habituation' can occur, meaning that repeated use (in this case, exposure to a food) is sometimes accompanied by a lack of response (in this case, disinterest in the food). The results of the study by Epstein and colleagues provide a very interesting new piece to the obesity puzzle by suggesting that meal monotony may actually lead to reduced calorie consumption. The trick will be balancing this concept with the importance of variety to good nutrition."
Studies have shown that many people become disinterested in a particular food when they are repeatedly exposed to it. This response, called habituation, can decrease caloric intake in the short run. Conversely, when people are presented with a variety of foods, caloric intake can increase. The "food addiction hypothesis" posits that some people may overeat because they are insensitive to the normal habituation response and thus need even more exposure to a food to trigger disinterest. However, there has been no rigorous research investigating whether healthy-weight and overweight individuals have different habituation responses, and little is known about which patterns of food exposure are most powerful in triggering habituation. To help close these research gaps, researchers studied long-term habituation in obese and nonobese women. Their results, and an accompanying editorial by Nicole Avena and Mark Gold, are published in the August 2011 issue of The American Journal of Clinical Nutrition.
Sixteen nonobese [BMI (in kg/m2) < 30] and 16 obese (BMI ≥ 30) women were randomly assigned to 1 of 2 groups: the "weekly group" participated in weekly experimental food exposure sessions for 5 wk, whereas the "daily group" was studied daily for 5 consecutive days. During each 28-min experimental session, subjects were asked to complete a variety of tasks, after which they were "rewarded" with a 125-kcal portion of macaroni and cheese. Participants could work for as much food as they wanted. The researchers then evaluated total energy intake.
Whereas weekly food exposure increased total caloric intake by approximately 30 kcal/d, daily exposure decreased energy consumption by ~100 kcal/d. This supports long-term habituation in terms of caloric intake. Very few differences were found between how obese and nonobese individuals responded.
The authors concluded that reducing variety in food choices may represent an important strategy for those trying to lose weight. Moreover, simply having a person remember that they have eaten a certain food recently may be effective in this regard. In their accompanying editorial, Avena and Gold compare physiologic components of the food addiction hypothesis to the body's addictive responses to drugs. They also ponder whether school-lunch planners and public health officials should note that diversity in the menu is "not necessarily a virtue" but might instead "be associated with promoting excess food intake and increased body mass index." Provocative food for thought.
Public release date: 19-Jul-2011
Patients who use anti-depressants are more likely to suffer relapse, researcher finds
Patients who use anti-depressants are much more likely to suffer relapses of major depression than those who use no medication at all, concludes a McMaster researcher.
In a paper that is likely to ignite new controversy in the hotly debated field of depression and medication, evolutionary psychologist Paul Andrews concludes that patients who have used anti-depressant medications can be nearly twice as susceptible to future episodes of major depression.
Andrews, an assistant professor in the Department of Psychology, Neuroscience & Behaviour, is the lead author of a new paper in the journal Frontiers in Psychology.
The meta-analysis suggests that people who have not been taking any medication are at a 25 per cent risk of relapse, compared to 42 per cent or higher for those who have taken and gone off an anti-depressant.
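The "nearly twice" characterization follows directly from those two rates. A small sketch, assuming the 25% and 42% figures quoted above are cumulative relapse rates over comparable follow-up periods:

```python
# Relative risk of relapse after stopping an anti-depressant, from the
# meta-analysis rates quoted above: 42% (after discontinuing medication)
# vs. 25% (never medicated).

def relative_risk(exposed_rate: float, unexposed_rate: float) -> float:
    """Ratio of the event rate in the exposed group to the unexposed group."""
    return exposed_rate / unexposed_rate

rr = relative_risk(0.42, 0.25)
print(f"{rr:.2f}")  # -> 1.68, i.e. "nearly twice as susceptible"
```

Since the article says 42 per cent "or higher," 1.68 is the lower bound of the implied risk ratio.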
Andrews and his colleagues studied dozens of previously published studies to compare outcomes for patients who used anti-depressants compared to those who used placebos.
They analyzed research on subjects who started on medications and were switched to placebos, subjects who were administered placebos throughout their treatment, and subjects who continued to take medication throughout their course of treatment.
Andrews says anti-depressants interfere with the brain's natural self-regulation of serotonin and other neurotransmitters, and that the brain can overcorrect once medication is suspended, triggering new depression.
Though there are several forms of anti-depressants, all of them disturb the brain’s natural regulatory mechanisms, which he compares to putting a weight on a spring. The brain, like the spring, pushes back against the weight. Going off antidepressant drugs is like removing the weight from the spring, leaving the person at increased risk of depression when the brain, like the compressed spring, shoots out before retracting to its resting state.
“We found that the more these drugs affect serotonin and other neurotransmitters in your brain — and that’s what they’re supposed to do — the greater your risk of relapse once you stop taking them,” Andrews says. “All these drugs do reduce symptoms, probably to some degree, in the short-term. The trick is what happens in the long term. Our results suggest that when you try to go off the drugs, depression will bounce back. This can leave people stuck in a cycle where they need to keep taking anti-depressants to prevent a return of symptoms.”
Andrews believes depression may actually be a natural and beneficial — though painful — state in which the brain is working to cope with stress.
“There’s a lot of debate about whether or not depression is truly a disorder, as most clinicians and the majority of the psychiatric establishment believe, or whether it’s an evolved adaptation that does
something useful,” he says.
Longitudinal studies cited in the paper show that more than 40 per cent of the population may experience major depression at some point in their lives.
Most depressive episodes are triggered by traumatic events such as the death of a loved one, the end of a relationship or the loss of a job. Andrews says the brain may blunt other functions such as appetite, sex drive, sleep and social connectivity, to focus its effort on coping with the traumatic event.
Just as the body uses fever to fight infection, he believes the brain may also be using depression to fight unusual stress.
Not every case is the same, and severe cases can reach the point where they are clearly not beneficial, he emphasizes.
Public release date: 20-Jul-2011
Seaweed as a rich new source of heart-healthy food ingredients
In an article that may bring smiles to the faces of vegetarians who consume no dairy products, and of vegans, who consume no animal-based foods, scientists have identified seaweed as a rich new potential source of heart-healthy food ingredients. Seaweed and other "macroalgae" could rival milk products as sources of these so-called "bioactive peptides," they conclude in an article in ACS's Journal of Agricultural and Food Chemistry.
Maria Hayes and colleagues Ciarán Fitzgerald, Eimear Gallagher and Deniz Tasdemir note increased interest in using bioactive peptides, now obtained mainly from milk products, as ingredients in so-called functional foods. Those foods not only provide nutrition, but have a medicine-like effect in treating or preventing certain diseases. Seaweeds are a rich but neglected alternative source, they state, noting that people in East Asian and other cultures have eaten seaweed for centuries: Nori in Japan, dulse in coastal Europe, and limu palahalaha in native Hawaiian cuisine.
Their review of almost 100 scientific studies concluded that some seaweed proteins work just like the bioactive peptides in milk products, reducing blood pressure almost as effectively as the popular ACE inhibitor drugs. "The variety of macroalga species and the environments in which they are found and their ease of cultivation make macroalgae a relatively untapped source of new bioactive compounds, and more efforts are needed to fully exploit their potential for use and delivery to consumers in food products," Hayes and her colleagues conclude.
Public release date: 20-Jul-2011
U of M researchers discover gene required to maintain male sex throughout life
Researchers find that loss of gene Dmrt1 leads to male cells becoming female
MINNEAPOLIS / ST. PAUL (July 20, 2011) – University of Minnesota Medical School and College of Biological Sciences researchers have made a key discovery showing that male sex must be maintained throughout life.
The research team, led by Drs. David Zarkower and Vivian Bardwell of the U of M Department of Genetics, Cell Biology and Development, found that removing an important male development gene, called Dmrt1, causes male cells in mouse testis to become female cells.
The findings are published online today in Nature.
In mammals, sex chromosomes (XX in female, XY in male) determine the future sex of the animal during embryonic development by establishing whether the gonads will become testes or ovaries.
“Scientists have long assumed that once the sex determination decision is made in the embryo, it’s final,” Zarkower said. “We have now discovered that when Dmrt1 is lost in mouse testes – even in adults – many male cells become female cells and the testes show signs of becoming more like ovaries.”
Previous research has shown that removing a gene, called Foxl2, in ovaries caused female cells to become male cells and the ovaries to become more like testes. According to Zarkower, the latest U of M research determines that the gonads of both sexes must actively maintain the original sex determination decision throughout the remainder of life.
For the genetic research community this new understanding is a breakthrough. The findings provide new insight into how to turn one cell type into another, a process known as reprogramming, and also show that throughout life, cells in the testis must be actively prevented from transforming into female cells normally found in the ovary.
“This work shows that sex determination in mammals can be surprisingly prone to change, and must be actively maintained throughout an organism’s lifetime,” said Dr. Susan Haynes, who oversees developmental biology grants at the National Institute of General Medical Sciences of the National Institutes of Health. “These new insights have important implications for our understanding of how to reprogram cells to take on different identities, and may shed light on the origin of some human sex reversal disorders.”
The new findings may force the scientific community to reconsider how disorders involving human sex reversal occur. Some of these disorders may not result from errors in the original sex determination decision in the embryo, but instead may result from failure to maintain that decision later in embryonic development. In addition, because DMRT1 has been associated with human gonadal cancers, the researchers hope their findings will provide another clue into how gonadal cancer develops.
Public release date: 21-Jul-2011
Breastfeeding may prevent asthma
Feeding a baby only breast milk for up to 6 months after birth can reduce the child's risk of developing asthma-related symptoms in early childhood, according to new research.
The study, which is published online today (21 July 2011) in the European Respiratory Journal, looked at the impact of the duration of breastfeeding and the introduction of alternative liquids or solids in addition to breast milk.
The researchers, from the Generation R Study, Erasmus Medical Center in The Netherlands, used questionnaires to gather data from over 5,000 children. They ascertained in the first 12 months after birth whether the children had ever been breastfed, when breastfeeding was stopped, and whether any other milk or solids were introduced.
Further questionnaires were completed when the children were aged 1, 2, 3 and 4 years to check whether they had any asthma-related symptoms.
The results showed that children who had never been breastfed had an increased risk of wheezing, shortness of breath, dry cough and persistent phlegm during their first 4 years, compared to children who were breastfed for more than 6 months.
The strongest links were seen with wheezing and persistent phlegm: children who had never been breastfed were 1.4 and 1.5 times more likely, respectively, to develop these symptoms.
Children who were fed other milk or solids during their first 4 months in addition to breast milk had an increased risk of wheezing, shortness of breath, dry cough and persistent phlegm during the first 4 years, compared to children who were exclusively breastfed for their first 4 months.
While previous studies have shown a similar relationship between breastfeeding and asthma risk, this research is the first to show a link between the length of breastfeeding and the number of wheezing episodes.
Also, this study found evidence that the first asthma-related symptoms occur earlier in life if children were breastfed for shorter lengths of time or not exclusively.
Dr Agnes Sonnenschein-van der Voort, researcher at Generation R and lead author from the Erasmus Medical Center in The Netherlands, said: “The link of duration and exclusiveness of breastfeeding with asthma-related symptoms during the first 4 years was independent of infectious and atopic diseases. These results support current health policy strategies that promote exclusive breastfeeding for 6 months in industrialised countries. Further studies are needed to explore the protective effect of breastfeeding on the various types of asthma in later life.”
Public release date: 21-Jul-2011
Hospital bacteria outbreak linked to nasal spray
Chicago, IL—Infection control researchers investigating a rare bacterial outbreak of Burkholderia cepacia complex (Bcc) identified contaminated nasal spray as the root cause of the infections, leading to a national recall of the product. An article in the August issue of Infection Control and Hospital Epidemiology, the journal of the Society for Healthcare Epidemiology of America (SHEA), describes how researchers were able to trace the outbreak back to the nasal decongestant spray.
Bcc is a group of Gram-negative bacteria that can cause hard-to-treat infections. Patients with underlying medical conditions such as lung disease and weakened immune systems are at greater risk of contracting Bcc. When patients in a Denver children’s hospital began testing positive for the bacteria in 2003, investigators suspected that a batch of Major Twice-a-Day Nasal Spray, a brand that each of the patients had used, might be to blame. However, standard tests of the spray did not find any bacteria initially.
Noticing some peculiarities in the initial tests, the investigators decided to retest the spray using a non-standard culture medium. The second set of tests was positive for Bcc, the same strain as was identified in patients. The nasal spray contained a preservative agent that can interfere with standard bacterial cultures; the second set of tests neutralized the preservative, allowing the detection of the bacteria.
The spray was voluntarily recalled by the manufacturer, but the findings raise lingering questions about how manufacturers should test nasal spray products before distribution. “If standard culturing methods were used by the manufacturer then they may not have [discovered] this organism,” the researchers write.
“Nasal spray products are among the most widely used over-the-counter
pharmaceuticals, but to date they are not required by the FDA to be sterile,” said Susan Dolan, one of the article’s authors. “Given the implications of Bcc infections we question this decision.”
Other products, such as mouthwash, nebulization therapy, tap water, disinfectants, and reusable temperature probes have previously been implicated as Bcc outbreak sources.
Public release date: 25-Jul-2011
Drug shown to improve sight for patients with inherited blindness
A clinical trial led by Newcastle University shows that the drug idebenone (Catena®) improved the vision and perception of colour in patients with Leber's Hereditary Optic Neuropathy (LHON). In this inherited condition, patients who previously had normal sight lose the vision in one eye, then within 3 to 6 months lose the sight in their other eye.
In some severely affected patients, such as those who were unable to read any letters on the chart, treatment with idebenone resulted in a marked improvement in vision. In nine patients (12 eyes) of the 36 patients (61 eyes) taking idebenone, vision improved to the extent that they could read at least one row of letters on the chart. In contrast, not a single one of the 26 patients taking the placebo improved to that extent.
Inherited from the mother, and mainly affecting men, LHON is caused by damage to the mitochondria in the eyes – the ‘batteries’ which power their cells. It is one of the most common causes of inherited blindness and is thought to affect around 2,000 people in the UK, around 10,000 in Europe and a further 10,000 in the USA.
“This is the first proven treatment for a mitochondrial disorder. We have seen patients who couldn’t even see an eye chart on the wall go on to read the first line down – and some even attempted the second line. For these patients, it can mean a vast improvement in their quality of life,” said Professor Patrick Chinnery, a Wellcome Trust Senior Fellow in Clinical Science at Newcastle University who also works at the Royal Victoria Infirmary in Newcastle – part of the Newcastle upon Tyne Hospitals NHS Foundation Trust.
In a paper released today in the journal Brain, published by Oxford University Press, the authors describe how patients with LHON were recruited from Newcastle Hospitals in the UK, in Munich, Germany, and in Montreal, Canada, for a double-blind trial. Patients were given either idebenone for 24 weeks or a placebo.
At the end of the six months, some patients who were taking idebenone had improved vision and this is the first time a successful treatment has been found. The greatest improvement was seen in patients who had deteriorated in one eye more than the other.
Professor Chinnery explained: “We saw most progress in people who had better vision in one eye than the other – this tends to indicate that they are at an earlier stage of the condition. While we know that their vision is not what it once was, we also know that this treatment can dramatically improve their lives – some were able to move around more easily or even see family photos again.”
Idebenone penetrates into the mitochondria and is thought to mop up toxic free radicals and enhance mitochondrial function. Previous research had provided anecdotal reports of improvements in vision, but this is the first time the drug has been put to the test in a clinical trial. The drug company which sponsored this trial, Santhera Pharmaceuticals, is now seeking marketing approval from the European Medicines Agency
for it to be offered as a standard form of treatment.
“We are hearing from patients that they still have improved vision – even though they are no longer taking the drug but we would like to verify this and study the effect further,” said Professor Chinnery. “There may also be a case for offering idebenone from the first moment that LHON is diagnosed – preferably before any symptoms are shown – and a further trial would ideally examine this.”
“I lost the sight in my left eye in just five days”
Mike Scholes, 58 from Lindfield in West Sussex, UK and a graduate of Newcastle University took part in the trial. He said: “I was training for a freefall parachute jump five years ago when I noticed I was having problems with my eye. I went for an eye test at the optician and on the way to pick up my glasses five days later, I nearly crashed the car. The optician tested my right eye and there was no problem, when he came to the left eye I asked him to switch on the machine – and he said he already had. I had lost the sight in my left eye in just five days.
“This meant an abrupt change in my life – I had a very successful hot air balloon business and I had to stop flying. I had to sell my cars as I could no longer drive.
“Following seven months of tests including CAT scans, X-Rays, MRI scans and a lumbar puncture, I was finally given a DNA test which revealed I had Leber’s hereditary optic neuropathy.
“It was around this time that my vision started to go in my second eye. I couldn’t see in an increasingly large area in the centre of my eyes and gradually colours disappeared. At worst the only colours I could make out were shades of blue.
“Soon after friends spotted a clinical trial in Newcastle, I volunteered to take part and started taking the tablets three times a day – not knowing whether I was taking a placebo or the drug.
“After just a month and a half I noticed that the area affected in the centre of my vision was smaller. The improvement continued and I began to appreciate colours again seeing yellow and most reds.
“Having Leber’s hasn’t stopped me enjoying life to the full – I run marathons with a guide, I’ve hiked to the North Pole – but the noticeable improvement in my vision means daily life is easier. I can use a computerised viewer to help me read, I can get dressed without having to use a detector for the colours of clothes and, while initially I couldn’t even see the eye chart, now if I get really close to a street sign I can read it.”
Ralph’s Note – This is found on the shelves of better health food stores.
Public release date: 25-Jul-2011
New study finds cancer-causing mineral in US road gravel
Erionite in North Dakota roads may increase risk of mesothelioma
Honolulu, HI—As school buses drive down the gravel roads in Dunn County, North Dakota, they stir up more than dirt. The clouds of dust left in their wake contain such high levels of the mineral erionite that those who breathe in the air every day are at an increased risk of developing mesothelioma, a type of cancer of the membranes around the lungs, new research shows. Erionite is a natural mineral fiber that is physically similar to asbestos. When it is disturbed by human activity, fibers can become airborne and lodge in people's lungs. Over time, the embedded fibers can make cells of the lung grow abnormally, leading to mesothelioma, a cancer most often associated with the related mineral asbestos.
Michele Carbone, M.D., Ph.D., director of the University of Hawaii Cancer Center in Honolulu, has previously linked erionite exposure in some Turkish villages to unusually high rates of mesothelioma. Recently, he and colleagues turned their attention to potential erionite exposure in the U.S., where at least 12 states have erionite-containing rock deposits. His research team—which includes scientists from the National Institute of Environmental Health Sciences, Environmental Protection Agency, New York University, University of Chicago, University of Iowa, and University of Hacettepe—focused their efforts on Dunn County, North Dakota, when they learned that rocks containing erionite have been used to produce gravel for the past 30 years. More than 300 miles of roads are now paved with the gravel. The new study, reported in the July 25, 2011 issue of Proceedings of the National Academy of Sciences (PNAS) is the first to look at the potential hazards associated with erionite exposure in the U.S.
The scientists compared the erionite in North Dakota to erionite from the Turkish villages with high mesothelioma rates. They measured airborne concentrations of the mineral in various settings, studied its chemical composition, and analyzed its biological activity. When mice were injected with the erionite from Dunn County, their lungs showed signs of inflammation and abnormal cell growth, precursors to mesothelioma. Under the microscope, the fiber size of the erionite from North Dakota was similar to that of the Turkish erionite. Overall, the researchers found no chemical differences between the North Dakota erionite and samples of the cancer-causing mineral from Turkey. The airborne levels of erionite in North Dakota were comparable to levels found in Turkish villages with 6-8 percent mortality rates from mesothelioma, the researchers reported.
“Based on the similarity between the erionite from the two sources,” says Carbone, “there is concern for increased risk of mesothelioma in North Dakota.” The long latency period of the disease—it can take 30 to 60 years from exposure for mesothelioma to develop—and the fact that many erionite deposits have only been mined in the past few decades suggest that the number of cases could soon be on the rise. In addition to North Dakota, California, Oregon, Arizona, Nevada and other states have erionite deposits, but the possibility of human exposure elsewhere in the U.S. has not yet been investigated.
Unlike asbestos, which causes mesothelioma at lower rates, erionite has no established health benchmarks in the U.S. for safe exposure levels, because until recently physicians thought that erionite was present only in Turkey. The new findings, however, indicate that precautionary measures should be put in place to reduce exposure to the mineral, says Carbone. In Turkey, his earlier findings led to moving villagers away from areas with high levels of erionite into new housing built of erionite-free materials. “Our findings provide an opportunity to implement novel preventive and detection programs in the U.S. similar to what we have been doing in Turkey,” he says. Future studies could analyze erionite levels in other areas of the U.S. and develop strategies to prevent and screen for mesothelioma.
The study was funded through grants from the National Cancer Institute and the 2008 AACR-Landon Innovator Award for International Collaboration in Cancer Research to Michele Carbone.
Public release date: 26-Jul-2011
Increased risk of Parkinson’s disease in methamphetamine users, CAMH study finds
For immediate release – July 26 (Toronto) – People who abused methamphetamine or other amphetamine-like stimulants were more likely to develop Parkinson’s disease than those who did not, in a new study from the Centre for Addiction and Mental Health (CAMH).
The researchers examined almost 300,000 hospital records from California covering 16 years. Patients
admitted to hospital for methamphetamine or amphetamine-use disorders had a 76 per cent higher risk of developing Parkinson’s disease compared to those with no disorder.
Globally, methamphetamine and similar stimulants are the second most commonly used class of illicit drugs.
“This study provides evidence of this association for the first time, even though it has been suspected for 30 years,” said lead researcher Dr. Russell Callaghan, a scientist with CAMH. Parkinson’s disease is caused by a deficiency in the brain’s ability to produce a chemical called dopamine. Because animal studies have shown that methamphetamine damages dopamine-producing areas in the brain, scientists have worried that the same might happen in humans.
It has been a challenge to establish this link, because Parkinson’s disease develops in middle and old age, and it is necessary to track a large number of people with methamphetamine addiction over a long time span.
The CAMH team took an innovative approach by examining hospital records from California – a state in which methamphetamine use is prevalent – from 1990 up to 2005. In total, 40,472 people, at least 30 years of age, had been hospitalized due to a methamphetamine- or amphetamine-use disorder during this period.
These patients were compared to two groups: 207,831 people admitted for appendicitis with no diagnosis of any type of addiction, and 35,335 diagnosed with cocaine use disorders. A diagnosis of Parkinson’s disease was identified from hospital records or death certificates. Only the methamphetamine group had an increased risk of developing Parkinson’s disease.
While the appendicitis group served as a comparison to the general population, the cocaine group was selected for two reasons. Because cocaine is another type of stimulant that affects dopamine, this group could be used to determine whether the risk was specific to methamphetamine stimulants. Cocaine users also served as a control group to account for the health effects or lifestyle factors associated with dependence on an illicit drug.
“It is important for the public to know that our findings do not apply to patients who take amphetamines for medical purposes, such as attention deficit hyperactivity disorder (ADHD), since these patients use much lower doses of amphetamines than those taken by patients in our study,” said Dr. Stephen Kish, a CAMH scientist and co-author.
To put the study findings into numbers, if 10,000 people with methamphetamine dependence were followed over 10 years, 21 would develop Parkinson’s, compared with 12 people out of 10,000 from the general population. “It is also possible that our findings may underestimate the risk because in California, methamphetamine users may have had less access to health-care insurance and consequently to medical care,” said Dr. Callaghan.
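The incidence figures above can be checked with a quick back-of-the-envelope calculation. The numbers come straight from the paragraph; the code is only illustrative arithmetic, not the study's statistical method:

```python
# Illustrative arithmetic using the incidence figures quoted above:
# 21 Parkinson's cases per 10,000 methamphetamine users over 10 years,
# versus 12 per 10,000 in the general population.
meth_cases = 21
general_cases = 12

relative_risk = meth_cases / general_cases    # ratio of the two rates
excess_pct = (relative_risk - 1) * 100        # percent increase
absolute_excess = meth_cases - general_cases  # extra cases per 10,000

print(f"Relative risk: {relative_risk:.2f} (~{excess_pct:.0f}% higher)")
print(f"Absolute excess: {absolute_excess} extra cases per 10,000 over 10 years")
```

The crude ratio works out to roughly 75% higher, close to the study's reported 76% figure; the small gap plausibly reflects the adjustments in the original hospital-record analysis.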
The current project is significant because it is one of the few studies examining the long-term association between methamphetamine use and the development of a major brain disorder. “Given that methamphetamine and other amphetamine stimulants are the second most widely used illicit drugs in the world, the current study will help us anticipate the full long-term medical consequences of such problematic drug use,” said Dr. Callaghan.
Ralph’s Note – A.D.D. medications?
Public release date: 26-Jul-2011
Concern over intensive treatment for patients with Type 2 diabetes
Research: Effect of intensive glucose lowering treatment on all cause mortality, cardiovascular death and microvascular events in Type 2 diabetes: Meta-analysis of randomized controlled trials
Doctors should be cautious about prescribing intensive glucose lowering treatment for patients with type 2 diabetes as a way of reducing heart complications, concludes a new study published on bmj.com today.
French researchers found that intensive glucose lowering treatment, which is widely used for people with type 2 diabetes to reduce their heightened risk of cardiovascular disease, showed no benefit on all-cause or cardiovascular mortality.
Globally, there were an estimated 150 million adults with diabetes in 2000 and this is expected to rise to 366 million by 2030. People with type 2 diabetes are twice as likely to have cardiovascular disease than non-diabetics and are also more at risk of microvascular complications (damage to small blood vessels).
Glycaemic lowering therapies are commonly used to treat people with type 2 diabetes to prevent long term cardiovascular complications and renal and visual impairment, but previous studies have not shown clear and universal benefits of the treatment.
So a team, led by Catherine Cornu at the Louis Pradel Hospital in Bron, France, reviewed studies that looked at microvascular complications and cardiovascular events related to the intensity of glycaemic control and the quality of trials.
They analysed 13 studies involving 34,533 patients of whom 18,315 were given intensive glucose lowering treatment and 16,218 given standard treatment.
They found that intensive glucose treatment did not significantly affect all-cause mortality or cardiovascular death.
There was, however, a 15% reduction in the risk of non-fatal heart attacks, following intensive treatment and a 10% reduction in microalbuminuria – an indication of kidney problems and heart disease – but a more than two-fold increase in the risk of severe hypoglycaemia (dangerously low blood glucose levels).
The researchers calculated that over a five-year treatment period, 117 to 150 patients would need to be treated to avoid one heart attack, 32 to 142 to avoid one case of microalbuminuria, and 15 to 52 to avoid one severe hypoglycaemic event.
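The number needed to treat (NNT) is the reciprocal of the absolute risk reduction, so the ranges above can be inverted to recover the implied absolute risk changes. This is a sketch of standard NNT arithmetic, not a calculation taken from the paper; the outcome labels simply mirror the paragraph:

```python
# NNT = 1 / ARR, so ARR = 1 / NNT. Inverting the quoted five-year NNT
# ranges gives the implied absolute risk difference per patient treated.
def arr_from_nnt(nnt: float) -> float:
    """Absolute risk reduction (or increase, for harms) implied by an NNT."""
    return 1.0 / nnt

nnt_ranges = {
    "one heart attack avoided": (117, 150),
    "one case of microalbuminuria avoided": (32, 142),
    "one severe hypoglycaemic event caused": (15, 52),  # a harm, not a benefit
}

for outcome, (low, high) in nnt_ranges.items():
    print(f"{outcome}: {arr_from_nnt(high):.2%} to {arr_from_nnt(low):.2%}")
```

For example, an NNT of 117 to 150 for heart attacks corresponds to an absolute risk reduction of only about 0.67% to 0.85% over five years, which helps explain the authors' cautious conclusion.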
They conclude: “Intensive glucose lowering treatment of type 2 diabetes should be considered with caution and therapeutic escalation should be limited.”
In an accompanying editorial, UK experts state that clinicians should consider the absolute risks and benefits of more intensive therapy carefully on an individual patient basis to determine the most sensible treatment strategy.
Public release date: 26-Jul-2011
Zinc lozenges may shorten common cold duration
Author: Tiina L. Palomaki, University of Helsinki Communications, July 26, 2011
Depending on the total dosage of zinc and the composition of lozenges, zinc lozenges may shorten the duration of common cold episodes by up to 40%, according to a study published in the Open Respiratory Medicine Journal.
For treating the common cold, zinc lozenges are dissolved slowly in the mouth. Interest in zinc lozenges started in the early 1980s with the serendipitous observation that a young girl with leukemia recovered rapidly from a cold when she dissolved a therapeutic zinc tablet in her mouth instead of swallowing it. Since then, more than a dozen studies have examined whether zinc lozenges are effective, but their results have diverged.
Dr. Harri Hemila of the University of Helsinki, Finland, carried out a meta-analysis of all the placebo-controlled trials that have examined the effect of zinc lozenges on natural common cold infections. Of the 13 trial comparisons identified, five used a total daily zinc dose of less than 75 mg, and all five found no effect of zinc. Three trials used zinc acetate in daily doses over 75 mg; on average, they indicated a 42% reduction in the duration of colds. Five trials used zinc salts other than acetate in daily doses over 75 mg; on average, they indicated a 20% decrease in the duration of colds.
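To make the pooled percentages concrete, here is how they would play out for a hypothetical seven-day cold. The seven-day baseline is an assumption for illustration only, not a figure from the meta-analysis:

```python
# Applying the pooled duration reductions to an assumed 7-day cold.
baseline_days = 7.0  # hypothetical untreated cold duration (assumption)

reductions = {
    "zinc acetate, >75 mg/day": 0.42,      # 42% shorter on average
    "other zinc salts, >75 mg/day": 0.20,  # 20% shorter on average
    "any salt, <75 mg/day": 0.00,          # no effect found
}

for regimen, cut in reductions.items():
    print(f"{regimen}: ~{baseline_days * (1 - cut):.1f} days")
```

Under that assumption, high-dose zinc acetate would trim a seven-day cold to roughly four days, while sub-75 mg dosing would leave it unchanged.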
In several studies, zinc lozenges caused adverse effects, such as bad taste, but there is no evidence that zinc lozenges cause long-term harm. Furthermore, in the most recent trial of zinc acetate lozenges, there were no significant differences between the zinc and placebo groups in the occurrence of adverse effects, although the daily dose of zinc was 92 mg. Dr. Hemila concluded that “since a large proportion of trial participants have remained without adverse effects, zinc lozenges might be useful for them as a treatment option for the common cold.”
Public release date: 26-Jul-2011
New study shows Transcendental Meditation improves brain functioning in ADHD students
A non-drug approach to enhance students’ ability to learn
A random-assignment controlled study published today in Mind & Brain, The Journal of Psychiatry (Vol 2, No 1) found improved brain functioning and decreased symptoms of attention-deficit/hyperactivity disorder, ADHD, in students practicing the Transcendental Meditation® (TM) technique. The paper, ADHD, Brain Functioning, and Transcendental Meditation Practice, is the second published study demonstrating TM’s ability to help students with attention-related difficulties.
The first exploratory study, published in Current Issues in Education, followed a group of middle school students diagnosed with ADHD who meditated twice a day in school. After 3 months, researchers found
over 50% reductions in stress, anxiety, and ADHD symptoms. During the study, a video was made of some students discussing what it felt like to have ADHD, and how those experiences changed after 3 months of regular TM practice.
In this second study, lead author, neuroscientist Fred Travis, PhD, director of the Center for Brain, Consciousness and Cognition, joined principal investigator Sarina J. Grosswald, EdD, a George Washington University-trained cognitive learning specialist, and co-researcher William Stixrud, PhD, a prominent Silver Spring, Maryland, clinical neuropsychologist, to investigate the effects of Transcendental Meditation practice on task performance and brain functioning in 18 ADHD students, ages 11-14 years.
The study was conducted over a period of 6 months in an independent school for children with language- based learning disabilities in Washington, DC. The study showed improved brain functioning, increased brain processing, and improved language-based skills among ADHD students practicing the Transcendental Meditation technique.
A local TV news station reported on the study in progress during the first 3 months.
What was Measured
Students were pretested, randomly assigned to TM or delayed-start comparison groups, and post-tested at 3- and 6-months. Delayed-start students learned TM after the 3-month post-test.
EEG measurements of brain functioning were taken while students were performing a demanding computer-based visual-motor task. Successful performance on the task requires attention, focus, memory, and impulse control.
In addition, students were administered a verbal fluency test. This test measured higher-order executive functions, including initiation, simultaneous processing, and systematic retrieval of knowledge.
Performance on this task depends on several fundamental cognitive components, including vocabulary knowledge, spelling, and attention.
Theta/Beta Power Ratios and ADHD
Using EEG measurements, the relationship of theta brain waves to beta brain waves can be diagnostic of ADHD. Dr. Joel Lubar of the University of Tennessee has demonstrated that the theta/beta ratio can very accurately identify students with ADHD from those without it.
While theta EEG around 4-5 Hz is commonly associated with daydreaming, drowsiness, and unfocused mental states, theta EEG around 6-8 Hz is seen when one focuses on inner mental tasks, such as memory processing, identifying, and associating.
“In normal individuals, theta activity in the brain during tasks suggests that the brain is blocking out irrelevant information so the person can focus on the task,” said Dr. Travis. “But, in individuals with ADHD, the theta activity is even higher, suggesting that the brain is also blocking out relevant information.”
“And when beta activity, which is associated with focus, is lower than normal,” Travis added, “it affects the ability to concentrate on a task for extended periods of time.”
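For readers curious how a theta/beta ratio is actually derived, a minimal sketch follows. The band edges (4-8 Hz theta, 13-30 Hz beta), the sampling rate, and the synthetic signal are all assumptions for illustration; the study's actual EEG processing pipeline is not described here:

```python
import numpy as np

fs = 256                            # assumed sampling rate, Hz
rng = np.random.default_rng(0)
eeg = rng.standard_normal(fs * 10)  # 10 s of synthetic stand-in "EEG"

# Power spectrum via the real FFT of the trace.
spectrum = np.abs(np.fft.rfft(eeg)) ** 2
freqs = np.fft.rfftfreq(eeg.size, d=1 / fs)

def band_power(lo: float, hi: float) -> float:
    """Total spectral power between lo and hi Hz."""
    mask = (freqs >= lo) & (freqs < hi)
    return spectrum[mask].sum()

theta = band_power(4, 8)
beta = band_power(13, 30)
print(f"theta/beta ratio: {theta / beta:.2f}")
```

On real recordings, an elevated ratio during a task is the pattern Dr. Lubar's work associates with ADHD; for the white-noise stand-in here, the ratio simply reflects the relative bandwidths of the two bands.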
“Prior research shows ADHD children have slower brain development and a reduced ability to cope with stress,” said Dr. Stixrud. “Virtually everyone finds it difficult to pay attention, organize themselves and get things done when they’re under stress,” he explained. “Stress interferes with the ability to learn—it
shuts down the brain. Functions such as attention, memory, organization, and integration are compromised.”
Why the TM Technique
“We chose the TM technique for this study because studies show that it increases brain function. We wanted to know if it would have a similar effect in the case of ADHD, and if it did, would that also improve the symptoms of ADHD,” said Dr. Grosswald.
Dr. Stixrud added, “Because stress significantly compromises attention and all of the key executive functions such as inhibition, working memory, organization, and mental flexibility, it made sense that a technique that can reduce a child’s level of stress should also improve his or her cognitive functioning.”
The Transcendental Meditation technique is an effortless, easy-to-learn practice, unique among categories of meditation. “TM does not require concentration, controlling the mind or disciplined focus—challenges for anyone with ADHD,” Grosswald added.
There is substantial research showing the effectiveness of the TM technique for reducing stress and anxiety, and improving cognitive functioning among the general population. “What’s significant about these new findings,” Grosswald said, “is that among children who have difficulty with focus and attention, we see the same results. The fact that these children are able to do TM, and do it easily, shows us that this technique may be particularly well-suited for children with ADHD.”
Transcendental Meditation produces an experience of restful alertness, which is associated with higher metabolic activity in the frontal and parietal parts of the brain, indicating alertness, along with decreased metabolic activity in the thalamus, which is involved in regulating arousal and hyperactivity.
With regular practice, this restfully alert brain state, characteristic of the TM technique, becomes more present outside of meditation, allowing ADHD students to attend to tasks. “In a sense,” Dr. Travis said, “the repeated experience of the Transcendental Meditation technique trains the brain to function in a style opposite to that of ADHD.”
Improved Brain Functioning
During the practice of the Transcendental Meditation technique, coherence is found across different EEG frequencies. After meditation, the brain utilizes this increased functioning ability to support the performance of a task in an integrated manner.
Three months of TM practice resulted in significant decreases in theta/beta ratios and increased verbal fluency. This translates into improved executive function and more efficient cognitive processing.
During the first 3 months of the study, the theta/beta ratios of the control group (delayed start) actually increased. After learning, and practicing TM for 3 months, this group experienced dramatic decreases in theta/beta ratios and increased verbal fluency as well.
Student and Parent Surveys
Students reported that the TM technique was enjoyable and easy to do. They felt calmer, less stressed, and better able to concentrate on their schoolwork. They also said they were happier since they started TM. This correlated with reports from the parents.
At the end of the research, the parents completed a questionnaire to assess their perceptions of changes in five ADHD-related symptoms in their children from the beginning to the end of the study. There were positive and statistically significant improvements in the five areas measured: a) Ability to focus on
schoolwork, b) Organizational abilities, c) Ability to work independently, d) Happiness, and e) Quality of sleep.
The combined results were significant. There was a 48% reduction in the theta/beta power ratios and a 30% increase in brain coherence after the 6-month period. Studies have shown that pharmaceuticals decrease theta/beta power ratios by 3%, and neurofeedback by 25%.
“These are very encouraging findings,” said Dr. Stixrud. “Significant improvement in the theta/beta ratio without medication and without having to use any expensive equipment is a big deal, as is significant improvement in student happiness and student academic functioning reported by the parents.”
“While stimulant medication is very beneficial for some of my clients with ADHD,” Stixrud added, “the number of children who receive great benefit from medicine with minimal side-effects is relatively small. The fact that TM appears to improve attention and executive functions, and significantly reduces stress with no negative side-effects, is clearly very promising.” Stixrud said he hoped these findings would lead to more research on the use of TM with children and adolescents.
In conclusion, these findings warrant additional research to assess the impact of Transcendental Meditation practice as a non-drug treatment for ADHD, and to track meditating students’ improved academic achievements.
The study was funded by a grant from the David Lynch Foundation.
FACT SHEET
Attention-Deficit/Hyperactivity Disorder (ADHD)
•Attention-deficit/hyperactivity disorder (ADHD)—characterized by inattentiveness, impulsivity, and hyperactivity—is diagnosed in almost 10% of children ages 4-17 years, representing 5.4 million children.
•The Centers for Disease Control and Prevention reported that among children with current ADHD, 66.3% were taking medication for the disorder. In total, 4.8% of all children ages 4-17 years (2.7 million) were taking medication for ADHD. The majority of them stay on it into adulthood.
•The rate of prescriptions for Attention-Deficit/Hyperactivity Disorder in the U.S. has increased by a factor of five since 1991—with production of ADHD medicines up 2,000 percent in 9 years.
•The commonly used drugs for ADHD are stimulants (amphetamines). These drugs can cause persistent and negative side-effects, including sleep disturbances, reduced appetite, weight loss, suppressed growth, and mood disorders. The side-effects are frequently treated with additional medications to manage insomnia or mood swings. Almost none of the medications prescribed for insomnia or mood disturbances are approved by the Food and Drug Administration (FDA) for use with children.
•The long-term health effects of ADHD medications are not fully known, but evidence suggests risks of cardiac disorders and sudden death, liver damage and psychiatric events. It has also been found that children on long-term medication have significantly higher rates of delinquency, substance use, and stunted physical growth.
•A new study, Study raises questions about long-term effects of ADHD medication, the first of its kind, released February 17, 2010 by the Government of Western Australia’s Department of Health, found that “long-term use of drugs such as Ritalin and dexamphetamine may not improve a child’s social and emotional well-being or academic performance.” The chair of the Ministerial Implementation Committee for Attention Deficit Hyperactivity Disorder in Western Australia said in the Department’s press release,
“We found that stimulant medication did not significantly improve a child’s level of depression, self-perception or social functioning, and they were more likely to be performing below their age level at school by a factor of 10.5 times.”
The Transcendental Meditation Technique
•The Transcendental Meditation technique is an effortless technique practiced 10-20 minutes twice a day sitting comfortably with the eyes closed.
•TM is not a religion or philosophy and involves no new beliefs or change in lifestyle.
•Over 350 peer-reviewed research studies on the TM technique confirm a range of benefits for mind, body and behavior.
•Several studies have compared the effects of different meditation practices and found that Transcendental Meditation provides deeper relaxation and is more effective at reducing anxiety, depression and hypertension than other forms of meditation and relaxation. In addition, no other meditation practice shows the widespread coherence throughout all areas of the brain that is seen with Transcendental Meditation.
•The Transcendental Meditation technique is taught in the United States by a non-profit, educational organization.
Public release date: 26-Jul-2011
Are cancers newly evolved species?
Cancer patients may view their tumors as parasites taking over their bodies, but this is more than a metaphor for Peter Duesberg, a molecular and cell biology professor at the University of California, Berkeley.
Cancerous tumors are parasitic organisms, he said. Each one is a new species that, like most parasites, depends on its host for food, but otherwise operates independently and often to the detriment of its host.
A karyograph is one way to display the number of copies of each chromosome in a clone of cells from an individual or a cancer. Here, the karyograph shows the chromosomes of 20 individual cells (represented by black lines) of a normal human male. Each cell has precisely two copies of 22 chromosomes and one copy of each sex chromosome, demonstrating that human cells have a fixed and stable karyotype.
In a paper published in the July 1 issue of the journal Cell Cycle, Duesberg and UC Berkeley colleagues describe their theory that carcinogenesis – the generation of cancer – is just another form of speciation, the evolution of new species.
“Cancer is comparable to a bacterial level of complexity, but still autonomous, that is, it doesn’t depend on other cells for survival; it doesn’t follow orders like other cells in the body, and it can grow where, when and how it likes,” said Duesberg. “That’s what species are all about.”
This novel view of cancer could yield new insights into the growth and metastasis of cancer, Duesberg said, and perhaps new approaches to therapy or new drug targets. In addition, because the disrupted chromosomes of newly evolved cancers are visible in a microscope, it may be possible to detect cancers earlier, much as today’s Pap smear relies on changes in the shapes of cervical cells as an indication of chromosomal problems that could lead to cervical cancer.
Carcinogenesis and evolution
The idea that cancer formation is akin to the evolution of a new species is not new, with various biologists hinting at it in the late 20th century. Evolutionary biologist Julian S. Huxley wrote in 1956 that “Once the neoplastic process has crossed the threshold of autonomy, the resultant tumor can be logically regarded as a new biologic species ….”
Last year, Dr. Mark Vincent of the London Regional Cancer Program and University of Western Ontario argued in the journal Evolution that carcinogenesis and the clonal evolution of cancer cells are speciation events in the strict Darwinian sense.
The evolution of cancer “seems to be different from the evolution of a grasshopper, for instance, in part because the cancer genome is not a stable genome like that of other species. The challenging question is, what has it become?” Vincent said in an interview. “Duesberg’s argument from karyotype is different from my argument from the definition of a species, but it is consistent.”
Vincent noted that there are three known transmissible cancers, including devil facial tumor disease, a “parasitic cancer” that attacks and kills Tasmanian devils. It is transmitted from one animal to another by a whole cancer cell. A similar parasitic cancer, canine transmissible venereal tumor, is transmitted between dogs via a single cancer cell that has a genome dating from the time when dogs were first domesticated. A third transmissible cancer was found in hamsters.
“Cancer has become a successful parasite,” Vincent said.
Mutation theory vs. aneuploidy
Duesberg’s arguments derive from his controversial proposal that the reigning theory of cancer – that tumors begin when a handful of mutated genes send a cell into uncontrolled growth – is wrong. He argues, instead, that carcinogenesis is initiated by a disruption of the chromosomes, which leads to duplications, deletions, breaks and other chromosomal damage that alter the balance of tens of thousands of genes. The result is a cell with totally new traits – that is, a new phenotype.
“I think Duesberg is correct by criticizing mutation theory, which sustains a billion-dollar drug industry focused on blocking these mutations,” said Vincent, a medical oncologist. “Yet very, very few cancers have been cured by targeted drug therapy, and even if a drug helps a patient survive six or nine more months, cancer cells often find a way around it.”
Chromosomal disruption, called aneuploidy, is known to cause disease. Down syndrome, for example, is caused by a third copy of chromosome 21, one of the 23 pairs of human chromosomes. All cancer cells are aneuploid, Duesberg said, though proponents of the mutation theory of cancer argue that this is a consequence of cancer, not the cause.
Key to Duesberg’s theory is that some initial chromosomal mutation – perhaps impairing the machinery that duplicates or segregates chromosomes in preparation for cell division – screws up a cell’s chromosomes, breaking some or making extra copies of others. Normally this would be a death sentence for a cell, but in rare cases, he said, cells with such disrupted chromosomes might be able to divide further, perpetuating and compounding the damage. Over decades, continued cell division would produce many unviable cells as well as a few still able to divide autonomously and seed cancer.
Duesberg asserts that cancers are new species because those viable enough to continue dividing develop relatively stable chromosome patterns, called karyotypes, distinct from the chromosome pattern of their human host. While all known organisms today have stable karyotypes, with all cells containing precisely two or four copies of each chromosome, cancers exhibit a more flexible and unpredictable karyotype, including not only intact chromosomes from the host, but also partial, truncated and mere stumps of chromosomes.
“If humans changed their karyotype – the number and arrangement of chromosomes – we would either die or be unable to mate, or in very rare cases become another species,” Duesberg said. But cancer cells just divide and make more of themselves. They don’t have to worry about reproduction, which is sensitive to chromosomal balance. In fact, as long as the genes for mitosis are still intact, a cancer cell can survive with many disrupted and unbalanced chromosomes, such as those found in an aneuploid cell, he said.
The karyotype does change as a cancer cell divides, because the chromosomes are disrupted and thus don’t copy perfectly. But the karyotype is “only flexible within a certain margin,” Duesberg said. “Within these margins it remains stable, despite its flexibility.”
Karyographs display karyotype variability
Duesberg and his colleagues developed karyographs as a way to display the aneuploid nature of a cell’s karyotype and its stability across numerous cell cultures. Using these karyographs, he and his colleagues analyzed several cancers, clearly demonstrating that the karyotype is amazingly similar in all cells of a specific cancer line, yet totally different from the karyotypes of other cancers and even the same type of cancer from a different patient.
In contrast to normal cells, cervical cancer cells (HeLa) have flexible chromosomes. The 23 normal chromosomes have between 0 and 4 copies, while the several dozen hybrid or “marker” chromosomes have between 0 and 2. The copy numbers differ in the 20 individual HeLa cells shown, but they are nearly clonal, varying around an average clonal number.
HeLa cells are a perfect example. Perhaps the most famous cancer cell line in history, HeLa cells were obtained in 1951 from a cervical cancer that eventually killed a young black woman named Henrietta Lacks. The 60-year-old cell line derived from her cancer has a relatively stable karyotype that keeps it alive through division after division.
“Once a cell has crossed that barrier of autonomy, it’s a new species,” Duesberg said. “HeLa cells have evolved in the laboratory and are now even more stable than they probably were when they first arose.”
The individualized karyotypes of cancers resemble the distinct karyotypes of different species, Duesberg said. While biologists have not characterized the karyotypes of most species, no two species are known that have the same number and arrangement of chromosomes, including those of, for example, gorillas and humans, who share 99 percent of their genes.
Duesberg argues that his speciation theory explains cancer’s autonomy, immortality and flexible, but relatively stable, karyotype. It also explains the long latency period between initial aneuploidization and full blown cancer, because there is such a low probability of evolving an autonomous karyotype.
“You start with a chromosomal mutation, that is, aneuploidy perhaps from X-rays or cigarettes or radiation, that destabilizes and eventually changes your karyotype or renders it non-viable,” he said. “The rare viable aneuploidies of cancers are, in effect, the karyotypes of new species.”
Duesberg hopes that the carcinogenesis-equals-speciation theory will spur new approaches to diagnosing and treating cancer. Vincent, for example, suspects that cancers are operating right at the edge of survivability, maintaining genomic flexibility while retaining the ability to divide forever. Driving them to evolve even faster, he said, “might push them over the edge.”
Public release date: 27-Jul-2011
Popular mammography tool not effective for finding invasive breast cancer
Computer-aided detection (CAD) technology is ineffective in finding breast tumors, and appears to increase a woman’s risk of being called back needlessly for additional testing following mammography, a large UC Davis study has found.
The analysis of 1.6 million mammograms in seven states has delivered the most definitive findings to date on whether the popular mammography tool is effective in helping find breast cancer.
“In real-world practice, CAD increases the chances of being unnecessarily called back for further testing because of false-positive results without clear benefits to women,” said Joshua Fenton, assistant professor in the UC Davis Department of Family and Community Medicine. “Breast cancers were detected at a similar stage and size regardless of whether or not radiologists used CAD.”
The study examined screening mammograms performed on more than 680,000 women at 90 mammography facilities in seven U.S. states from 1998 to 2006. The false-positive rate increased from 8.1 percent before CAD to 8.6 percent after CAD was installed at the facilities in the study. In addition, the detection rate of breast cancer and the stage and size of breast cancer tumors were similar regardless of CAD.
The study, entitled “Effectiveness of Computer-Aided Detection in Community Mammography Practice,” was published online today in the Journal of the National Cancer Institute and used data from the Breast Cancer Surveillance Consortium.
Computer-aided detection software, approved by the Food and Drug Administration in 1998, analyzes the mammogram image and marks suspicious areas for radiologists to review. Its use has skyrocketed in recent years since Medicare began covering it in 2001. CAD is now applied to the large majority of screening mammograms in the U.S., with annual direct Medicare costs exceeding $30 million, according to a 2010 study in the Journal of the American College of Radiology.
According to 2009 Medicare data, insurers including Medicare typically paid about $12 per screening mammogram for CAD in addition to the cost of the mammogram itself (about $81 for film mammography and $130 for digital mammography), representing a 9 percent to 15 percent additional cost for CAD use.
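The 9-to-15-percent range quoted above can be reproduced from the dollar figures in the article; a quick sketch (the dollar amounts come from the article, the rounding is ours):

```python
# Reproduce the "9 percent to 15 percent additional cost" figure from
# the quoted 2009 Medicare payment amounts.
cad_fee = 12.00    # CAD add-on paid per screening mammogram
film = 81.00       # cost of a film mammogram
digital = 130.00   # cost of a digital mammogram

extra_film = cad_fee / film * 100      # extra cost for film, in percent
extra_digital = cad_fee / digital * 100  # extra cost for digital, in percent

print(f"film: +{extra_film:.1f}%")     # ~14.8%
print(f"digital: +{extra_digital:.1f}%")  # ~9.2%
```

The CAD fee is fixed in dollars, so the cheaper film mammogram sees the larger percentage markup, which is why the article states the range as 9 to 15 percent.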
The current study builds on Fenton’s initial assessment of the technology published in the New England Journal of Medicine in 2007. That report, which examined mammography screening results in 43 facilities, including seven that utilized CAD, found that CAD was associated with reduced accuracy of interpretation of screening mammograms but no difference in the detection rate of invasive breast cancer.
Critics of the research findings in the New England Journal of Medicine said the study was based on use of an older kind of CAD technology, and so did not accurately reflect its usefulness.
“In the current study, we evaluated newer technology in a larger sample and over a longer time period,” said Fenton. “We also looked for the first time at cancer stage and cancer size, which are critical for understanding how CAD may affect long-term breast cancer outcomes, such as mortality.”
The authors write that the results of real-world studies of CAD may differ from those of pre-clinical studies. They suggest that these differences may arise because radiologists in clinical practice don’t always adhere as strictly to the technology’s intended use as radiologists in protocol-driven studies do.
Public release date: 27-Jul-2011
Yoga boosts stress-busting hormone, reduces pain: York U study
TORONTO, July 27, 2011 – A new study by York University researchers finds that practicing yoga reduces the physical and psychological symptoms of chronic pain in women with fibromyalgia.
The study is the first to look at the effects of yoga on cortisol levels in women with fibromyalgia. The condition, which predominantly affects women, is characterized by chronic pain and fatigue; common symptoms include muscle stiffness, sleep disturbances, gastrointestinal discomfort, anxiety and depression.
Previous research has found that women with fibromyalgia have lower-than-average cortisol levels, which contribute to pain, fatigue and stress sensitivity. According to the study, participants’ saliva revealed elevated levels of total cortisol following a program of 75 minutes of hatha yoga twice weekly over the course of eight weeks.
“Ideally, our cortisol levels peak about 30-40 minutes after we get up in the morning and decline throughout the day until we’re ready to go to sleep,” says the study’s lead author, Kathryn Curtis, a PhD student in York’s Department of Psychology, Faculty of Health. “The secretion of the hormone cortisol is dysregulated in women with fibromyalgia,” she says.
Cortisol is a steroid hormone that is produced and released by the adrenal gland and functions as a component of the hypothalamic-pituitary-adrenal (HPA) axis in response to stress.
“Hatha yoga promotes physical relaxation by decreasing activity of the sympathetic nervous system, which lowers heart rate and increases breath volume. We believe this in turn has a positive effect on the HPA axis,” says Curtis.
Participants completed questionnaires to determine pain intensity pre- and post-study; they reported significant reductions in pain and associated symptoms, as well as psychological benefits. They felt less helpless, were more accepting of their condition, and were less likely to “catastrophize” over current or future symptoms.
“We saw their levels of mindfulness increase – they were better able to detach from their psychological experience of pain,” Curtis says. Mindfulness is a form of active mental awareness rooted in Buddhist traditions; it is achieved by paying total attention to the present moment with a non-judgmental awareness of inner and outer experiences.
“Yoga promotes this concept – that we are not our bodies, our experiences, or our pain. This is extremely useful in the management of pain,” she says. “Moreover, our findings strongly suggest that psychological changes in turn affect our experience of physical pain.”
The study – Curtis’ thesis – was published yesterday in the Journal of Pain Research. It is co-authored by her supervisor, York professor Joel Katz, Canada Research Chair in Health Psychology, and Anna Osadchuk, a York University undergraduate student.
Curtis was supported by a Canadian Institutes of Health Research (CIHR) Canada Graduate Scholarship and a CIHR Strategic Training Grant Fellowship in Pain: Molecules to Community.
Public release date: 28-Jul-2011
Fructose consumption increases risk factors for heart disease
Study suggests US Dietary Guideline for upper limit of sugar consumption is too high
A recent study accepted for publication in The Endocrine Society’s Journal of Clinical Endocrinology & Metabolism (JCEM) found that adults who consumed high fructose corn syrup for two weeks as 25 percent of their daily calorie requirement had increased blood levels of cholesterol and triglycerides, which have been shown to be indicators of increased risk for heart disease.
The American Heart Association recommends that people consume only five percent of calories as added sugar. The Dietary Guidelines for Americans 2010 suggest an upper limit of 25 percent or less of daily calories consumed as added sugar. To address this discrepancy in recommended consumption levels, researchers examined what happened when young overweight and normal weight adults consumed fructose, high fructose corn syrup or glucose at the 25 percent upper limit.
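To put the two percent-of-calories limits above in concrete terms, they can be converted into grams of added sugar. The sketch below assumes a 2,000-kcal daily diet and 4 kcal per gram of sugar (standard nutrition values; the 2,000-kcal figure is our illustrative assumption, not from the study):

```python
# Convert a percent-of-calories sugar limit into grams per day,
# assuming a 2,000 kcal/day diet and 4 kcal per gram of sugar.
KCAL_PER_DAY = 2000
KCAL_PER_G_SUGAR = 4

def sugar_grams(percent_of_calories):
    """Grams of added sugar corresponding to a share of daily calories."""
    return KCAL_PER_DAY * percent_of_calories / 100 / KCAL_PER_G_SUGAR

print(sugar_grams(5))   # AHA recommendation (5%): 25.0 g/day
print(sugar_grams(25))  # Dietary Guidelines upper limit (25%): 125.0 g/day
```

On these assumptions, the two recommendations differ five-fold: roughly 25 grams versus 125 grams of added sugar per day, which is the discrepancy the researchers set out to address.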
“While there is evidence that people who consume sugar are more likely to have heart disease or diabetes, it is controversial as to whether high-sugar diets may actually promote these diseases, and dietary guidelines are conflicting,” said the study’s senior author, Kimber Stanhope, PhD, of the University of California, Davis. “Our findings demonstrate that several factors associated with an elevated risk for cardiovascular disease were increased in individuals consuming 25 percent of their calories as fructose or high fructose corn syrup, but consumption of glucose did not have this effect.”
In this study, researchers examined 48 adults between the ages of 18 and 40 years and compared the effects of consuming 25 percent of one’s daily calorie requirement as glucose, fructose or high fructose corn syrup on risk factors for cardiovascular disease. They found that within two weeks, study participants consuming fructose or high fructose corn syrup, but not glucose, exhibited increased concentrations of LDL cholesterol, triglycerides and apolipoprotein-B (a protein which can lead to plaques that cause vascular disease).
“These results suggest that consumption of sugar may promote heart disease,” said Stanhope. “Additionally, our findings provide evidence that the upper limit of 25 percent of daily calories consumed as added sugar, as suggested by the Dietary Guidelines for Americans 2010, may need to be re-evaluated.”
Ralph’s Note: The evidence against high fructose corn syrup across multiple studies has become overwhelming. It needs to be banned.
Public release date: 29-Jul-2011
Grapes protect against ultraviolet radiation
Some compounds found in grapes help to protect skin cells from the sun’s ultraviolet radiation, according to a study by researchers from the University of Barcelona and the CSIC (Spanish National Research Council). The study supports the use of grapes or grape derivatives in sun protection products.
Ultraviolet (UV) rays emitted by the sun are the leading environmental cause of skin complaints, causing skin cancer, sunburn and solar erythema, as well as premature ageing of the dermis and epidermis. Now, a Spanish study has proven that some substances in grapes can reduce the amount of cell damage caused in skin exposed to this radiation.
UV rays act on the skin by activating ‘reactive oxygen species’ (ROS). These compounds in turn oxidise macromolecules such as lipids and DNA, stimulating certain reactions and enzymes (JNK and p38MAPK) which cause cell death.
A group of scientists from the University of Barcelona and the CSIC have shown that some polyphenolic substances extracted from grapes (flavonoids) can reduce the formation of ROS in human epidermis cells that have been exposed to long-wave (UVA) and medium-wave (UVB) ultraviolet radiation. The study, carried out in vitro in the laboratory, has been published in the Journal of Agricultural and Food Chemistry.
Grape-based sun protection
“These polyphenolic fractions inhibit the generation of ROS and, as a result, the subsequent activation of the JNK and p38 enzymes, meaning they have a protective effect against ultraviolet radiation emitted by the sun”, Marta Cascante, a biochemist at the University of Barcelona (Spain) and director of the research project, tells SINC.
The researchers found that the higher the degree of the flavonoids’ polymerisation and formation of compounds containing gallic acid, the greater their photoprotective capacity.
The study suggests that these “encouraging results should be taken into consideration in clinical pharmacology using plant-based polyphenolic extracts to develop new photoprotection skin products”.
Cosmetics and drugs containing grape compounds are already available, but the way they act on cells has not been well understood until now. “This study supports the idea of using these products to protect the skin from cell damage and death caused by solar radiation, as well as increasing our understanding of the mechanism by which they act”, concludes Cascante.
These reports are compiled with appreciation for all the doctors, scientists, and other medical researchers who sacrificed their time and effort in order to give people the ability to empower themselves, without base aspirations for fame or fortune.
Just honorable people, doing honorable things.