Health Technology Research Synopsis
134th Issue Date 27 JUL 2012
Compiled By Ralph Turchiano
http://www.healthresearchreport.me
http://www.facebook.com/vitaminandherbstore
http://www.engineeringevil.com
Editor's Top Five:
1. Increase in RDA for vitamin C could help reduce heart disease, stroke, cancer
2. Vitamin E may lower liver cancer risk
3. High dietary antioxidant intake might cut pancreatic cancer risk
4. HPV improves survival for African-Americans with throat cancer
5. Recent research uncovers tick bite as the cause for a delayed allergic reaction to red meat
In This Issue:
1. Gold nanoparticles could treat prostate cancer with fewer side effects than chemotherapy
2. Increase in RDA for vitamin C could help reduce heart disease, stroke, cancer
3. AAAS joins more than 3000 organizations in urging Congress to avoid ‘devastating’ budget cuts
4. Vitamin E may lower liver cancer risk
5. Mammography screening shows limited effect on breast cancer mortality in Sweden
6. Workplace exposure to organic solvents linked to heart defects at birth
7. Widely prescribed MS treatment may not slow progression of disease: VCH-UBC research
8. Marijuana use doubles risk of premature birth
9. Study suggests moderate drinking lowers risk of developing rheumatoid arthritis in women
10. Genetic link to rapid weight gain from antipsychotics discovered
11. Poisoning from industrial compounds can cause similar effects to ALS
12. UVC Light Kills Wound Bacteria
13. Copper Surfaces Could Reduce Hospital Acquired Infections
14. Stanford chemists synthesize compound that flushes out latent HIV
15. Parental consent for HPV vaccine should not be waived, poll says
16. Botanical compound could prove crucial to healing influenza
17. PSU study finds ‘caffeinated’ coastal waters
18. Study questions safety and effectiveness of common kidney disease drugs
19. HPV improves survival for African-Americans with throat cancer
20. Moderate alcohol intake is associated with a lower risk of kidney cancer
21. Beneficial bacteria may help ward off infection
22. Vitamin D may protect against lung function impairment and decline in smokers
23. FDA PANEL MEMBERS EXPRESS OPPOSING VIEWS ON TRUVADA APPROVAL
24. High dietary antioxidant intake might cut pancreatic cancer risk
25. Researchers develop ginseng-fortified milk to improve cognitive function
26. New study: Raisins as effective as sports chews for fueling workouts
27. Synthetic stimulants called ‘bath salts’ act in the brain like cocaine
28. Diets high in salt could deplete calcium in the body: UAlberta research
29. Recent research uncovers tick bite as the cause for a delayed allergic reaction to red meat
30. Yoga reduces stress; now it’s known why
31. How a low-protein diet predisposes offspring to adulthood hypertension
32. Published clinical trial demonstrates efficacy of Sea-Band® for migraine-related nausea
33. Do ovaries continue to produce eggs during adulthood?
34. Yoga may help stroke survivors improve balance
35. Men with prostate cancer more likely to die from other causes
36. Repetitious, Time-Intensive Magical Rituals Considered More Effective, Study Shows
Gold nanoparticles could treat prostate cancer with fewer side effects than chemotherapy
In a new study published in PNAS, scientists found that nanoparticles produced from chemicals in tea reduced tumors by 80 percent
COLUMBIA, Mo. – Currently, large doses of chemotherapy are required when treating certain forms of cancer, resulting in toxic side effects. The chemicals enter the body and work to destroy or shrink the tumor, but also harm vital organs and drastically affect bodily functions. Now, University of Missouri scientists have found a more efficient way of targeting prostate tumors by using gold nanoparticles and a compound found in tea leaves. This new treatment would require doses thousands of times smaller than those used in chemotherapy, and the nanoparticles do not travel through the body inflicting damage on healthy areas. The study is being published in the Proceedings of the National Academy of Sciences.
“In our study, we found that a special compound in tea was attracted to tumor cells in the prostate,” said Kattesh Katti, curators’ professor of radiology and physics in the School of Medicine and the College of Arts and Science and senior research scientist at the MU Research Reactor. “When we combined the tea compound with radioactive gold nanoparticles, the tea compound helped ‘deliver’ the nanoparticles to the site of the tumors and the nanoparticles destroyed the tumor cells very efficiently.”
Currently, doctors treat prostate cancer by injecting hundreds of radioactive ‘seeds’ into the prostate. However, that treatment is not effective when treating an aggressive form of prostate cancer, said Cathy Cutler, research professor at the MU Research Reactor and co-author of the study. The size of the seeds and their inability to deliver effective doses hampers their ability to stop the aggressive form of prostate cancer.
In the study, the MU scientists created nanoparticles that are just the right size. Instead of hundreds of injections, the team only used one or two injections, and the nanoparticles were more likely to stay very close to the tumor sites.
Cutler and Katti have been working with colleagues Raghuraman Kannan, Anandhi Upendran, Charles Caldwell as well as others in the Department of Radiology and at the MU Research Reactor to develop and design the nanoparticles to the correct shape and size to treat prostate cancer. If the nanoparticles produced are too small, they can escape and spread; if they are made large enough, the nanoparticles will stay inside the tumor and treat it much more effectively than current methods.
“Current therapy for this disease is not effective in those patients who have aggressive prostate cancer tumors,” Cutler said. “Most of the time, prostate cancers are slow-growing; the disease remains localized and it is easily managed. Aggressive forms of the disease spread to other parts of the body, and it is the second-leading cause of cancer deaths in U.S. men. However, we believe the gold nanoparticles could shrink the tumors, both those that are slow-growing and aggressive, or eliminate them completely.”
“This treatment is successful due to the inherent properties of radioactive gold nanoparticles,” Kannan said. “First, the gold nanoparticles should be made to the correct size, and second, they have very favorable radiochemical properties, including a very short half-life.”
With a half-life of only 2.7 days, the radioactivity from the gold nanoparticles is finished within three weeks.
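As a rough illustration (a standard exponential-decay calculation using the 2.7-day half-life quoted above, not a figure reported by the researchers), about 21/2.7 ≈ 7.8 half-lives elapse in three weeks, leaving roughly half a percent of the initial activity A_0:
\[ A(t) = A_0 \left(\tfrac{1}{2}\right)^{t/t_{1/2}}, \qquad A(21\ \text{days}) = A_0 \left(\tfrac{1}{2}\right)^{21/2.7} \approx 0.005\,A_0 \]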
“Because of their size and the compound found in tea, the nanoparticles remain at the tumor sites,” Upendran said. “This helps the nanoparticles maintain a high level of effectiveness, resulting in significant tumor volume reduction within 28 days of treatment.”
In the current study, the team tested the nanoparticles on mice. Prior to human trials, the scientists will study the treatment in dogs with prostate cancer. Prostate cancer in dogs is extremely close to the human form of the disease.
“When it comes to drug discovery, MU is fortunate because we have a combination of experts in cancer research, animal modeling, isotope production and nanomedicine, and state-of-the-art research infrastructure to take discoveries from ‘the bench to the bedside’ and never leave campus,” Katti said. “For example, we developed the nanoparticles here at our research reactor, which is one of the few places in the world that produces therapeutic, clinical grade radioisotopes. We then tested the radioactive gold nanoparticles in small animals in collaboration with other radiology researchers using testing facilities located at the Harry S. Truman Veterans Hospital. Our next steps include partnering with the College of Veterinary Medicine to treat larger animals with the hopes of having human clinical trials, held on our campus, soon.”
Katti, Cutler, Kannan, Upendran and Caldwell were joined in the study by Ravi Shukla, Nripen Chanda and Ajit Zambre, all from the Department of Radiology.
Increase in RDA for vitamin C could help reduce heart disease, stroke, cancer
CORVALLIS, Ore. – The recommended dietary allowance, or RDA, of vitamin C is less than half what it should be, scientists argue in a recent report, because medical experts insist on evaluating this natural but critical nutrient in the same way they do pharmaceutical drugs, and reach faulty conclusions as a result.
The researchers, in Critical Reviews in Food Science and Nutrition, say there’s compelling evidence that the RDA of vitamin C should be raised to 200 milligrams per day for adults, up from its current levels in the United States of 75 milligrams for women and 90 for men.
Rather than just prevent the vitamin C deficiency disease of scurvy, they say, it’s appropriate to seek optimum levels that will saturate cells and tissues, pose no risk, and may have significant effects on public health at almost no expense – about a penny a day if taken as a dietary supplement.
“It’s time to bring some common sense to this issue, look at the totality of the scientific evidence, and go beyond some clinical trials that are inherently flawed,” said Balz Frei, professor and director of the Linus Pauling Institute at Oregon State University, and one of the world’s leading experts on the role of vitamin C in optimum health.
“Significant numbers of people in the U.S. and around the world are deficient in vitamin C, and there’s growing evidence that more of this vitamin could help prevent chronic disease,” Frei said. “The way clinical researchers study micronutrients right now, with the same type of so-called ‘phase three randomized placebo-controlled trials’ used to test pharmaceutical drugs, almost ensures they will find no beneficial effect. We need to get past that.”
Unlike testing the safety or function of a prescription drug, the researchers said, such trials are ill suited to demonstrate the disease prevention capabilities of substances that are already present in the human body and required for normal metabolism. Some benefits of micronutrients in lowering chronic disease risk also show up only after many years or even decades of optimal consumption of vitamin C – a factor often not captured in shorter-term clinical studies.
A wider body of metabolic, pharmacokinetic, laboratory and demographic studies suggests just the opposite, that higher levels of vitamin C could help reduce the chronic diseases that today kill most people in the developed world – heart disease, stroke, cancer, and the underlying issues that lead to them, such as high blood pressure, chronic inflammation, poor immune response and atherosclerosis.
“We believe solid research shows the RDA should be increased,” Frei said. “And the benefit-to-risk ratio is very high. A 200 milligram intake of vitamin C on a daily basis poses absolutely no risk, but there is strong evidence it would provide multiple, substantial health benefits.”
An excellent diet with the recommended five to nine daily servings of fruits and raw or steam-cooked vegetables, together with a six-ounce glass of orange juice, could provide 200 milligrams of vitamin C a day. But most Americans and people around the world do not have an excellent diet.
Even at the current low RDAs, various studies in the U.S. and Canada have found that about a quarter to a third of people are marginally deficient in vitamin C, and up to 20 percent in some populations are severely deficient – including college students, who often have less-than-perfect diets. Smokers and older adults are also at significant risk.
Even marginal deficiency can lead to malaise, fatigue, and lethargy, researchers note. Healthier levels of vitamin C can enhance immune function, reduce inflammatory conditions such as atherosclerosis, and significantly lower blood pressure.
A recent analysis of 29 human studies concluded that daily supplements of 500 milligrams of vitamin C significantly reduced blood pressure, both systolic and diastolic. High blood pressure is a major risk factor for heart disease and stroke, and contributes directly to an estimated 400,000 deaths annually in the U.S.
A study in Europe of almost 20,000 men and women found that mortality from cardiovascular disease was 60 percent lower when comparing the blood plasma concentration of vitamin C in the highest 20 percent of people to the lowest 20 percent.
Another research effort found that men with the lowest serum vitamin C levels had a 62 percent higher risk of cancer-related death after a 12-16 year period, compared to those with the highest vitamin C levels.
Laboratory studies with animals – which may be more accurate than human studies because they can be done in controlled conditions and with animals of identical genetic makeup – can document reasons that could explain all of these findings, Frei said.
Critics have suggested that some of these differences are simply due to better overall diet, not vitamin C levels, but the scientists noted in this report that some health benefits correlate even more strongly to vitamin C plasma levels than fruit and vegetable consumption.
Scientists in France and Denmark collaborated on this report. Research at OSU on these issues has been supported by the National Center for Complementary and Alternative Medicine, a division of the National Institutes of Health.
About the Linus Pauling Institute: The Linus Pauling Institute at OSU is a world leader in the study of micronutrients and their role in promoting optimum health or preventing and treating disease. Major areas of research include heart disease, cancer, aging and neurodegenerative disease.
AAAS joins more than 3000 organizations in urging Congress to avoid ‘devastating’ budget cuts
The American Association for the Advancement of Science (AAAS) has joined more than 3000 national, state, and local organizations in warning the U.S. Congress and President Barack Obama that automatic budget cuts set for January could have “devastating” effects on research, education, social services, security, and international relations.
The planned cuts threaten federal programs that “support economic growth and strengthen the safety and security of every American in every state and community across the nation,” the groups wrote in a 12 July letter to Congress. “We strongly urge a balanced approach to deficit reduction that does not include further cuts to [non-defense discretionary] programs, which have already done their part to reduce the deficit.”
The massive number of organizations represents the potential scope of the budget “sequestration,” which would reduce non-defense discretionary spending by approximately 8% across most federal programs starting in January 2013. (Sequestration would reduce defense spending by approximately 7.5% across its affected programs.) The sequestration was included in a 2011 agreement that raised the federal debt ceiling in exchange for a commitment to reduce the federal deficit by more than $1 trillion over the next decade.
The letter’s signers come from all 50 states and include veterans groups, church organizations, school districts, universities, research institutes, urban job training centers, hospitals and clinics, and nonprofit art centers.
If the sequestration is allowed to take effect, they wrote in the letter, the impact will be immediate and widespread. “There will be fewer scientific and technological innovations, fewer teachers in classrooms, fewer job opportunities, fewer National Park visitor hours, fewer air traffic controllers and airport screeners, fewer food and drug inspectors, and fewer first responders.”
In a 10 July op-ed, Office of Management and Budget Acting Director Jeffrey Zients said 700,000 young children and mothers would lose nutrition assistance if the sequestration goes into effect. An estimated 100,000 children would lose places in Head Start educational programs, and more than 25,000 teachers and aides would lose their jobs as a result of the budget cuts. A report by Research!America suggests that agencies like the National Cancer Institute and the Food and Drug Administration would lose “critical funding” if the sequestration cuts occur.
Other experts have suggested that some workers who depend on federal contracts could receive layoff notices as early as October 2012, in anticipation of the sequestration, if lawmakers cannot agree on some combination of revenue increases and spending cuts to close a deficit-reduction gap of up to $1.2 trillion over the next decade.
The sequestration as written could amount to cuts of $5 billion or more to federal research and development investments next year, according to a AAAS analysis.
“Non-defense discretionary spending includes virtually all R&D at agencies like the National Science Foundation, the National Institutes of Health, and NASA,” said Matt Hourihan, director of the AAAS R&D Budget and Policy Program. “This spending has already been cut significantly as a result of the debt ceiling agreement. Adding further cuts to those already enacted would severely impact the national research enterprise.
“Public investments in research, especially basic research, are at the core of economic performance and international competitiveness,” Hourihan added, “but these investments have already declined in real dollars over the past decade.”
As the letter to Congress notes, the non-defense discretionary budget, including federal R&D, represented only 3.4% of the gross domestic product (GDP) in 2011. Under the recent deficit reduction agreement, this budget will decline even further, to 2.5% of GDP by 2021.
“NDD [non-defense discretionary] programs are not the reason behind our growing debt,” the letter states. “In fact, even completely eliminating all NDD programs would still not balance the budget. Yet NDD programs have borne the brunt of deficit reduction efforts.”
Even as many experts call for a more balanced approach that prevents further cuts to non-defense spending, some in Congress are attempting to move in the opposite direction. Several proposals—including the budget resolution approved by House Republicans in March—could have even more severe implications for R&D, by shifting the burden of cuts away from defense spending and entirely onto non-defense spending.
According to AAAS estimates, such a move could cut non-defense R&D by up to 27%, or $161 billion, over the next decade. Although the House proposal is unlikely to succeed in the Senate, Hourihan said support for cuts of this magnitude persists in some quarters.
“When we look at these proposals, we’re talking about eliminating a quarter of public non-defense research,” he said. “We recognize the need to pursue a responsible budget, but we also need to be responsible stewards of our economic future.”
Vitamin E may lower liver cancer risk
High consumption of vitamin E either from diet or vitamin supplements may lower the risk of liver cancer, according to a study published July 17 in the Journal of the National Cancer Institute.
Liver cancer is the third most common cause of cancer mortality in the world, the fifth most common cancer found in men and the seventh most common in women. Approximately 85% of liver cancers occur in developing nations, with 54% in China alone. Some epidemiological studies have been done to examine the relationship between vitamin E intake and liver cancer; however, the results have been inconsistent.
To determine the relationship between vitamin E intake and liver cancer risk, Wei Zhang, M.D., M.P.H., of the Shanghai Cancer Institute, Renji Hospital, Shanghai Jiaotong University School of Medicine, and colleagues analyzed data from a total of 132,837 individuals in China who were enrolled in the Shanghai Women’s Health Study (SWHS) from 1997-2000 or the Shanghai Men’s Health Study (SMHS) from 2002-2006, two population-based cohort studies jointly conducted by the Shanghai Cancer Institute and Vanderbilt University. Using validated food-frequency questionnaires, the researchers conducted in-person interviews to gather data on study participants’ dietary habits. They compared liver cancer risk among participants who had high intake of vitamin E with those with low intake.
The analysis included 267 liver cancer patients (118 women and 149 men) who were diagnosed between 2 years after study enrollment and an average of 10.9 (SWHS) or 5.5 (SMHS) years of follow-up. Vitamin E intake from diet and vitamin E supplement use were both associated with a lower risk of liver cancer. This association was consistent among participants with and without self-reported liver disease or a family history of liver cancer. “We found a clear, inverse dose-response relation between vitamin E intake and liver cancer risk,” the authors write, noting a small difference between men and women in the risk estimate, which is likely attributable to fewer liver cancer cases having occurred among SMHS participants due to the shorter follow-up period. Overall, the take-home message is that “high intake of vitamin E either from diet or supplements was related to lower risk of liver cancer in middle-aged or older people from China.”
Mammography screening shows limited effect on breast cancer mortality in Sweden
Breast cancer mortality statistics in Sweden are consistent with studies that have reported that screening has limited or no impact on breast cancer mortality among women aged 40-69, according to a study published July 17 in the Journal of the National Cancer Institute.
Since 1974, Swedish women aged 40-69 have increasingly been offered mammography screening, with nationwide coverage peaking in 1997. Researchers set out to determine if mortality trends would be reflected accordingly.
In order to determine this, Philippe Autier, M.D., of the International Prevention Research Institute (iPRI) in France and colleagues looked at data from the Swedish Board of Health and Welfare from 1960-2009 to analyze trends in breast cancer mortality in women aged 40 and older by the county in which they lived. The researchers compared actual mortality trends with the theoretical outcomes using models in which screening would result in mortality reductions of 10%, 20%, and 30%.
The researchers expected that screening would be associated with a gradual reduction in mortality, especially because Swedish mammography trials and observational studies have suggested that mammography leads to a reduction in breast cancer mortality. In this study, however, they found that breast cancer mortality rates in Swedish women started to decrease in 1972, before the introduction of mammography, and have continued to decline at a rate similar to that in the prescreening period. “It seems paradoxical that the downward trends in breast cancer mortality in Sweden have evolved practically as if screening had never existed,” they write. “Swedish breast cancer mortality statistics are consistent with studies that show limited or no impact of screening on mortality from breast cancer.”
The researchers do note certain limitations of their study—namely, that it was observational and so was unable to take into account the potential influence of other breast cancer risk factors such as obesity, which may have masked the effect of screening on mortality. They also write that population mobility may have biased the results.
In an accompanying editorial, Nereo Segnan, M.D., MSc Epi, of the Unit of Cancer Epidemiology, CPO Piemonte, at ASO S Giovanni Battista University Hospital in Italy, and colleagues write that, in assessing the efficacy of the introduction of screening, the paradox is that descriptive analyses of time trends in breast cancer mortality rates are being used to confute the results of incidence-based mortality studies (which employ individual data and were conceived to overcome some of their limitations) or of randomized trials.
The conclusion by Autier et al. that the 37% decline in breast cancer mortality in Sweden was not associated with breast cancer screening therefore seems difficult to justify and partially unsupported by the data (two groups of Swedish counties do show a mortality decrease that, according to the stated criteria, could be linked to screening).
They also feel that “it is time to move beyond an apparently never-ending debate on at what extent screening for breast cancer in itself conducted in the seventies through the nineties of the last century has reduced mortality for breast cancer, as if it was isolated from the rest of health care …. The presence of an organized screening program may have promoted the provision of more effective care by monitoring the treatment quality of screen-detected cancers and by favoring the creation of multidisciplinary units of breast cancer specialists”.
In another accompanying editorial, Michael W. Vannier, M.D., of the Department of Radiology at the University of Chicago Medical Center, writes that it is hard to see mortality reduction as a screening benefit because factors such as the natural history of the disease, the frequency of screening, and the duration of follow-up may misrepresent the time patterns in the mortality reductions. “We know that isolating screening as an evaluable entity using death records fails to reveal major benefits,” he writes, adding that even if screening were 100% effective, the number of deaths may remain unchanged. Still, he feels that without a better alternative, mammography screening will continue to be used. “As our tools improve, we can begin to fully realize the promise of breast cancer screening to arrest this dread disease at its earliest stage with the least morbidity and cost.”
Workplace exposure to organic solvents linked to heart defects at birth
Study backs up previous research findings
Workplace exposure to organic solvents is linked to several types of heart defects at birth, indicates research published online in Occupational and Environmental Medicine.
Organic solvents are widely used for dissolving or dispersing substances, such as fats, oils, and waxes, as well as in chemical manufacturing. They are found in paints, varnishes, adhesives, degreasing/cleaning agents, dyes, polymers, plastic, synthetic textiles, printing inks and agricultural products.
Most organic solvents are highly volatile and enter the body through the lungs, but can also enter through the mouth and skin.
Industrial hygienists assessed the levels of workplace exposure to organic solvents in 5000 women from across the US, from one month before conception through to the first three months of pregnancy (first trimester).
All their babies were delivered between 1997 and 2002, and included stillbirths and pregnancy terminations. All the women were taking part in the National Birth Defects Prevention Study, an ongoing population based study that is exploring risk factors for birth defects.
The authors looked for associations between 15 categories of congenital heart defects and exposure to types of organic solvents known to be relatively common in the workplace. These included chlorinated solvents, aromatic solvents, and Stoddard solvent, a mixture of C10 and higher hydrocarbons.
The levels of exposure were measured according to two approaches: an expert consensus-based approach and an approach based on the published evidence.
The expert consensus approach indicated that around 4% of mothers whose babies did not have birth defects, and 5% of those who did, had been exposed to an organic solvent at about the time they were trying to conceive or early in pregnancy.
This increased to 8% and 10%, respectively, using the published evidence approach.
According to the expert consensus approach, two types of congenital heart defects were associated with exposure to any solvent and to chlorinated solvents, although these associations were only of borderline significance.
The published evidence approach indicated several additional associations between congenital heart defects and exposure to organic solvents.
The authors conclude that their results suggest that exposure to organic solvents in the period from one month before conception to early pregnancy is a potential risk factor for several types of heart defects at birth.
Some of their findings back up those of other researchers, while the rest are new, they say. But they caution: “Despite the strengths of this analysis, the results do not allow for the drawing of definitive conclusions on specific exposure-congenital heart defect combinations.”
Widely prescribed MS treatment may not slow progression of disease: VCH-UBC research
Researchers with the UBC Hospital MS Clinic and Brain Research Centre at Vancouver Coastal Health and the University of British Columbia have published important data in the Journal of the American Medical Association (JAMA) about the impact of a common drug therapy on the progression of multiple sclerosis for people with the relapsing‑remitting form of the disease.
The study, led by Drs. Helen Tremlett, Afsaneh Shirani, Joel Oger and others, shows no strong evidence that a group of drugs, beta interferons (β-IFNs), prescribed to treat MS had a measurable impact on the long-term disability progression of the disease.
The team examined the linked health records of 2656 BC patients between 1985 and 2008 in a retrospective cohort study, which means data from already collected sources were linked together in an anonymized form and studied. Data sources included the BC Ministry of Health, PharmaNet and the BC Multiple Sclerosis (BCMS) database, facilitated by Population Data BC.
The study population included patients with MS who were treated with beta interferons (β-IFNs), the most widely used treatment for relapsing‑remitting MS, as well as untreated MS patients. The research team discovered that administration of β-IFN was not associated with a significant change in the progression of disability.
These findings will be of interest to MS patients with this form of the disease, but researchers are quick to point out that this is just one measure of these disease-modifying drugs, and there is still potentially significant benefit to patients.
“What this study provides is additional information to patients and clinicians about the longer term effect of this class of drugs,” says corresponding author, Dr. Helen Tremlett (PhD), who also holds the Canada Research Chair in Neuroepidemiology and Multiple Sclerosis at UBC. “We know that this class of drugs is very helpful in reducing relapses, which can be important to patients. We do not recommend that patients stop taking these medications, but these findings provide evidence, allowing more realistic expectations as to the anticipated benefits associated with drug treatment from the disability perspective.”
“It is still possible that some patients gain long-term benefit from β-IFNs. We are currently working toward identifying who those potential treatment responders might be,” says Dr. Afsaneh Shirani, who is the first author of the paper and a post-doctoral research fellow in the UBC Faculty of Medicine and the Brain Research Centre at UBC and the VCH Research Institute. “Our study also encourages the investigation of novel treatments for MS,” she adds.
“In addition, this study suggests that linked data from health administrative databases have enormous potential for research applications, despite all the challenges of record linkage,” says Dr. Shirani.
Relapsing-remitting MS is characterized by relapses or “flare-ups” during which time new symptoms can appear or old ones can resurface or worsen. The relapses are followed by periods of remission, during which time the person can fully or partially recover. Relapsing-remitting MS is the most common form of MS, affecting around 85% of MS patients in Canada.
“In clinical trial situations, it has been quite evident for years that patients receiving β-IFN treatment have reduced frequency of relapses as well as reduced frequency of new lesions seen on MRI,” says Dr. Joel Oger, who is also a neurologist with the UBC Hospital MS Clinic. “This study, following a large number of patients for a long time in a ‘real-life situation,’ does not show an association of the β-IFNs with long-term disability and tends to confirm a more modern way of understanding MS: relapses may not be responsible for long-term disability in all patients, and another mechanism might be at work as well.”
The research team is preparing for future studies further examining this and other classes of disease modifying drugs. The hope is that the research will ultimately lead to an individualized approach to the treatment of MS.
Marijuana use doubles risk of premature birth
A large international study led by University of Adelaide researchers has found that women who use marijuana can more than double the risk of giving birth to a baby prematurely.
Preterm or premature birth – at least three weeks before a baby’s due date – can result in serious and life-threatening health problems for the baby, and an increased risk of health problems in later life, such as heart disease and diabetes.
A study of more than 3000 pregnant women in Adelaide, Australia and Auckland, New Zealand has detailed the most common risk factors for preterm birth. The results have been published online today in the journal PLoS ONE.
The research team, led by Professor Gus Dekker from the University of Adelaide’s Robinson Institute and the Lyell McEwin Hospital, found that the greatest risks for spontaneous preterm birth included:
Strong family history of low birth weight babies (almost six times the risk);
Use of marijuana prior to pregnancy (more than double the risk);
Having a mother with a history of pre-eclampsia (more than double the risk);
Having a history of vaginal bleeds (more than double the risk);
Having a mother with diabetes type 1 or 2 (more than double the risk).
The team also found that the greatest risk factors involved in the preterm rupture of membranes leading to birth included:
Mild hypertension not requiring treatment (almost 10 times the risk);
Family history of recurrent gestational diabetes (eight times the risk);
Receiving some forms of hormonal fertility treatment (almost four times the risk);
Having a body mass index of less than 20 (more than double the risk).
“Our study has found that the risk factors for both forms of preterm birth vary greatly, with a wide variety of health conditions and histories impacting on preterm birth,” says Professor Dekker, who is the lead author of the study.
“Better understanding the risk factors involved in preterm birth moves us another step forward in potentially developing a test – genetic or otherwise – that will help us to predict with greater accuracy the risk of preterm birth. Our ultimate aim is to safeguard the lives of babies and their health in the longer term,” he says.
Study suggests moderate drinking lowers risk of developing rheumatoid arthritis in women
A follow-up study of more than 34,000 women in Sweden has shown that moderate drinkers, in comparison with abstainers, were at significantly lower risk of developing rheumatoid arthritis (RA), an often serious and disabling type of arthritis. RA is known to relate to inflammation, and it is thought that this inflammation is blocked to some degree by the consumption of alcohol. In this study, women who consumed at least 4 drinks per week (with a drink being defined as containing 15 grams of alcohol) had a 37% lower risk of developing RA than subjects reporting never drinking or consuming less than 1 drink per week.
This large study is important as few prospective studies are of adequate size to have sufficient cases of RA to evaluate factors related to its development. The study supports previous research showing a lower risk of developing RA, or milder severity of the disease, among moderate drinkers than among abstainers.
Genetic link to rapid weight gain from antipsychotics discovered
For Immediate Release – July 17, 2012 – Toronto – Scientists have discovered two genetic variants associated with the substantial, rapid weight gain occurring in nearly half the patients treated with antipsychotic medications, according to two studies involving the Centre for Addiction and Mental Health (CAMH).
These results could eventually be used to identify which patients have the variations, enabling clinicians to choose strategies to prevent this serious side-effect and offer more personalized treatment.
“Weight gain occurs in up to 40 per cent of patients taking medications called second-generation or atypical antipsychotics, which are used because they’re effective in controlling the major symptoms of schizophrenia,” says CAMH Scientist Dr. James Kennedy, senior author on the most recent study published online in the Archives of General Psychiatry.
This weight gain can lead to obesity, type 2 diabetes, heart problems and a shortened life span. “Identifying genetic risks leading to these side-effects will help us prescribe more effectively,” says Dr. Kennedy, head of the new Tanenbaum Centre for Pharmacogenetics, which is part of CAMH’s Campbell Family Mental Health Research Institute. Currently, CAMH screens for two other genetic variations that affect patients’ responses to psychiatric medications.
Each study identified a different variation near the melanocortin-4 receptor (MC4R) gene, which is known to be linked to obesity.
In the Archives of General Psychiatry study, people carrying two copies of a variant gained about three times as much weight as those with one or no copies, after six to 12 weeks of treatment with atypical antipsychotics. (The difference was approximately 6 kg versus 2 kg.) The study had four patient groups: two from the U.S., one in Germany and one from a larger European study.
“The weight gain was associated with this genetic variation in all these groups, which included pediatric patients with severe behaviour or mood problems, and patients with schizophrenia experiencing a first episode or who did not respond to other antipsychotic treatments,” says CAMH Scientist Dr. Daniel Müller. “The results from our genetic analysis combined with this diverse set of patients provide compelling evidence for the role of this MC4R variant. Our research group has discovered other gene variants associated with antipsychotic-induced weight gain in the past, but this one appears to be the most compelling finding thus far.”
Three of the four groups had never previously taken atypical antipsychotics. Different groups were treated with drugs such as olanzapine, risperidone, aripiprazole or quetiapine, and compliance was monitored to ensure the treatment regimen was followed. Weight and other metabolic-related measures were taken at the start and during treatment.
A genome-wide association study was conducted on pediatric patients by the study’s lead researcher, Dr. Anil Malhotra, at the Zucker Hillside Hospital in Glen Oaks, NY. In this type of study, variations are sought across a person’s entire set of genes to identify those associated with a particular trait. The result pointed to the MC4R gene.
This gene’s role in antipsychotic-induced weight gain had been identified in a CAMH study published earlier this year in The Pharmacogenomics Journal, involving Drs. Müller and Kennedy, and conducted by PhD student Nabilah Chowdhury. They found a different variation on MC4R that was linked to the side-effect.
For both studies, CAMH researchers did genotyping experiments to identify the single changes to the sequence of the MC4R gene – known as single nucleotide polymorphisms (SNPs) – related to the drug-induced weight gain side-effect.
The MC4R gene encodes a receptor involved in the brain pathways regulating weight, appetite and satiety. “We don’t know exactly how the atypical antipsychotics disrupt this pathway, or how this variation affects the receptor,” says Dr. Müller. “We need further studies to validate this result and eventually turn this into a clinical application.”
The CAMH researchers were supported by a Canadian Institutes of Health Research (CIHR) grant and a NARSAD grant from the U.S. Brain and Behavior Fund.
Poisoning from industrial compounds can cause similar effects to ALS
Researchers from IDIBELL at the University of Barcelona (UB) have coordinated a study of how the nitrile IDPN causes neurological syndromes similar to those of amyotrophic lateral sclerosis (ALS), a severe neuromuscular degenerative disease.
Researchers from the Bellvitge Biomedical Research Institute (IDIBELL) at the University of Barcelona (UB) have coordinated a study of how the nitrile IDPN causes neurological syndromes similar to those of amyotrophic lateral sclerosis (ALS), a severe neuromuscular degenerative disease. The study, led by Jordi Llorens, was recently published in the journal Neuropathology and Applied Neurobiology.
Nitriles, chemical compounds containing the cyano (-CN) group, are ubiquitous in nature and have diverse applications in industry. In nature they appear as cyanogenic glycosides, for example in bitter almonds, and as aminonitriles in some legumes. In industry they are used as solvents and as intermediates in the synthesis of plastics, synthetic fibers, resins and pharmaceutical products, among others. The consumption of certain nitriles by humans or animals can cause symptoms similar to cyanide poisoning, which suggests that the release of cyanide is responsible for the acute intoxication. Some nitriles release less cyanide or do so more slowly, causing neurotoxicity and neurological syndromes instead.
Abnormal accumulations of neurofilaments, fibers that confer stiffness on neurons, appear in a variety of diseases of the nervous system. In amyotrophic lateral sclerosis (ALS), specifically, protrusions formed by large numbers of neurofilaments are observed in the axons of motor neurons.
In the study, researchers observed accumulations of neurofilaments strikingly similar to those occurring in ALS in laboratory rats exposed to IDPN (3,3′-iminodipropionitrile). Because of this similarity, the researchers have studied the effect of IDPN to understand the biology of neurofilaments in ALS. Both the disease and IDPN poisoning cause axonopathy, an injury to axons. The novelty of this study is the observation that the axonopathy causes a marked loss of neurofilaments in the terminals of motor neurons. This observation is relevant because the retraction of the terminals is the critical factor in the degeneration of the motor neuron.
The research coordinator explained that “the lack of neurofilaments in the area of the neuromuscular junction can have an impact on the function or stability of the junction, contributing to its retraction and the subsequent regeneration of the neuron”.
Potential clinical applications
Given the similarity between the proximal accumulations of neurofilaments in the IDPN animal model and those in ALS, the researchers predict that the neuromuscular junctions of patients with amyotrophic lateral sclerosis will show a lack of neurofilaments similar to that observed in the animal model. The loss of neurofilaments at the terminals could be a pathogenic factor in the disease.
It is increasingly clear that ALS is a multifactorial disease in which genetic and environmental influences cause proximal axonopathy. These factors could alter the neuromuscular junction through the neurofilament-emptying mechanism observed in IDPN intoxication. In addition, the data indicate that IDPN could be a useful tool for inclusion in multifactorial animal models of the disease.
The study was presented at the symposium “Mechanisms of Neurotoxicity and Implications for Neurological Disorders”, a satellite event of the congress of the Federation of European Neuroscience Societies (FENS) held last Friday in Barcelona.
Article Reference
Soler-Martin C, Vilardosa U, Saldana-Ruiz S, Garcia N, Llorens J. Loss of neurofilaments in the neuromuscular junction in a rat model of proximal axonopathy. Neuropathology and Applied Neurobiology 2012 Feb;38(1):61-71.
UVC Light Kills Wound Bacteria
Ultraviolet C (UVC) light can eradicate wound-infecting bacteria in mice, increasing both survival and healing rates, according to a paper in the July 2012 issue of Antimicrobial Agents and Chemotherapy. The light did not damage the animals’ skin or delay wound healing, says principal investigator Michael R. Hamblin of the Massachusetts General Hospital and Harvard Medical School, Boston, MA.
Skin infections range from the superficial to the life-threatening, the latter being rare except among immunocompromised patients. However, “…these infections are becoming worrisome due to bacterial resistance to conventional antibiotics,” the researchers write.
Unlike with antibiotics, bacteria probably cannot develop complete resistance to UVC light, “although it is possible that variants with enhanced DNA repair systems may emerge,” the investigators note, adding that only four times more radiation would be needed to decimate Deinococcus radiodurans, a species that is famous for its radiation resistance, than in the case of E. coli.
In the study, the investigators infected the mice with bioluminescent strains of gram-negative Pseudomonas aeruginosa, and Staphylococcus aureus, the former “noted for its invasive properties in mouse wound models,” according to the report. The dimming of the bioluminescence—down to near zero—indicated the fate of the infective bacteria. The mice were exposed to UVC light 30 minutes after inoculation.
For both bacteria, UVC treatment reduced bacterial contamination of wounds 10-fold compared with untreated mice. In addition, treatment increased the survival rate of mice infected with P. aeruginosa and the wound healing rate in mice infected with S. aureus.
“These results suggested that UVC light may be used for the prophylaxis of cutaneous wound infections,” write the researchers.
(T. Dai, B. Garcia, C.K. Murray, M.S. Vrahas, and M.R. Hamblin, 2012. UVC light prophylaxis for cutaneous wound infections in mice. Antimicrob. Agents Chemother. 56:3841-3848.)
Download the journal article at: http://bit.ly/asm0712a
Copper Surfaces Could Reduce Hospital Acquired Infections
Research from the Medical University of South Carolina suggests that adding copper to hospital surfaces which are commonly touched by medical personnel and patients could help reduce the risk of hospital-acquired infections. The findings appear in the July 2012 issue of the Journal of Clinical Microbiology.
Hospital-acquired infections kill around 100,000 people annually in the United States—equivalent to a wide-body jet crash every day of the year. About five percent of patients admitted to US hospitals—nearly 5,500 daily, or two million annually—get sick from the hospital, adding $45 billion ($45,000,000,000) to the annual cost of healthcare.
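As a back-of-the-envelope consistency check implied by the figures above (the arithmetic is not stated in the report itself), roughly 5,500 infections per day works out to about two million per year, which at a five percent infection rate corresponds to on the order of 40 million hospital admissions annually:
\[ 5{,}500 \times 365 \approx 2.0 \times 10^{6}\ \text{infections per year}, \qquad \frac{2.0 \times 10^{6}}{0.05} \approx 4.0 \times 10^{7}\ \text{admissions per year} \]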
In this study, the microbial burden on commonly touched surfaces in the medical intensive care units of three hospitals was determined, first to assess the risk from those surfaces, and second, to determine whether or not copper surfacing would lower that burden, and those risks. The study was divided into two phases, pre- and post-copper, and lasted for 43 months.
During the pre-copper phase, “We learned that the average microbial burden found on six commonly touched objects was 28 times higher than levels considered benign, and thus represented a risk to the patient,” says Michael Schmidt, a researcher on the study. Installing copper surfaces, he says, resulted in an 83 percent reduction of that microbial burden, leading the team to conclude that copper surfaces on commonly touched objects could provide a substantially safer environment.
“Given that the average hospital-acquired infection in the United States conservatively adds an additional 19 days of hospitalization and $43,000 in costs, the use of antimicrobial copper surfaces warrants further study and optimization,” says Schmidt, adding that this is the fourth leading cause of death, after cancer, heart disease, and strokes. He notes that “Copper has been used by humans for millennia, first as tools and then as a tool to fight the spread of infectious agents.”
(M.G. Schmidt, H.H. Attaway, P.A. Sharpe, J. John, Jr., K.A. Sepkowitz, A. Morgan, S.E. Fairey, S. Singh, L.L. Steed, J.R. Cantey, K.D. Freeman, H.T. Michels, and C.D. Salgado, 2012. Sustained reduction of microbial burden on common hospital surfaces through introduction of copper. J. Clin. Microbiol. 50:2217-2223.)
Download the journal article at: http://bit.ly/asm0712c
Stanford chemists synthesize compound that flushes out latent HIV
A new collection of compounds called “bryologs,” derived from a tiny marine organism, activates the hidden reservoirs of the virus that currently make the disease nearly impossible to eradicate.
By Max McClure
Thanks to antiretrovirals, an AIDS diagnosis hasn’t been a death sentence for nearly two decades. But highly active antiretroviral therapy, or HAART, is also not a cure.
Patients must adhere to a demandingly regular drug regimen that carries plenty of side effects. And while the therapy may be difficult to undergo in the United States, it is nearly impossible to scale to the AIDS crisis in the developing world.
The problem with HAART is that it doesn’t address HIV’s so-called proviral reservoirs – dormant forms of the virus that lurk within T-cells and other cell types. Even after all of the body’s active HIV has been eliminated, a missed dose of antiretroviral drugs can allow the hibernating virus to emerge and ravage its host all over again.
“It’s really a two-target problem,” said Stanford chemistry Professor Paul Wender, “and no one has successfully targeted the latent virus.”
But Wender’s lab is getting closer, exciting many HIV patients hoping for a cure.
The lab has created a collection of “bryologs” designed after a naturally occurring, but difficult to obtain, molecule. The new compounds have been shown to activate latent HIV reservoirs with equal or greater potency than the original substance. The lab’s work may give doctors a practical way to flush out the dormant virus.
The findings are set to be published July 15 in the journal Nature Chemistry.
Nature’s medicine
The first attempts to reactivate latent HIV were inspired by observations of Samoan healers. When ethnobotanists examined the bark of Samoa’s mamala tree, traditionally used by healers to treat hepatitis, they found a compound known as prostratin.
Prostratin binds to and activates protein kinase C, an enzyme that forms part of the signaling pathway that reactivates latent viruses. The discovery sparked interest in the enzyme as a potential therapeutic target, especially as it was discovered that prostratin isn’t the only biomolecule to bind to the kinase.
The bryozoan Bugula neritina – a mossy, colonial marine organism – produces a protein kinase C-activating compound that is many times more potent than prostratin. The molecule, named bryostatin 1, was deemed to hold promise as a treatment, not only for HIV but for cancer and Alzheimer’s disease as well.
The National Cancer Institute initiated a Phase II clinical trial for the compound in 2009 for the treatment of non-Hodgkin lymphoma. But the substance had a number of side effects and proved prohibitively difficult to produce.
“It took 14 tons of bryozoans to make 18 grams of bryostatin,” said Wender. “They’ve stopped accrual in trials because, even if the trials worked, the compound cannot be currently supplied.”
Patient enrollment was suspended until more accessible compounds came out of the Wender Group’s lab.
A synthetic approach
Wender, who published the first practical synthesis of prostratin and its analogs in 2008, had set out to make a simpler, more effective synthetic analog of bryostatin.
“We can copy the molecule,” he said, “or we can learn how it works and use that knowledge to create something that has never existed in nature and might be superior to it.”
The seven resulting compounds, called bryologs, share two fundamental features with the original bryostatin: the recognition domain, which directly contacts protein kinase C, and the spacer domain, which allows the bryolog-protein kinase C complex to be inserted into the cell membrane.
The researchers tested the new compounds’ ability to reactivate viral reservoirs in J-Lat cell lines, which contain latent HIV and begin to fluoresce when they express the virus.
In the J-Lat line, bryologs induced the virus in as many cells as bryostatin, or more, at a variety of concentrations, and were 25 to 1,000 times more potent than prostratin. The compounds showed no toxic effects.
Bryolog testing remains in the early stages – the researchers are currently conducting in vivo studies in animal models. But practical bryostatin substitutes may be the first step toward true HIV-eradication therapy.
“I receive letters on a regular basis from people who are aware of our work – who are not, so far as I know, scientifically trained, but do have the disease,” said Wender. “The enthusiasm they express is pretty remarkable. That’s the thing that keeps me up late and gets me up early.”
The research was supported by the National Institutes of Health.
Primary authors are Stanford chemistry graduate student Brian Loy and doctoral students Brian DeChristopher and Adam Schrier, in collaboration with Professor Jerry Zack, co-director of the UCLA AIDS Center, and Dr. Matthew Marsden from the UCLA School of Medicine.
Parental consent for HPV vaccine should not be waived, poll says
Most adults say parents should be involved in decision for adolescents to get the vaccination that protects against genital warts, cervical cancer
ANN ARBOR, Mich. – Most U.S. adults support laws that allow teens to get medical care for sexually transmitted infections without parental consent. But when asked about the vaccine against the human papillomavirus (HPV), most adults want parents to have the final say on whether their teen or pre-teen gets the shots.
The University of Michigan C.S. Mott Children’s Hospital National Poll on Children’s Health recently asked a national sample of adults about allowing adolescents aged 12 to 17 to receive the HPV vaccination without parental consent.
Only 45 percent of those polled would support state laws allowing the HPV vaccination without parental consent.
“But in contrast, 57 percent say they support teens being able to get medical care for prevention of sexually transmitted infections and 55 percent for treatment, all without parental consent,” says Sarah Clark, M.P.H., Associate Director of the Child Health Evaluation and Research (CHEAR) Unit at the University of Michigan and Associate Director of the National Poll on Children’s Health.
In the short term, the HPV vaccine protects against genital warts, one of the most common types of sexually transmitted infection. In the long term, the vaccine prevents development of cervical cancer in females and some head and neck cancers in men.
Routine HPV vaccination is recommended for males and females at 11-12 years of age. The vaccine is most effective if administered before the onset of sexual activity.
“That presents a challenge. Parents aren’t thinking their 11 or 12 year-old child is ready for sexual activity at that age,” Clark says. “Many parents ask to delay the vaccine until their child is a little older. But older teens go to the doctor much less than younger adolescents, and often they go without a parent.”
Public health officials have considered pushing laws that would drop the need for parental consent, in order to boost HPV vaccination rates.
“But in this poll, most agreed they are reluctant to support dropping parental consent, even though 74 percent agreed that getting vaccines is a good way to protect adolescents from disease,” Clark says.
Those who did not support dropping parental consent were asked about their reasons. The most common reason, cited by 86 percent, was that HPV should be a parent’s decision; 43 percent cited the risk of side effects of the vaccine. About 40 percent said they have moral or ethical concerns about the vaccine.
The support for state laws that would allow HPV vaccination without parental consent was not different between parents and non-parents.
“These poll results show the majority of adults view HPV vaccination as distinct from sexually transmitted infection prevention and are reluctant to support taking away parental consent,” Clark says.
“Policymakers and public health officials interested in changing parental consent rules should consider this data and provide education to ensure adults understand the importance of HPV vaccination as a form of prevention against sexually transmitted infections.”
Botanical compound could prove crucial to healing influenza
Virginia Tech researchers have discovered that abscisic acid has anti-inflammatory effects in the lungs as well as in the gut
Building on previous work with the botanical compound abscisic acid, researchers in the Nutritional Immunology and Molecular Medicine Laboratory (NIMML) have discovered that abscisic acid has anti-inflammatory effects in the lungs as well as in the gut. The results will be published in the Journal of Nutritional Biochemistry.
“While the immune effects of abscisic acid are well understood in the gut, less was known about its effects in the respiratory tract. We’ve shown definitively that not only does abscisic acid ameliorate disease activity and lung inflammatory pathology, it also aids recovery and survival in influenza-infected mice,” said Raquel Hontecillas, Ph.D., study leader, assistant professor of immunology at Virginia Bioinformatics Institute, and co-director of NIMML.
Influenza accounts for anywhere from 3,000 to 49,000 deaths per year in the United States alone, according to the Centers for Disease Control. It is difficult to treat if not caught immediately; antivirals usually become ineffective after the virus incubation period has passed and resistance to antiviral drugs poses a serious public health problem in the face of outbreaks. Abscisic acid, however, has been shown to be most effective at about seven to ten days into the infection, targeting the immune response rather than the virus itself, which many researchers feel is a safer way to reduce flu-associated fatalities.
“Most drugs for respiratory infections target the virus itself, rather than the inflammatory responses caused by the virus. Abscisic acid activates peroxisome proliferator-activated receptor-gamma, a receptor that aids in reducing inflammation, through a newly identified pathway, but it does so without the side effects of other agonists like thiazolidinediones, which are known to have strong adverse side effects. The development of complementary and alternative medicine approaches that modulate the host response has great promise in decreasing respiratory damage caused by influenza or other respiratory pathogens,” said Josep Bassaganya-Riera, Ph.D., director of NIMML and professor of nutritional immunology at the Virginia Bioinformatics Institute.
From this and previous research, it’s clear that abscisic acid could offer a novel way to combat inflammatory disease, both in the gut and the respiratory tract. By using host-targeted strategies to mediate disease, alternate pathways can be established to activate immune responses without the deadly side effects of many drugs currently on the market.
PSU study finds ‘caffeinated’ coastal waters
Possible sources include sewer overflows, septic tanks
A new study finds elevated levels of caffeine at several sites in Pacific Ocean waters off the coast of Oregon—though not necessarily where researchers expected.
This study is the first to look at caffeine pollution off the Oregon coast. It was developed and conducted by Portland State University master’s student Zoe Rodriguez del Rey and her faculty adviser Elise Granek, assistant professor of Environmental Science and Management, in collaboration with Steve Sylvester of Washington State University, Vancouver.
In spring 2010, Rodriguez del Rey and Granek collected and analyzed samples from 14 coastal locations and seven adjacent water bodies as far north as Astoria, Ore., and as far south as Brookings.
Locations were identified as potentially polluted if they were near wastewater treatment plants, large population centers or rivers and streams emptying into the ocean.
The study found high caffeine levels near Carl Washburne State Park (Florence, Ore.) and Cape Lookout, two areas not near the potential pollution sources, yet low levels of caffeine near large population centers like Astoria/Warrenton and Coos Bay.
High levels were also found following a late-season storm of wind and rain that triggered sewer overflows.
Results of the study were published in the July 2012 Marine Pollution Bulletin, “Occurrence and concentration of caffeine in Oregon coastal waters.”
The results seem to indicate that wastewater treatment plants are effective at removing caffeine, but that high rainfall and combined sewer overflows flush the contaminants out to sea. The results also suggest that septic tanks, such as those used at the state parks, may be less effective at containing pollution.
“Our study findings indicate that, contrary to our prediction, the waste water treatment plants are not a major source of caffeine to coastal waters,” says Granek. “However, onsite waste disposal systems may be a big contributor of contaminants to Oregon’s coastal ocean and need to be better studied to fully understand their contribution to pollution of ocean waters.”
Caffeine is found in many food and beverage products as well as some pharmaceuticals, and caffeine pollution is directly related to human activity (although many plant species produce caffeine, there are no natural sources of the substance in the Northwest). The presence of caffeine may also signal additional anthropogenic pollution, such as pesticides, pharmaceuticals and other contaminants.
Even “elevated levels” of caffeine are measured in nanograms per liter, well below a lethal dose for marine life. However, an earlier study by Rodriguez del Rey and Granek on intertidal mussels showed that caffeine at the levels measured in the current study can still have an effect, despite the lower doses.
“We humans drink caffeinated beverages because caffeine has a biological effect on us—so it isn’t too surprising that caffeine affects other animals, too,” says Granek. Previous studies have found caffeine in other bodies of water around the world, including the North Sea, the Mediterranean, Puget Sound, Boston Harbor, and Sarasota Bay, Fla.
Study questions safety and effectiveness of common kidney disease drugs
Longest placebo-controlled trial of phosphate binders conducted to date challenges the drugs’ utility
Highlights
Phosphate binders, drugs commonly prescribed to patients with chronic kidney disease, may not be as effective as previously thought.
Phosphate binders may have negative effects on cardiovascular health.
Additional studies are needed on the safety and effectiveness of these drugs.
Washington, DC (July 19, 2012) — Drugs commonly prescribed to patients with chronic kidney disease (CKD) may not be as effective as once thought, and may cause unexpected harm to blood vessels, according to a study appearing in an upcoming issue of the Journal of the American Society of Nephrology (JASN). The findings indicate that additional studies on the drugs, called phosphate binders, are needed.
Higher blood levels of phosphorus that are still within the normal range have been linked with heart problems, kidney disease, and premature death. Because the kidneys get rid of excess phosphorus by excreting it through the urine, patients with CKD often have elevated blood phosphorus levels.
Drugs called phosphate binders can lower blood phosphorus levels, and while they are approved only for patients with kidney failure, they are often prescribed off-label to patients with CKD. Geoffrey Block, MD (Denver Nephrology) and his colleagues evaluated the effects of these drugs (calcium acetate, lanthanum carbonate, sevelamer carbonate) in patients with moderate to advanced CKD and normal or near normal blood phosphorus levels.
The study included 148 patients who were randomized to receive one of the three phosphate binders or a placebo. The investigators examined patients after three, six, and nine months of treatment. The study is the longest placebo-controlled trial of phosphate binders in patients with CKD conducted to date.
Treatment with phosphate binders significantly lowered patients’ urinary phosphorus levels, moderately lowered their blood phosphorus levels, and slowed progression of a parathyroid disorder that is a common complication of CKD, while treatment with placebo did not. Despite these positive effects, phosphate binders did not have any effect on the blood levels of a hormone that regulates phosphate excretion in the urine, and the drugs caused calcium build-up in blood vessels, which can lead to heart problems. Heart disease is the leading cause of death in patients with CKD.
These findings call into question the safety and effectiveness of phosphate binders in patients with CKD.
“While we continue to believe that serum, or blood, phosphorus is a key component of the increased cardiovascular risk associated with kidney disease, our results suggest the use of the currently approved phosphate binding drugs does not result in substantial reductions in serum phosphorus and may be associated with harm in this population,” said Dr. Block. “Future clinical trials should be conducted in all populations with adequate placebo controls and should address alternative or complementary methods to reduce serum phosphorus,” he added.
HPV improves survival for African-Americans with throat cancer
DETROIT – Even though the human papillomavirus (HPV) is a risk factor for certain head and neck cancers, its presence could make all the difference in terms of survival, especially for African Americans with throat cancer, say Henry Ford Hospital researchers.
According to their new study, HPV has a substantial impact on overall survival in African Americans with oropharyngeal cancer, a cancer that affects part of the throat, the base of the tongue, the tonsils, the soft palate (back of the mouth), and the walls of the pharynx (throat).
The study shows African Americans who are HPV positive have better outcomes than African Americans without HPV.
Further, African Americans who are HPV negative not only have poorer survival than African Americans with HPV; they also fared worse than both HPV-positive and HPV-negative Caucasians with oropharyngeal cancer.
“This study adds to the mounting evidence of HPV as a racially-linked sexual behavior lifestyle risk factor impacting survival outcomes for both African American and Caucasian patients with oropharyngeal cancer,” says lead author Maria J. Worsham, Ph.D., director of research in the Department of Otolaryngology-Head & Neck Surgery at Henry Ford.
Study results will be presented Sunday, July 22 at the 8th International Conference on Head & Neck Cancer in Toronto. The research was funded by a National Institutes of Health grant.
The American Cancer Society estimates that about 35,000 people in the U.S. will get oral cavity and oropharyngeal cancers in 2012; an estimated 6,800 people will die of these cancers.
As with other cancers of the head and neck, risk factors include smoking and alcohol consumption; HPV is also a risk factor for oropharyngeal cancer.
To compare survival outcomes in HPV positive and HPV negative African Americans with oropharyngeal cancer, Dr. Worsham and her team conducted a retrospective study of 118 patients.
Among the study group, 67 are HPV negative and 51 are HPV positive. Forty-two of those in the study are African American.
The study found that:
African Americans are less likely to be HPV positive
Those older than 50 are less likely to be HPV positive
Those with late-stage oropharyngeal cancer are more likely to be unmarried and more likely to be HPV positive
HPV negative patients had 2.9 times the risk of death as HPV positive patients
Overall, the HPV race groups differed with significantly poorer survival for HPV negative African Americans versus HPV positive African Americans, HPV positive Caucasians and HPV negative Caucasians
Moderate alcohol intake is associated with a lower risk of kidney cancer
A majority of previous epidemiologic studies have shown that moderate drinking is associated with a lower risk of kidney cancer, which may affect about 1% of the general population. In published prospective cohort studies, the risk for such cancer among moderate drinkers is usually about 25% less than the risk seen among non-drinkers.
This well-done meta-analysis supports these findings: for the more-reliable prospective cohort studies (rather than case-control studies) the current study finds a 29% lower risk for subjects in the highest category of alcohol consumption in comparison with subjects in the lowest alcohol category. The findings suggest similar effects among men and women, and for all types of alcohol beverages. The effects are seen at a level of about one drink/day, with little further reduction in risk for greater alcohol consumption.
Beneficial bacteria may help ward off infection
July 19, 2012
While many bacteria exist as aggressive pathogens, causing diseases ranging from tuberculosis and cholera to plague, diphtheria and toxic shock syndrome, others play a less malevolent role and some are critical for human health. In a new study, Cheryl Nickerson and her group at ASU’s Biodesign Institute, in collaboration with an international team* including Tom Van de Wiele and lead author Rosemarie De Weirdt at Ghent University, Belgium, explore the role of Lactobacillus reuteri—a natural resident of the human gut—in protecting against foodborne infection. Their results demonstrate that this beneficial or probiotic organism, which produces an antimicrobial substance known as reuterin, may protect intestinal epithelial cells from infection by the foodborne bacterial pathogen Salmonella. The study examines for the first time the effect of reuterin during the infection process of mammalian intestinal cells and suggests the efficacy of using probiotic bacteria or their derivatives in future therapies aimed at thwarting Salmonella infection. Members of the Nickerson lab at the Biodesign Institute’s Center for Infectious Diseases and Vaccinology involved in this study were Shameema Sarker and Aurélie Crabbé. Results of the new study recently appeared in the journal PLoS ONE.
Cell cultures: now in 3-D!
Over the past decade, the Nickerson group and their colleagues have developed organotypic three-dimensional (3-D) tissue culture models of the small and large intestine, lung, placenta, bladder, neuronal tissue and vaginal epithelium that mimic key characteristics of the parental tissue, and applied them to study the infectious disease process. Such models offer exciting new insights into host-pathogen interactions, cell proliferation, differentiation and immune function, and are providing a platform to understand normal tissue homeostasis and transition to disease. For the current study, 3-D colon epithelial cells were used. Nickerson explains that cells derived for study through this technique more faithfully approximate key in vivo responses to S. Typhimurium infection, compared with the traditional monolayer methods, making such cells an ideal model to observe infection processes.
3-D cell culture models are cultured in a special environment within a device known as a Rotating Wall Vessel bioreactor—a cylindrical, rotating apparatus filled with a culture medium supplying essential nutrients, oxygen and physical forces to the cells. Within the reactor, the natural sedimentation of cells due to gravity is balanced by the bioreactor’s rotation, resulting in a gentle tumbling of cells within the media in the chamber. During the culturing phase, cells attach themselves to tiny porous beads, termed microcarriers, or other scaffolding. Under these conditions, cells are able to respond to molecular and chemical gradients in three dimensions in a way that approximates their behavior under in vivo conditions, causing the cells to aggregate based on natural cellular affinities and form 3-D tissue-like structures.
“In previous studies, we applied our 3-D intestinal cell cultures as human surrogates to further our understanding of how Salmonella interacts with the intestinal epithelium to cause gastrointestinal disease,” Nickerson explains. “We found that these models were able to respond to infection in key ways that mimicked the parental tissue in vivo and which conventional models could not recapitulate. We are excited to advance the use of our 3-D models in the current work to study how commensal intestinal microbes and their products can protect against Salmonella-induced foodborne infection. The results of this study may provide fundamental knowledge for development of new probiotics and other functional food based strategies.”
Bacterial Blizzard
A swarm of some hundred trillion bacteria occupies the human body, outnumbering human cells by about 10 to 1. Among these are members of the genus Lactobacillus, some of which have been associated with therapeutic, probiotic properties, including anti-inflammatory and anti-cancer activity. The current study zeros in on Lactobacillus reuteri—one of the more than 180 species of Lactobacillus. The group investigated the potential of this bacterium to inhibit the early stages of Salmonella infection, seeking to identify plausible mechanisms for such inhibitory effects. Intestinal infections by non-typhoidal Salmonella strains induce diarrhea and gastroenteritis, and remain a leading source of foodborne illness worldwide. Such infections are acutely unpleasant but self-limiting in healthy individuals. For those with compromised immunity, however, they can be deadly, and the alarming incidence of multi-drug resistant Salmonella strains has underlined the necessity of more effective therapeutics. The use of benign microorganisms offers a promising new approach to treating infection from pathogens like Salmonella and indeed, L. reuteri has been shown to help protect against gastrointestinal infection and reduce diarrhea in children.
Safeguarding cells
The origin of L. reuteri’s protective role still remains unclear, and the present study investigated whether reuterin, a metabolite produced by L. reuteri during the process of reducing glycerol in the gut, could be one of the keys to protection. While it has been speculated that reuterin acts by regulating immune responses or competing with Salmonella for key binding sites, the current study represents the first in vitro examination of host-pathogen interactions using human intestinal epithelium in the presence of reuterin-producing L. reuteri. Two approaches were used to study host-pathogen interactions. In the first, 3-D intestinal epithelial cell aggregates were seeded into 24-well plates. Salmonella was added to these intestinal cells along with supernatant of L. reuteri—that is, cell-free culture medium in which the Lactobacillus grew and produced reuterin (obtained by filtering out the bacteria). In the second approach, L. reuteri was first allowed to produce reuterin in the presence of the 3-D colon cells (seeded into the wells), after which the cells were exposed to Salmonella. Here, the L. reuteri bacteria (in the presence of glycerol) produced reuterin in situ. In both approaches, non-reuterin exposed controls were also tested, and the effect of reuterin on a Salmonella population in the absence of host cells was assessed as well.
L. reuteri regulates response to infection
The results showed a reduction in the Salmonella population (without host cells) after one hour of exposure to a diluted supernatant containing reuterin. Further, the reuterin-containing ferment of L. reuteri was shown to significantly reduce Salmonella adhesion, invasion and intracellular survival in 3-D colon cells, compared with an untreated control. In an unexpected twist, the application of L. reuteri supernatant lacking glycerol actually stimulated adhesion, invasion and intracellular survival of Salmonella. The authors speculate that the stimulatory effect observed may have been due to low concentrations of acetic acid, previously shown to stimulate expression of Salmonella virulence-related genes.
Applying the second approach, live L. reuteri were incubated with 3-D epithelial cells and the medium supplemented with glycerol, allowing for in situ production of reuterin. The presence of L. reuteri was shown to reduce the population of Salmonella by diminishing their capacity for adhesion, invasion and intracellular survival, and this effect increased when L. reuteri were producing reuterin. Another interesting detail uncovered in the study is that the effects of reuterin on Salmonella’s infectious capacity are increased in the presence of host cells, suggesting that some type of synergistic protection occurs during epithelial infection, potentially involving the combined activity of reuterin and host cell gene-related responses. Prolonged exposure (of 24 hours or more) to the reuterin-containing supernatant solutions caused a loss of viability in host cells, though shorter exposure times did not appear to adversely affect them. Importantly, the introduction of L. reuteri strains in vivo has been safely carried out in infants and even immuno-compromised adults, indicating that other cell types, host factors or the complex gut microbiota in vivo could counteract the observed cytotoxic effects of reuterin in vitro.
While the authors stress that much work remains, particularly in terms of understanding reuterin’s role in the context of a complex gut microbiome, the results are encouraging and suggest a new avenue for fighting Salmonella infection, through the process of glycerol conversion to reuterin by L. reuteri.
*The international research team for this project also included Stefan Roos, Department of Microbiology, Uppsala BioCenter, Swedish University of Agricultural Sciences, Uppsala, Sweden, Sabine Vollenweider & Christophe Lacroix, of the Institute of Food, Nutrition and Health, ETH Zurich, Zurich, Switzerland, and Jan Peter van Pijkeren & Robert A. Britton, Department of Microbiology & Molecular Genetics, Michigan State University, East Lansing, Michigan.
Written by Richard Harth, Science Writer, The Biodesign Institute (richard.harth@asu.edu)
Vitamin D may protect against lung function impairment and decline in smokers
Vitamin D deficiency is associated with worse lung function and more rapid decline in lung function over time in smokers, suggesting that vitamin D may have a protective effect against the effects of smoking on lung function, according to a new study from researchers in Boston.
“We examined the relationship between vitamin D deficiency, smoking, lung function, and the rate of lung function decline over a 20 year period in a cohort of 626 adult white men from the Normative Aging Study,” said lead author Nancy E. Lange, MD, MPH, of the Channing Laboratory, Brigham and Women’s Hospital. “We found that vitamin D sufficiency (defined as serum vitamin D levels of >20 ng/ml) had a protective effect on lung function and the rate of lung function decline in smokers.”
The findings were published online ahead of print publication in the American Thoracic Society’s American Journal of Respiratory and Critical Care Medicine.
In the study, vitamin D levels were assessed at three different time points between 1984 and 2003, and lung function was assessed concurrently with spirometry.
In vitamin D deficient subjects, for each one unit increase in pack-years of smoking, mean forced expiratory volume in one second (FEV1) was 12 ml lower, compared with a mean reduction of 6.5 ml among subjects who were not vitamin D deficient. In longitudinal models, vitamin D deficiency exacerbated the effect of pack years of smoking on the decline in FEV1 over time.
No significant effect of vitamin D levels on lung function or lung function decline was observed in the overall study cohort, which included both smokers and non-smokers.
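To illustrate what this kind of effect modification means in practice, the sketch below encodes the reported per-pack-year differences as a simple linear interaction between smoking and vitamin D status. It is only a rough illustration under stated assumptions, not the authors’ statistical model: the slopes of -12 ml and -6.5 ml per pack-year come from the figures quoted above, while the 3,000 ml baseline FEV1 is a hypothetical placeholder.

```python
# Rough illustration of the reported effect modification; not the study's
# actual regression model. Slopes come from the figures quoted above; the
# 3,000 ml baseline FEV1 is a hypothetical placeholder.

def predicted_fev1_ml(pack_years: float, vitamin_d_deficient: bool,
                      baseline_ml: float = 3000.0) -> float:
    """Predicted FEV1 (ml) under a simple linear interaction model."""
    slope_per_pack_year = -12.0 if vitamin_d_deficient else -6.5
    return baseline_ml + slope_per_pack_year * pack_years

# Example: under these assumptions, a 20 pack-year history costs roughly
# twice as much FEV1 when vitamin D is deficient.
if __name__ == "__main__":
    for deficient in (False, True):
        status = "deficient" if deficient else "sufficient"
        print(f"vitamin D {status}: {predicted_fev1_ml(20, deficient):.0f} ml")
```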
“Our results suggest that vitamin D might modify the damaging effects of smoking on lung function,” said Dr. Lange. “These effects might be due to vitamin D’s anti-inflammatory and anti-oxidant properties.”
The study has some limitations, including that the data are observational rather than from a clinical trial, that vitamin D levels fluctuate over time, and that the findings have limited generalizability because the cohort consisted entirely of elderly men.
“If these results can be replicated in other studies, they could be of great public health importance,” said Dr. Lange. “Future research should also examine whether vitamin D protects against lung damage from other sources, such as air pollution.”
“While these results are intriguing, the health hazards associated with smoking far outweigh any protective effect that vitamin D may have on lung function,” said Alexander C. White MS, MD, chair of the American Thoracic Society’s Tobacco Action Committee. “First and foremost, patients who smoke should be fully informed about the health consequences of smoking and in addition be given all possible assistance to help them quit smoking.”
FDA Panel Members Express Opposing Views on Truvada Approval
In May, the FDA Antiviral Advisory Committee met to review evidence for the approval of two antiretroviral drugs, tenofovir and emtricitabine, for pre-exposure prophylaxis (PrEP) against HIV infection. Two members of the Committee explain why they voted for or against the approval.
Judith Feinberg, MD, Professor of Medicine at the University of Cincinnati College of Medicine and director of the University of Cincinnati AIDS Clinical Trials Unit, serves as chairperson of the FDA Antiviral Advisory Committee. She voted in favor of the approval. She explains that while the observation period for tenofovir-emtricitabine has been short, the outcomes and tolerability have been very good. Dr. Feinberg writes that PrEP is particularly important now that HIV is on the rise among young men who have sex with men. Over the past 30 years, HIV has killed 30 million people and infected 60 million. With hundreds of millions of people still at risk, and no vaccine available, PrEP may be the strategy that helps to turn the tide on HIV.
Lauren V. Wood, MD, Staff Clinician at the National Cancer Institute and Assistant Professor at the Uniformed Services University of the Health Sciences, is also a member of the FDA Antiviral Advisory Committee. Dr. Wood voted against the approval for several reasons. First, she did not find consistent evidence of the benefit of PrEP, especially in women. Second, she expressed concern about the low rate of adherence to the tenofovir-emtricitabine regimen; since adherence is crucial to efficacy, she could not support approval. Finally, Dr. Wood cited safety concerns about PrEP dosing, and since no long-term studies have been done, the potential for the emergence and spread of drug-resistant virus as a consequence of PrEP remains a concern.
High dietary antioxidant intake might cut pancreatic cancer risk
If link proves causal, 1 in 12 of these cancers might be prevented, say researchers
Increasing dietary intake of the antioxidant vitamins C, E, and selenium could help cut the risk of developing pancreatic cancer by up to two thirds, suggests research published online in the journal Gut.
If the association turns out to be causal, one in 12 of these cancers might be prevented, suggest the researchers, who are leading the Norfolk arm of the European Prospective Investigation of Cancer (EPIC) study.
Cancer of the pancreas kills more than a quarter of a million people every year around the world. Some 7,500 people are diagnosed with the disease every year in the UK, where it is the sixth most common cause of cancer death.
The disease has the worst prognosis of any cancer, with just 3% of people surviving beyond five years. Genes, smoking, and type 2 diabetes are all risk factors, but diet is also thought to have a role, and may explain why rates vary so much from country to country, say the authors.
The researchers tracked the health of more than 23,500 people aged 40 to 74 who had entered the Norfolk arm of the EPIC study between 1993 and 1997.
Each participant filled in a comprehensive food diary, detailing the types and amount of every food they ate for 7 days, as well as the methods they used to prepare it.
Each entry in the food diary was matched to one of 11,000 food items, and the nutrient values calculated using a specially designed computer programme (DINER).
Forty-nine people (55% men) developed pancreatic cancer within 10 years of entering the study. This increased to 86 (44% men) by 2010. On average, they survived six months after diagnosis.
The nutrient intakes of those diagnosed with the disease within 10 years of entering EPIC were compared with those of almost 4000 healthy people to see if there were any differences.
The analysis showed that a weekly selenium intake in the top 25% of consumption roughly halved the risk of developing pancreatic cancer compared with an intake in the bottom 25%.
And those whose vitamins C, E, and selenium intake was in the top 25% of consumption were 67% less likely to develop pancreatic cancer than those who were in the bottom 25%.
If the link turns out to be causal, that would add up to the prevention of more than one in 12 (8%) of pancreatic cancers, calculate the authors.
Antioxidants may neutralise the harmful by-products of metabolism and normal cell activity—free radicals—and curb genetically programmed influences, as well as stimulating the immune system response, explain the authors.
Other trials using antioxidant supplements have not produced such encouraging results, but this may be because food sources of these nutrients may behave differently from those found in supplements, they say.
“If a causal association is confirmed by reporting consistent findings from other epidemiological studies, then population based dietary recommendations may help to prevent pancreatic cancer,” they conclude.
Researchers develop ginseng-fortified milk to improve cognitive function
Possible market for new functional food reported in the Journal of Dairy Science
Amsterdam, The Netherlands, July 23, 2012 – American ginseng is reported to have neurocognitive effects, and research has shown benefits in aging, central nervous system disorders, and neurodegenerative diseases. The challenges of incorporating ginseng into food are twofold: it has a bitter taste, and food processing can eliminate its healthful benefits. Reporting in the August issue of the Journal of Dairy Science®, a group of scientists has formulated low-lactose functional milk that maintained beneficial levels of American ginseng after processing. An exploratory study found the product was readily accepted by a niche group of consumers.
“Our goal was to develop low-lactose milk that could be consumed by the elderly to improve cognitive function,” reports lead investigator S. Fiszman, PhD, of the Instituto de Agroquimica y Tecnologia de Alimentos (IATA), Consejo Superior de Investigaciones Cientificas (CSIC), Paterna (Valencia), Spain. “Consumers who were interested in the health benefits of ginseng rated our product quite highly.”
Because older people frequently have trouble digesting milk products, the researchers developed a low-lactose formula. American ginseng was added, and then the milk was sterilized by ultra-high temperature processing (UHT), which prolongs shelf life. Analysis found that sufficient levels of ginseng remained in the milk after treatment to improve cognitive function as reported in the literature.
To reduce the bitter taste of American ginseng, the investigators developed samples with vanilla extract and sucralose, a zero-calorie artificial sweetener. In a preliminary study, 10 tasters with a good ability to discriminate between flavors compared low-lactose UHT milk without any additives (the control) to low-lactose milk with ginseng extract, vanilla aroma, and sucralose added before UHT treatment. They developed a list of 10 attributes that described the samples: color, sweet odor, milk flavor, vanilla flavor, metallic/root flavor, sweetness, bitterness, aftertaste, astringency, and viscosity. They then rated the intensity of each attribute for five samples: the control; the control with ginseng extract, vanilla aroma, and sucralose added; the control with ginseng extract added; the control with vanilla and ginseng extract added; and the low-lactose milk with ginseng extract, vanilla aroma, and sucralose added before UHT treatment.
In a second study, 100 participants were asked, on a scale of one to five, how willing they would be to consume a “highly digestible semi-skimmed milk,” and a “highly digestible semi-skimmed milk enriched with ginseng extract that would improve cognitive function.” Then, they tasted and rated, on a scale of one to nine, the overall acceptability of the control milk and the low lactose milk with ginseng extract, vanilla aroma, and sucralose added before UHT treatment.
Both the presence of ginseng and the thermal treatment affected some sensory properties of the milk. The addition of ginseng significantly increased the perceived light brown color in the flavored and unflavored samples, and was highest in the reduced-lactose milk with ingredients added before the UHT treatment. The sweet odor was more intense in flavored samples, but decreased slightly in the samples of milk with ingredients added before UHT treatment. Bitterness was clearly perceived in the samples containing ginseng additives, but was lower in flavored samples, indicating that the vanilla aroma and sucralose masked, to some extent, the bitter taste caused by ginseng extract.
Consumer responses varied greatly, depending on interest in the product. Of the participants, 78% indicated that they would be likely to consume the highly digestible milk, and after tasting the product, 87% of them indicated they would buy the sample. Forty-seven percent indicated they were not interested in milk enriched with ginseng, and after tasting it, they gave it a low acceptability rating. However, among the 32% of consumers who did express an interest in the product, 75% declared they would buy it.
“Drinking 150 to 300 mL of this ginseng-enriched milk would provide the amount indicated to be effective for improving cognitive functions. Combined with the low levels of lactose, this makes the drink an appropriate functional beverage for the elderly,” says Dr. Fiszman. “Among consumers more likely to consume ginseng products, the newly developed milk was well accepted. The addition of more congruent flavors, such as chocolate, citrus, or coffee, could be more effective in masking non-milk-related sensory attributes. Other alternatives could be investigated.”
Commenting on the studies, Susan Duncan, PhD, professor, Department of Food Science & Technology, Virginia Tech, noted, “With the combination of intrinsic health benefits in milk and these additional ingredients, milk becomes an easy way to deliver valuable functional ingredients and the functional benefits of milk components. Diversifying the product line for milk and dairy products has a number of benefits, including market and consumer visibility and perception.”
New study: Raisins as effective as sports chews for fueling workouts
FRESNO, Calif. (July 23, 2012) – New research published in the Journal of the International Society of Sports Nutrition suggests that eating raisins may provide the same workout boost as sports chews.
Conducted by researchers at the University of California-Davis, the study evaluated the effects that natural versus commercial carbohydrate supplements have on endurance running performance. Runners depleted their glycogen stores with an 80-minute run at 75% VO2 max, followed by a 5k time trial. Runners completed three randomized trials (raisins, chews and water only) separated by seven days. Findings included:
Those that ingested raisins or sports chews ran their 5k on average one minute faster than those that ingested only water
Eating raisins and sports chews promoted higher carbohydrate oxidation compared to water only
“Raisins are a great alternative to sport chews as they also provide fiber and micronutrients, such as potassium and iron, and they do not have any added sugar, artificial flavors or colors,” said James Painter, Ph.D., R.D., nutrition research advisor for the California Raisin Marketing Board. “As an added bonus, raisins are the most economical dried fruit according to the United States Department of Agriculture, so they are cost effective and convenient for use during exercise.”
Synthetic stimulants called ‘bath salts’ act in the brain like cocaine
CHAPEL HILL, NC – The synthetic stimulants collectively known as “bath salts” have gained popularity among recreational drug users over the last five years, largely because they were readily available via the Internet and at convenience stores and were virtually unregulated.
Recent studies point to compulsive drug taking among bath salts users, and several deaths have been blamed on the bath salt mephedrone (4-methylmethcathinone or “meow-meow”). This has led several countries to ban the production, possession, and sale of mephedrone and other cathinone derivative drugs.
In October 2011, the U.S. Drug Enforcement Administration placed mephedrone on Schedule I of the Controlled Substances Act for one year, pending further study. “Basically, the DEA was saying we don’t know enough about these drugs to know how potentially dangerous they could be, so we’re going to make them maximally restricted, gather more data, and then come to a more reasoned decision as to how we should classify these compounds,” said C.J. Malanga, MD, PhD, associate professor of neurology, pediatrics and psychology at the University of North Carolina School of Medicine. He is also a member of UNC’s Bowles Center for Alcohol Studies.
Now, results of a new study led by Malanga offer compelling evidence for the first time that mephedrone, like cocaine, does have potential for abuse and addiction. “The effects of mephedrone on the brain’s reward circuits are comparable to similar doses of cocaine,” he said. “As expected our research shows that mephedrone likely has significant abuse liability.”
A report of the study was published online on June 21, 2012, by the journal Behavioural Brain Research. The report’s first author, J. Elliott Robinson, an MD/PhD student at UNC, points out that mephedrone and other potentially addictive stimulants “inappropriately activate brain reward circuits that are involved in positive reinforcement. These play a role in the drug ‘high’ and compulsive drug taking.”
The study of laboratory mice used intracranial self-stimulation (ICSS), a technique developed in the 1950s that can measure a drug’s ability to activate reward circuits. In ICSS studies, animals are trained to perform a behavioral task (pressing a lever or a button with their nose or, as in this study, spinning a wheel) to receive a reward: direct stimulation of the brain pathways involved in reward perception.
During the study, adult animals were implanted with brain stimulating electrodes. Measures of their wheel spinning effort were made before, during and after they received various doses of either mephedrone or cocaine.
“One of the unique features of ICSS is that all drugs of abuse, regardless of how they work pharmacologically, do very similar things to ICSS: they make ICSS more rewarding,” Malanga said. “Animals work harder to get less of it [ICSS] when we give them these drugs.”
Indeed, as was expected, cocaine increased the ability of mice to be rewarded by self-stimulation. “And what we found, which is new, is that mephedrone does the same thing. It increases the rewarding potency of ICSS just like cocaine does.”
Malanga said the study supports the idea that mephedrone and drugs like it may have significant addiction potential, “and justifies the recent legislation to maintain maximum restriction to their access by the Food and Drug Administration.” On July 9 President Obama signed into law legislation passed by Congress to permanently ban the sale of bath salts in the U.S.
Diets high in salt could deplete calcium in the body: UAlberta research
When sodium leaves a body, it takes calcium along with it, creating risk for kidney stones and osteoporosis
The scientific community has always wanted to know why people who eat high-salt diets are prone to developing medical problems such as kidney stones and osteoporosis.
Medical researchers at the University of Alberta may have solved this puzzle through their work with animal lab models and cells.
Principal investigator Todd Alexander and his team recently discovered an important link between sodium and calcium. These both appear to be regulated by the same molecule in the body. When sodium intake becomes too high, the body gets rid of sodium via the urine, taking calcium with it, which depletes calcium stores in the body. High levels of calcium in the urine lead to the development of kidney stones, while inadequate levels of calcium in the body lead to thin bones and osteoporosis.
“When the body tries to get rid of sodium via the urine, our findings suggest the body also gets rid of calcium at the same time,” says Alexander, a Faculty of Medicine & Dentistry researcher whose findings were recently published in the peer-reviewed journal American Journal of Physiology – Renal Physiology.
“This is significant because we are eating more and more sodium in our diets, which means our bodies are getting rid of more and more calcium. Our findings reinforce why it is important to have a low-sodium diet and why it is important to have lower sodium levels in processed foods.”
It’s been known for a long time that this important molecule was responsible for sodium absorption in the body, but the discovery that it also plays a role in regulating calcium levels is new.
“We asked a simple question with our research – could sodium and calcium absorption be linked? And we discovered they are,” says Alexander.
“We found a molecule that seems to have two jobs – regulating the levels of both calcium and sodium in the body. Our findings provide very real biological evidence that this relationship between sodium and calcium is real and linked.”
In their research, the team worked with lab models that didn’t have this important molecule, so the models’ urine contained high levels of calcium. Because calcium was not absorbed and retained by the body, bones became thin.
A journal editorial written about this research discovery noted the molecule could be a drug target to one day “treat kidney stones and osteoporosis.”
The primary funder of the research was the Kidney Foundation of Canada.
“We are proud to support the research of Dr. Todd Alexander,” said Wim Wolfs, National Director of Research of The Kidney Foundation of Canada. “Data in the United States shows that nearly 10% of adults will have a kidney stone at least once in their life. The prevalence of kidney stones also seems to be increasing in the U.S., which may be attributed to high rates of obesity and diabetes, along with possibly increased salt intake.”
Recent research uncovers tick bite as the cause for a delayed allergic reaction to red meat
If you are a steak lover, enjoy your meat while you can. An article by Susan Wolver, MD, and Diane Sun, MD, from Virginia Commonwealth University in the US, and colleagues, explains why, if you have been bitten by a tick, you may develop an allergy to red meat. Their article¹ elucidates this connection and discusses the journey of the discovery. Their work appears online in the Journal of General Internal Medicine², published by Springer.
Delayed anaphylaxis – a severe, life-threatening allergic reaction – to meat is a new syndrome identified initially in the southeastern United States. Patients may wake up in the middle of the night with hives or anaphylaxis, usually three to six hours after having eaten red meat for dinner. Until recently, the link between red meat ingestion and anaphylaxis had remained elusive.
Wolver, Sun and colleagues’ analysis of three patient case studies sheds light on this reaction. It is thought to be caused by antibodies to a carbohydrate (alpha-gal) that are produced in a patient’s blood in response to a tick bite, specifically the Lone Star tick. This carbohydrate substance is also present in meat. When an individual who has been bitten by a tick eats the meat, his or her immune system activates the release of histamine* in response to the presence of alpha-gal, which can cause hives and anaphylaxis.
Significantly, meat-induced anaphylaxis is the first food-induced severe allergic reaction due to a carbohydrate rather than a protein. It is also the first time anaphylaxis has been noted to be delayed rather than occurring immediately after exposure.
The authors conclude: “Where ticks are endemic, for example in the southeastern United States, clinicians should be aware of this new syndrome when presented with a case of anaphylaxis. Current guidance is to counsel patients to avoid all mammalian meat – beef, pork, lamb and venison.”
*A compound found in mammalian tissues that causes dilatation of capillaries, contraction of smooth muscle, and stimulation of gastric acid secretion, and that is released during allergic reactions.
Reference:
1. Wolver SE et al (2012). A peculiar case of anaphylaxis: no more steak? The journey to discovery of a newly recognized allergy to galactose-α-1,3-galactose found in mammalian meat. Journal of General Internal Medicine; DOI 10.1007/s11606-012-2144-z
2. The Journal of General Internal Medicine is the official journal of the Society of General Internal Medicine
Yoga reduces stress; now it’s known why
UCLA study helps caregivers of people with dementia
Six months ago, researchers at UCLA published a study that showed using a specific type of yoga to engage in a brief, simple daily meditation reduced the stress levels of people who care for those stricken by Alzheimer’s and dementia. Now they know why.
As previously reported, practicing a certain form of chanting yogic meditation for just 12 minutes daily for eight weeks led to a reduction in the biological mechanisms responsible for an increase in the immune system’s inflammation response. Inflammation, if constantly activated, can contribute to a multitude of chronic health problems.
Reporting in the current online edition of the journal Psychoneuroendocrinology, Dr. Helen Lavretsky, senior author and a professor of psychiatry at the UCLA Semel Institute for Neuroscience and Human Behavior, and colleagues found in their work with 45 family dementia caregivers that 68 of their genes responded differently after Kirtan Kriya Meditation (KKM), resulting in reduced inflammation.
Caregivers are the unsung heroes for their yeoman’s work in taking care of loved ones who have been stricken with Alzheimer’s and other forms of dementia, said Lavretsky, who also directs UCLA’s Late-Life Depression, Stress and Wellness Research Program. But caring for a frail or demented family member can be a significant life stressor. Older adult caregivers report higher levels of stress and depression and lower levels of satisfaction and vigor in life in general. Moreover, caregivers show higher levels of the biological markers of inflammation. Family members in particular are often considered to be at risk of stress-related disease and general health decline.
As the U.S. population continues to age over the next two decades, Lavretsky noted, the prevalence of dementia and the number of family caregivers who provide support to these loved ones will increase dramatically. Currently, at least five million Americans provide care for someone with dementia.
“We know that chronic stress places caregivers at a higher risk for developing depression,” she said. “On average, the incidence and prevalence of clinical depression in family dementia caregivers approaches 50 percent. Caregivers are also twice as likely to report high levels of emotional distress.” What’s more, many caregivers tend to be older themselves, leading to what Lavretsky calls an “impaired resilience” to stress and an increased rate of cardiovascular disease and mortality.
Research has suggested for some time that psychosocial interventions like meditation reduce the adverse effects of caregiver stress on physical and mental health. However, the pathways by which such psychosocial interventions impact biological processes are poorly understood.
In the study, the participants were randomized into two groups. The meditation group was taught the 12-minute yogic practice that included Kirtan Kriya, which was performed every day at the same time for eight weeks. The other group was asked to relax in a quiet place with their eyes closed while listening to instrumental music on a relaxation CD, also for 12 minutes daily for eight weeks. Blood samples were taken at the beginning of the study and again at the end of the eight weeks.
“The goal of the study was to determine if meditation might alter the activity of inflammatory and antiviral proteins that shape immune cell gene expression,” said Lavretsky. “Our analysis showed a reduced activity of those proteins linked directly to increased inflammation.
“This is encouraging news. Caregivers often don’t have the time, energy, or contacts that could bring them a little relief from the stress of taking care of a loved one with dementia, so practicing a brief form of yogic meditation, which is easy to learn, is a useful tool.”
Lavretsky is a member of UCLA’s recently launched Alzheimer’s and Dementia Care Program, which provides comprehensive, coordinated care as well as resources and support to patients and their caregivers. Lavretsky has incorporated yoga practice into the caregiver program.
How a low-protein diet predisposes offspring to adulthood hypertension
Studies have shown that the offspring of mothers on a low-protein diet are more likely to develop hypertension as adults. Now, Drs. Gao, Yallampalli, and Yallampalli of the University of Texas Medical Branch at Galveston report that in rats, the high maternal testosterone levels associated with a low-protein diet are caused by reduced activity of an enzyme that inactivates testosterone, allowing more testosterone to reach the fetus and increase the offspring’s susceptibility to adulthood hypertension.
Fetal programming is a term used to describe the impact of maternal stress on an unborn child’s physical characteristics at birth, as well as its long-term health. The placenta is thought to be a major contributor to fetal programming due to its critical roles in hormone production and nutrient transport, as well as its susceptibility to environmental disruptions.
Recently, a study found that protein restriction doubles the plasma testosterone levels in pregnant rats. Elevated testosterone levels are associated with pregnancy-related complications such as preeclampsia and polycystic ovarian syndrome in humans, and emerging evidence suggests that testosterone may play a role in fetal programming of hypertension.
Gao et al. hypothesized that the increased testosterone levels were caused either by increased activity of an enzyme that produces testosterone or by decreased activity of an enzyme that reduces testosterone, specifically Hsd17b2, which converts testosterone to a less potent androgen, androstenedione.
The team found that Hsd17b2 expression in rats was affected by protein restriction in two parts of the placenta. It was increased in the junctional zone, which is responsible for hormone production, but was reduced in the labyrinth zone, which is essential for nutrient transport from mother to fetus and also acts as a protective barrier.
Based on this novel finding, Gao et al. propose that the reduction in Hsd17b2 expression in the protective labyrinth zone may allow more testosterone to reach the fetus and play a role in fetal programming of hypertension.
The finding that Hsd17b2 was the only enzyme involved in testosterone production or metabolism to be affected by gestational protein restriction suggests an important role for Hsd17b2 in regulating testosterone levels at the maternal-fetal interface; further research is needed to determine its exact functions.
Published clinical trial demonstrates efficacy of Sea-Band® for migraine-related nausea
Study shows pressure applied to the acupoint PC6 Neiguan using Sea-Band effective at controlling nausea during migraine
Newport, R.I., July 25, 2012 – Migraine can be a disabling neurological disorder, often aggravated by accompanying nausea. Stimulation of the acupoint PC6 Neiguan, an approach to controlling nausea adopted by traditional Chinese medicine, has never been documented by published clinical studies in medical literature for the control of migraine-related nausea, until now. Published in the May 2012 Neurological Sciences (journal of the Italian Neurological Society)*, “Acupressure in the control of migraine-associated nausea” is a clinical trial demonstrating that continuous stimulation of the acupoint PC6 Neiguan on the inner wrist, as provided by Sea-Band® wristbands, showed statistically significant improvement in migraine-related nausea versus not using the wristbands.
Previous studies have demonstrated the efficacy of Sea-Band and its continual stimulation of the acupressure point PC6 for nausea relief due to motion sickness, postoperative nausea and chemotherapy-induced nausea. However, this May 2012 study is the first published research aimed at verifying that pressure applied to the acupoint PC6 with Sea-Band is effective at relieving nausea during migraine.
Migraine affects more than 29.5 million Americans, according to the National Headache Foundation, and is considered by the World Health Organization as the 19th leading cause of all years lived with disability for both males and females. Eight out of every 10 people in the U.S. who are diagnosed with migraine report experiencing nausea.
“This new research supports what millions of Sea-Band users already know; that the acupressure wristbands provide fast, consistent, drug-free relief for nausea associated with migraine headaches,” commented Leonard Nihan, president, Sea-Band U.S. “We’re pleased to have this published study to reinforce Sea-Band’s efficacy to the scientific and medical communities.”
The Italian study included 40 female patients suffering from migraine without aura in whom nausea was always present as an accompanying symptom of their migraine. The patients were treated randomly for a total of six migraine attacks: three with the application of Sea-Band wristbands, which apply continual pressure to the PC6 acupoint (phase SB), and three without them (phase C).
The intensity of nausea was evaluated on a scale from zero to 10 at onset and at 30, 60, 120 and 240 minutes. The values were always significantly lower in phase SB than in phase C. The number of patients who reported at least a 50 percent reduction in the nausea score was also significantly higher in phase SB than in phase C at 30, 60 and 120 minutes. The average nausea scores in the SB phase dropped from 6.36 ± 0.35 at onset (T0) to 4.60 ± 0.39 at 30 minutes (T1), 3.11 ± 0.40 at 60 minutes (T2), 1.88 ± 0.31 at 120 minutes (T3) and 0.92 ± 0.22 at 240 minutes (T4). At each time step taken into consideration after the application of the Sea-Band wristbands, there was a statistically significant improvement over the non-treated phases. Moreover, there was a high percentage of responders to the treatment: 46.8 percent at 60 minutes, 71.8 percent at 120 minutes and 84.3 percent at 240 minutes, with a consistent response over time.
Do ovaries continue to produce eggs during adulthood?
A compelling new genetic study tracing the origins of immature egg cells, or ‘oocytes’, from the embryonic period throughout adulthood adds new information to a growing controversy
The notion of a “biological clock” in women arises from the fact that oocytes progressively decline in number as females get older, along with a decades-old dogmatic view that oocytes cannot be renewed in mammals after birth. After careful assessment of data from a recent study published in PLoS Genetics, scientists from Massachusetts General Hospital and the University of Edinburgh argue that the findings support formation of new eggs during adult life, a topic that has been historically controversial and has sparked considerable debate in recent years.
Eggs are formed from progenitor germ cells that exit the mitotic cycle, thereby ending their ability to proliferate through cell division, and subsequently enter meiosis, a process unique to the formation of eggs and sperm which removes one half of the genetic material from each type of cell prior to fertilization.
While traditional thinking has held that female mammals are born with all of the eggs they will ever have, newer research has demonstrated that adult mouse and human ovaries contain a rare population of progenitor germ cells, called oogonial stem cells, capable of dividing and generating new oocytes. Using a powerful new genetic tool that traces the number of divisions a cell has undergone with age (its ‘depth’), Shapiro and colleagues counted the number of times progenitor germ cells divided before becoming oocytes; their study was published in PLoS Genetics in February this year.
If traditional thinking held true, all divisions would have occurred prior to birth, and thus all oocytes would exhibit the same depth regardless of age. However, the opposite was found – eggs showed a progressive increase in depth as the female mice grew older.
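To make the logic of the depth readout concrete, here is a purely illustrative toy simulation in Python. It is not the authors’ genetic tool, and the parameter values (ten prenatal divisions, a 50 percent chance of one extra progenitor division per month) are arbitrary assumptions chosen only to show how the two scenarios separate.

import random

def simulate_oocyte_depths(age_months, ongoing_renewal, n_oocytes=1000,
                           prenatal_divisions=10, p_division_per_month=0.5):
    """Toy model of the 'depth' readout (illustrative only).

    Depth = number of divisions in an oocyte's lineage. If the oocyte pool is
    fixed at birth, every oocyte carries only the prenatal divisions, whatever
    the animal's age. If progenitor germ cells keep dividing after birth,
    oocytes sampled at older ages tend to carry extra divisions, so mean
    depth rises with age.
    """
    depths = []
    for _ in range(n_oocytes):
        depth = prenatal_divisions
        if ongoing_renewal:
            depth += sum(1 for _ in range(age_months)
                         if random.random() < p_division_per_month)
        depths.append(depth)
    return depths

for age in (2, 8, 14):  # mouse ages in months
    fixed = simulate_oocyte_depths(age, ongoing_renewal=False)
    renewed = simulate_oocyte_depths(age, ongoing_renewal=True)
    print(f"age {age} mo: mean depth {sum(fixed)/len(fixed):.1f} (fixed pool) "
          f"vs {sum(renewed)/len(renewed):.1f} (ongoing renewal)")

Under the fixed-pool scenario the mean depth stays at the prenatal value regardless of age, whereas under ongoing renewal it climbs with age, which is the qualitative pattern Shapiro and colleagues observed.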
In their assessment of the work by Shapiro and colleagues – published recently in a PLoS Genetics Perspective article – reproductive biologists Dori Woods, Evelyn Telfer and Jonathan Tilly conclude that the most plausible explanation for these findings is that progenitor germ cells in ovaries continue to divide throughout reproductive life, resulting in production of new oocytes with greater depth as animals age.
Although these investigations were performed in mice, there is emerging evidence that oogonial stem cells are also present in the ovaries of reproductive-age women, and these cells possess the capacity, like their mouse counterparts, to generate new oocytes under certain experimental conditions. While more work is needed to settle the debate over the significance of oocyte renewal in adult mammals, Woods and colleagues emphasize that “the recent work of Shapiro and colleagues is one of the first reports to offer experimental data consistent with a role for postnatal oocyte renewal in contributing to the reserve of ovarian follicles available for use in adult females as they age.”
Yoga may help stroke survivors improve balance
Group yoga can improve balance in stroke survivors who no longer receive rehabilitative care, according to new research in the American Heart Association journal Stroke.
In a small pilot study, researchers tested the potential benefits of yoga among chronic stroke survivors — those whose stroke occurred more than six months earlier.
“For people with chronic stroke, something like yoga in a group environment is cost effective and appears to improve motor function and balance,” said Arlene Schmid, Ph.D., O.T.R., lead researcher and a rehabilitation research scientist at the Roudebush Veterans Administration Medical Center and the Indiana University Department of Occupational Therapy in Indianapolis, Ind.
The study’s 47 participants, about three-quarters of them male veterans, were divided into three groups: twice-weekly group yoga for eight weeks; a “yoga-plus” group, which met twice weekly and had a relaxation recording to use at least three times a week; and a usual medical care group that did no rehabilitation.
The yoga classes, taught by a registered yoga therapist, included modified yoga postures, relaxation, and meditation. Classes grew more challenging each week.
Compared with patients in the usual-care group, those who completed yoga or yoga-plus significantly improved their balance.
Balance problems frequently last long after a person suffers a stroke, and are related to greater disability and a higher risk of falls, researchers said.
Furthermore, survivors in the yoga groups had improved scores for independence and quality of life and were less afraid of falling.
“For chronic stroke patients, even if they remain disabled, natural recovery and acute rehabilitation therapy typically ends after six months, or maybe a year,” said Schmid, who is also an assistant professor of occupational therapy at Indiana University-Purdue University in Indianapolis and an investigator at the Regenstrief Institute.
Improvements after the six-month window can take longer to occur, she said, “but we know for a fact that the brain still can change. The problem is the healthcare system is not necessarily willing to pay for that change. The study demonstrated that with some assistance, even chronic stroke patients with significant paralysis on one side can manage to do modified yoga poses.”
The oldest patient in the study was in his 90s. All participants had to be able to stand on their own at the study’s outset.
Yoga may be more therapeutic than traditional exercise because the combination of postures, breathing and meditation may produce different effects than simple exercise, researchers said.
“However, stroke patients looking for such help might have a hard time finding qualified yoga therapists to work with,” Schmid said. “Some occupational and physical therapists are integrating yoga into their practice, even though there’s scant evidence at this point to support its effectiveness.”
Researchers can draw only limited conclusions from the study because of its small number of participants and lack of diversity. The study also didn’t have enough participants to uncover differences between the two yoga groups. The scientists hope to conduct a larger study soon.
Researchers also noticed improvements in the mindset of patients about their disability. The participants talked about walking through a grocery store instead of using an assistive scooter, being able to take a shower and feeling inspired to visit friends.
“It has to do with the confidence of being more mobile,” Schmid said. Although they took time to unfold, “these were very meaningful changes in life for people.”
Men with prostate cancer more likely to die from other causes
Study suggests prostate cancer management should emphasize healthy lifestyle changes
Boston, MA – Men diagnosed with prostate cancer are less likely to die from the disease than from largely preventable conditions such as heart disease, according to a new study from Harvard School of Public Health (HSPH). It is the largest study to date that looks at causes of death among men with prostate cancer, and suggests that encouraging healthy lifestyle changes should play an important role in prostate cancer management.
“Our results are relevant for several million men living with prostate cancer in the United States,” said first author Mara Epstein, a postdoctoral researcher at HSPH. “We hope this study will encourage physicians to use a prostate cancer diagnosis as a teachable moment to encourage a healthier lifestyle, which could improve the overall health of men with prostate cancer, increasing both the duration and quality of their life.”
The study was published online ahead of print on July 25, 2012 in the Journal of the National Cancer Institute.
Prostate cancer is the most frequently diagnosed form of cancer, affecting one in six men during their lifetime. While incidence of prostate cancer has greatly increased in the United States, Sweden, and other Western countries in recent decades, the likelihood that a newly diagnosed man in these countries will die from the disease has declined. The researchers attribute this to the widespread use of the prostate-specific antigen (PSA) test, which has resulted in a higher proportion of men diagnosed with lower-risk forms of the disease.
The researchers examined causes of death among prostate cancer cases recorded in the U.S. Surveillance, Epidemiology, and End Results Program (over 490,000 men from 1973 to 2008) and the nationwide Swedish Cancer and Cause of Death registries (over 210,000 men from 1961 to 2008).
The results showed that, during the study period, prostate cancer accounted for 52% of all reported deaths among Swedish men with prostate cancer and 30% of all reported deaths among U.S. men with prostate cancer; however, only 35% of Swedish men and 16% of U.S. men diagnosed with prostate cancer died from the disease (the two figures differ because many diagnosed men were still alive at the end of follow-up). In both populations, the risk of prostate cancer-specific death declined, while the risk of death from heart disease and from cancers other than prostate cancer remained constant. The five-year cumulative incidence of death from prostate cancer was 29% in Sweden and 11% in the United States.
Death rates from prostate cancer varied by age and calendar year of diagnosis, with the highest number of deaths from the disease among men diagnosed at older ages and those diagnosed in the earlier years of the surveys (especially in the years before the introduction of PSA screening).
“Our study shows that lifestyle changes such as losing weight, increasing physical activity, and quitting smoking, may indeed have a greater impact on patients’ survival than the treatment they receive for their prostate cancer,” said senior author Hans-Olov Adami, professor of epidemiology at HSPH.
Repetitious, Time-Intensive Magical Rituals Considered More Effective, Study Shows
July 26, 2012
AUSTIN, Texas — Even in this modern age of science, people are likely to find logic in supernatural rituals that require a high degree of time and effort, according to new research from The University of Texas at Austin.
The study, published in the June issue of Cognition, is the first psychological analysis of how people of various cultures evaluate the efficacy of ritual beliefs. The findings provide new insight into cognitive reasoning processes — and how people intuitively make sense out of the unknown.
“One of the most remarkable characteristics of human cognition is the capacity to use supernatural reasoning to explain the world around us,” said Cristine Legare, an assistant professor in the Department of Psychology at The University of Texas at Austin. “We argue that the characteristics of ritual are the product of an evolved cognitive system.”
Cause-and-effect thinking is critical to human survival, Legare said. So it is natural for people to find logic in supernatural rituals that emphasize repetition and procedural steps: if doing something once has some effect, the intuition goes, then repeating it must have a greater effect. For example, if a mechanic says he inspected something five times, the frequency of his actions leads the customer to overestimate the effectiveness of his work.
To find out how people rate the effectiveness of magical rituals, Legare and graduate student André Souza conducted a study in Brazil, a country suffused with rituals called simpatias. Used for solving problems as varied as quitting smoking, curing asthma and warding off bad luck, simpatias are formulaic rituals that involve various steps and repetition.
The psychologists presented 162 Brazilian respondents with several versions of these rituals. Each was modified with different characteristics, such as repetition of procedures, number of steps, number of items used, and the presence of religious icons.
As part of the study, Legare asked the respondents to rate the effectiveness of each ritual. According to the findings, three elements of the simpatias had the biggest influence: the number of steps, repetition of procedures, and the specification of a time period.
To see how magical rituals are perceived across cultures, the researchers conducted the same study with 68 U.S. respondents of various religious and socioeconomic backgrounds. As the researchers expected, the majority of respondents did not believe in simpatias. Yet, like the Brazilians, they were more inclined to believe in rituals involving numerous repetitions and steps. For example, they gave a higher rating to the following sadness-curing ritual, which involves numerous steps and repetitions:
In a metal container, put the leaves of a white rose. After that, set fire to the leaves. Get the remaining ash from the leaves and put it in a small plastic bag. Take the small plastic bag and leave it at a crossroad. Repeat the procedure for seven days in a row.
Though simpatias are primarily practiced in Brazil, magical rituals and other superstitions are widely accepted in the United States. Findings from the study provide further insight into how people find logic in the supernatural, regardless of concrete evidence.
These reports are done with appreciation of all the Doctors, Scientists, and other Medical Researchers who sacrificed their time and effort in order to give people the ability to empower themselves, without the base aspirations for fame or fortune. Just honorable people, doing honorable things.