Tossing back one to three energy drinks may result in more than just a buzz. A small meta-analysis found that immediately afterward, subjects had increased systolic blood pressure and, more troubling, an average 10-msec prolongation in their QT interval [1].
The study, by Dr Sachin Shah (University of the Pacific, Stockton, CA) and colleagues, was presented at EPI-NPAM 2013, the Epidemiology and Prevention/Nutrition, Physical Activity and Metabolism 2013 Scientific Sessions.
"The blood-pressure finding falls in line with what we would suspect because of the caffeine content," Shah told heartwire . "The QT prolongation that we are seeing--I was very surprised with that. It's a bit of a wake-up call for us investigators to start studying it a bit more thoroughly, and it needs to happen sooner rather than later."
The group aimed to see how energy drinks affect heart health, given that these drinks, along with dietary supplements, are not regulated as stringently as new drugs that must meet Food and Drug Administration (FDA) safety requirements, Shah said.
In a literature search, they identified seven observational and interventional trials that evaluated the impact of energy drinks on QT interval, blood pressure, and heart rate.
Three studies with a pooled sample of 93 subjects had QT/QTc data. Six studies with a pooled sample of 132 subjects had blood-pressure data, and seven studies investigated heart rate.
The patients, who were all young (aged 18 to 45) and healthy, underwent ECG and blood-pressure testing before and just after drinking one to three cans of energy drink--most commonly Red Bull, but also others such as Full Throttle and Meltdown RTD. An 8.4-oz can of Red Bull contains 80 mg of caffeine, compared with 35 mg of caffeine in a 12-oz Coke or about 100 mg of caffeine in an average cup of coffee, Shah said.
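As a rough way to compare those figures, here is a small back-of-envelope sketch of caffeine concentration per ounce; the 8-oz cup size for coffee is my assumption, since the article quotes only the caffeine content:

```python
# Back-of-envelope caffeine concentration, using the figures quoted above.
# The 8-oz serving size for "an average cup of coffee" is an assumption;
# the article gives only the caffeine amount.
drinks = {
    "Red Bull (8.4 oz can)": (80, 8.4),
    "Coke (12 oz can)": (35, 12.0),
    "Coffee (assumed 8 oz cup)": (100, 8.0),
}

for name, (caffeine_mg, volume_oz) in drinks.items():
    print(f"{name}: {caffeine_mg / volume_oz:.1f} mg caffeine per fluid ounce")
```

On that reading, an energy drink delivers roughly three times the caffeine per ounce of a cola, which is one reason a few cans add up quickly.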
Shortly after drinking the energy drinks, the pooled subjects had a systolic blood pressure increase of an average 3.5 mm Hg. "If people are drinking energy drinks every day, that change in blood pressure could be very significant," Shah noted, adding that, as reported by heartwire, research on torcetrapib was terminated because of a similar 3-mm-Hg increase in blood pressure.
People who don't normally drink coffee might have a heightened blood-pressure response to an energy drink, he added.
In a clinical setting, physicians are usually concerned if a patient has a QT-interval increase of about 30 msec from baseline, Shah noted. He acknowledged that this was a small study, but it did uncover a disturbing signal that needs to be investigated further.
Diastolic BP and heart rate increased nonsignificantly.
Although the 10-msec prolongation in the QT interval is a "small number, if it were consistently produced by a drug being considered by the FDA, the FDA would require more testing to make sure that there was not a liability for producing further, more serious, and life-threatening prolongation of the QT interval and associated arrhythmias," AHA spokesperson and past president Dr Gordon F Tomaselli (Johns Hopkins University, Baltimore, MD) commented.
Both Shah and Tomaselli pointed out that people who are older or who have underlying CVD might have even more heart-related side effects from energy drinks than the young, healthy people in this meta-analysis.

Wednesday, 3 April 2013
Soy might improve lung cancer survival in women
The consumption of soy food before a lung cancer diagnosis might favorably affect disease outcomes in women, according to a longitudinal follow-up study from China.
In the Shanghai Women's Health Study, women with lung cancer who ate low levels of soy daily were at greater risk of dying from lung cancer than those who ate average or higher levels of the Asian food staple over their lifetimes.
Specifically, among the 301 women with lung cancer, those in the 10th percentile of soy intake had an 81% increased risk for death from lung cancer compared with those in the 50th percentile (the median). In contrast, women in the 90th percentile of intake had an 11% decreased risk for death compared with those in the 50th percentile (P for overall significance = .004).
These effect sizes were found after adjustment for a wide variety of factors, including tumor stage and treatment.
In short, soy might have a disease-modulating effect, say the authors, led by Gong Yang, MD, from Vanderbilt University Medical Center in Nashville, Tennessee.
The study results were published online today in the Journal of Clinical Oncology.
"Plant-derived estrogens, such as isoflavones found mainly in soy food, appear to act as natural selective estrogen-receptor modulators," they write.
Thus, soy could compete with a woman's endogenous estrogens in binding to estrogen receptors, the authors explain. Soy might occupy the receptors so they cannot be stimulated by the body's estrogens.
Dr. Yang and his coauthors point out that emerging evidence suggests that female sex hormones negatively affect lung cancer survival. The evidence includes well-publicized data from the Women's Health Initiative, which showed that estrogen-plus-progestin therapy significantly increased the risk for death from lung cancer.
"Soy may have a mechanism of action similar to drugs like tamoxifen," Jyoti Patel, MD, from Northwestern University in Chicago, Illinois, told Medscape Medical News.
"This is the first scientific evidence that soy has a favorable effect on lung cancer survival," said Dr. Patel, who is a spokesperson for the American Society of Clinical Oncology, and provided independent comment on the study.
"It's really exciting," she summarized.
This study is part of a varied body of research that is seeking "hormonal clues" to lung cancer, she explained.
In 2012, Dr. Yang and colleagues found, in the same 74,000-patient cohort of Shanghai women, an approximately 40% reduction in the risk for incident lung cancer associated with the high intake of soy food. Thus, eating soy could both prevent and alter the disease course of lung cancer.
In the current study, the amount of soy consumed by the 301 women with lung cancer that was most protective (>20 g/day) was "not astronomical," said Dr. Patel. "But you have to make a concerted effort [to eat that much]," she added, referring especially to Westerners, who are less likely to include soy in their diet.
Americans typically consume what the study deemed to be a low level of soy (≤12 g/day), said Dr. Patel, who explained that the study results are "likely more applicable" in Asian countries.
More epidemiologic studies are needed to confirm these results, the authors note.
Almost all were non-smokers
There were actually 440 cases of lung cancer in the 74,000-women Shanghai cohort, but only 301 had information about both tumor stage and treatment. The hazard ratios for the 440 women were less dramatic than those for the 301 women reported above.
For example, in the 440 women with lung cancer, there was a 42% increased risk for death from lung cancer in the women who were in the 10th percentile of soy intake, compared with those in the 50th percentile (the median). In the 301 women, that increase was much larger (81%).
Nevertheless, although the effect sizes varied, the evidence in support of a possible protective effect of soy was constant.
As expected in an Asian population, most of the women with lung cancer (about 92%) were never smokers. Mean age at cancer diagnosis was 66.3 years.
The researchers assessed soy dietary intake with a food-frequency questionnaire, which covered soy milk, tofu, fried tofu, dried or pressed tofu, fresh green soy beans, dry soy beans, soy sprouts, and other soy products.
Mean intakes, on a dry weight basis, were 18.0 g/day for soy food and 8.8 g/day for soy protein.
The authors report that soy food intake was not related to patient characteristics such as age at diagnosis, smoking, obesity, family history of lung cancer, tumor stage, treatment regimens, or time between baseline dietary assessment and disease diagnosis.
In this study, the median follow-up time after cancer diagnosis was 36 months. During the follow-up period, 318 of the 440 women died; in 301 (94.7%), the primary cause of death was lung cancer, and in 17 (5.3%), it was another cause.
3 mg melatonin may aid migraine prevention
Melatonin, which is widely available in North America as an over-the-counter supplement, is more effective than placebo for migraine prevention and has a more favorable adverse effect profile than the tricyclic antidepressant amitriptyline, new research shows.
Results from a multicenter, randomized, double-blind, placebo-controlled trial showed that 3 mg of melatonin was more effective than placebo and had efficacy similar to that of 25 mg of amitriptyline. Furthermore, it was better tolerated than amitriptyline, with lower rates of daytime sleepiness and no weight gain.
"Melatonin 3 mg was significantly better than placebo with no difference compared to amitriptyline with respect to migraine prevention," principal investigator Mario Peres, MD, PhD, told delegates here attending the American Academy of Neurology (AAN) 65th Annual Meeting.
"But if we look at the proportion of responders, then melatonin had better results than amitriptyline," added Dr. Peres, who is director of São Paulo Headache Center, professor of neurology at ABC Medical School, and senior research associate at the Albert Einstein Brain Research Institute, Brazil.
Link to headache, sleep disorders
Produced by the pineal gland, melatonin is a hormone that helps regulate the sleep/wake cycle. It has been available as a supplement in the United States since the 1990s and is often used to aid sleep and attenuate jet lag.
According to Dr. Peres, melatonin's role in regulating circadian rhythm has been linked to cluster headache, hypnic headache, and migraine.
Further, he noted, melatonin plays an important role in sleep regulation, and disruption of melatonin production has been linked to sleep disorders, including sleep apnea, insomnia, and delayed sleep phase syndrome, which, in turn, are linked to headache.
He also noted that there is a bidirectional relationship in which headache can disrupt sleep and lead to insomnia and excessive daytime sleepiness.
Finally, he pointed out that research has linked low levels of melatonin in plasma and urine and altered peak time in melatonin levels to a variety of headache types, including migraine.
According to Dr. Peres, research into melatonin as a potential treatment for headache has included several case reports and open-label studies but only 2 randomized controlled trials: 1 in cluster headache, which was positive, and 1 negative trial in migraine.
The negative migraine trial, he said, had several limitations, including a small sample size and a short duration of only 8 weeks. It also used a slow-release, 2-mg formulation of melatonin, and, although the response rate in the melatonin group was 44%, the placebo group had an exceptionally high response rate of 40%.
Surprise weight-loss finding
To test the efficacy and tolerability of melatonin and amitriptyline vs placebo for migraine prevention, the investigators recruited 178 men and women who met International Headache Society diagnostic criteria for migraine with and without aura and who had 2 to 8 migraine attacks per month.
All patients underwent a 4-week baseline phase during which each participant kept a diary of migraine frequency.
Participants were then randomly assigned to receive 3 mg melatonin (n = 60), 25 mg amitriptyline (n = 59), or placebo (n = 59) for 3 months. Medication was taken between 10 and 11 pm daily.
The study's primary outcome was a reduction in the number of headache days per month. Secondary endpoints included migraine intensity and duration and analgesic use. Tolerability was also measured in all 3 study groups.
The mean reduction in headache frequency was 2.7 in the melatonin group, 2.18 in the amitriptyline group, and 1.18 in the placebo group.
Although migraine frequency did not differ between the 2 active treatment groups, the proportion of responders was greatest in the melatonin group: 54% vs 39.1% for amitriptyline and 20.4% for placebo.
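Those responder rates can be put into rough clinical perspective with a number-needed-to-treat calculation. This is back-of-envelope arithmetic from the reported proportions, not a figure from the study itself:

```python
# Rough number-needed-to-treat (NNT) vs placebo, computed from the
# responder rates reported above. Illustrative only; not a study result.
placebo_rate = 0.204
treatment_rates = {"melatonin": 0.540, "amitriptyline": 0.391}

for drug, rate in treatment_rates.items():
    absolute_risk_reduction = rate - placebo_rate
    nnt = 1 / absolute_risk_reduction
    print(f"{drug}: NNT ≈ {nnt:.1f}")
```

On those numbers, roughly one extra patient responds for every three treated with melatonin, versus about one in five for amitriptyline.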
Melatonin was also "very tolerable" and had significantly fewer adverse effects compared with amitriptyline, said Dr. Peres. Daytime sleepiness was the most frequent symptom in all 3 groups but was most pronounced in the amitriptyline group (n = 24).
Although patients gained weight in both the amitriptyline (n = 3) and placebo (n = 1) groups, melatonin was associated with weight loss.
Timing of administration and formulation is also important. Ideally, said Dr. Peres, melatonin should be taken between 10 pm and 11 pm to mimic the physiologic peak. In addition, a fast-acting rather than a slow-release formula should be used.
Overall, said Dr. Peres, the study's findings are promising and warrant further research.
Worth a try?
Commenting on the study, Tobias Kurth, MD, director of research at the Institut national de la santé et de la recherche médicale (INSERM), University of Bordeaux in France, and associate epidemiologist, Brigham and Women's Hospital, Harvard Medical School, Boston, Massachusetts, said the researchers "very convincingly" showed that melatonin was as effective as amitriptyline and both were superior to placebo.
"If this is true, this is great," said Dr. Kurth.
Although the study's findings are preliminary and need to be replicated, Dr. Kurth said that given its favorable adverse effect profile, melatonin may be worth a try.
"I'm not aware of any major side effects associated with melatonin. As long clinicians instruct patients appropriately and emphasize the importance of taking the recommended dose at the same time every day it may be worthwhile," he said.
Saturday, 9 March 2013
Bacon, ham, sausage once again linked to cardiovascular deaths
Bacon, sausage, and ham -- some of the most beloved foodstuffs on the planet -- are once again being singled out as key culprits driving the association between meat consumption and the world's most common diseases.
In one of the largest studies to address this question, high consumption of processed meat by middle-aged adults was associated with a near doubling of the risk of all-cause mortality, compared with low consumption, over a mean of 12 years. Risk of cardiovascular death, after rigorous modeling, was increased by more than 70% among people eating more than 160 g/day, as compared with those eating 10 to 19.9 g/day. Risk of cancer deaths was also 43% higher among the highest consumers of processed meats.
"The clinical message, in our opinion, is that it's okay to eat some meat, but to limit consumption of processed meat: not every day and not in high amounts," lead author Dr Sabine Rohrmann (University of Zurich, Switzerland) told heartwire in an email.
EPIC Data
The new data come from the European Prospective Investigation into Cancer and Nutrition (EPIC) study, involving 10 countries and almost half a million men and women. It was published online today in BMC Medicine.
Of note, say the authors, while a signal of increased mortality was seen among the highest consumers of red meat in general, the risk for red meat was much lower than that for processed meats and lost statistical significance after correction for measurement error. With the same adjustments and corrections, high processed-meat consumption was associated with an 18% greater risk of all-cause mortality.
As the authors point out, processed meats tend to contain more saturated fat than unprocessed meat (where the fat is often trimmed off), as well as more cholesterol and additives, often as part of the smoking or curing process. Some of these are believed to be carcinogenic or precursors to carcinogenic processes. "Another factor is the content of salt in processed-meat products, which is linked to hypertension, which is a CVD risk factor," Rohrmann told heartwire. "Heme iron is another mechanism, which links meat consumption to CVD risk, but that's not limited to processed meat."
Rohrmann and colleagues also point out that high consumption of processed meat typically went hand in hand with other unhealthy behaviors, including smoking, low physical-activity levels, and low consumption of fruits and vegetables.
"Overall, we estimate that 3% of premature deaths each year could be prevented if people ate less than 20 g of processed meat per day," Rohrmann commented in a press statement.
What about red meat?
Other studies have singled out processed meats as particularly hazardous to health. US analyses of meat consumption and mortality, drawing on data from two large, long-running US studies, have also documented the link between meat consumption and CVD and cancer deaths, but the stronger association seen with processed meats in this European cohort is somewhat at odds with the American data.
"Although we did not find a statistically significant association between unprocessed red-meat consumption and mortality in our studies, the two US studies did," Rohrmann said. "Therefore, we would not say that there is definitely no association [between red-meat consumption and CVD]. What I think our studies show is that it's okay to eat a moderate amount of meat--300 to 600 g per week as recommended by many nutrition societies--for intake of some important minerals and vitamins; however, a balanced vegetarian diet is, of course, okay as well."
Source: http://www.medscape.com/viewarticle/780412?nlid=29037_1301&src=wnl_edit_dail
Friday, 1 March 2013
Acne: Bad and good bacteria identified
A genomic approach to comparing acne-ridden skin with clear human skin has revealed a bacterial strain that may protect against the disease. The work was published online February 28 in the Journal of Investigative Dermatology by Sorel Fitz-Gibbon, PhD, from the Department of Molecular and Medical Pharmacology, Crump Institute for Molecular Imaging, David Geffen School of Medicine, University of California, Los Angeles, and colleagues.
Four of 5 people in the United States develop acne at some point. Although dermatologists no longer blame blemishes on too much chocolate or fatty foods, commonly prescribed acne drugs such as benzoyl peroxide, antibiotics, and isotretinoin (Accutane, Roche) have been used for decades, although they are not ideal. Antibiotics are usually ineffective in severe cases, and isotretinoin both has adverse effects and is teratogenic.
Propionibacterium acnes is a dominant skin commensal that colonizes pilosebaceous units, where certain strains are hypothesized to stimulate the development of acne vulgaris. The population sizes of the bacteria are similar among individuals, but the proportions of different strains vary between people prone to acne and people with clear skin. As with Staphylococcus aureus, only certain strains of the bacterium are pathogenic.
The researchers sequenced bacterial genomes to better define the genetic composition of the P acnes component of the skin microbiome in acne-marked vs healthy skin. The strategy revealed "a previously unreported portrait of the microbiota of pilosebaceous units at the bacterial strain level," the researchers write.
Dr. Fitz-Gibbon and colleagues applied over-the-counter pore-cleansing strips to the noses of 49 people with acne and 52 participants with clear skin, sampling P acnes in whiteheads or blackheads. The investigators typed bacterial strains according to 16S ribosomal DNA sequences (ribotypes), assigning each strain an acne index based on prevalence among patients with acne. This approach identified 11,009 ribotypes, but only a few were abundant in either patient group.
Genome sequencing was the next step. Team members from Washington University in St. Louis, Missouri, sequenced the genomes of 66 previously unreported P acnes strains from the samples and compared 71 bacterial genome types overall. Author Shuta Tomida, PhD, also from the Department of Molecular and Medical Pharmacology, Crump Institute for Molecular Imaging, David Geffen School of Medicine, identified DNA differences among the strains.
Beneficial Bacteria
Three genomically identified strains are of potential clinical significance. Two strains were found in about 1 in 5 volunteers with acne but only rarely in clear-skinned people. Conversely, a third bacterial strain appeared to be common in healthy skin yet was rare in skin with acne, suggesting a protective role.
The findings may have practical applications, the authors write, such as the development of a probiotic topical preparation to favor the protective bacterial strain or drugs to selectively target acne-related bacteria. A simple skin test might predict whether a person has an increased risk of developing aggressive acne.
Further studies might address identifying host factors that contribute to acne and matching microbiome subtypes with clinical subtypes of acne, such as cystic, pustular, or inflammatory acne.
"By combining a metagenomic study of the skin microbiome and genome sequencing of this major skin commensal, this study provides insight into potential bacterial genetic determinants in acne pathogenesis and emphasizes the importance of strain-level analysis of the human microbiome to understand the role of commensals in health and disease," the researchers conclude.
Source: http://www.medscape.com/viewarticle/780020?nlid=28944_1301&src=wnl_edit_dail
Global rise in diabetes linked to increase in dietary sugar
The increasing availability of sugary food and drink — independent of excess calories, excess weight, or a sedentary lifestyle — explains part of the rise in cases of type 2 diabetes worldwide, suggests a new study published online February 27 in PLoS ONE.
The authors, led by Sanjay Basu, MD, from the Department of Medicine, Stanford University, Palo Alto, California, examined a decade of cross-sectional data from 175 countries. They note that most of the worldwide rise in diabetes is thought to be type 2, linked to the metabolic syndrome, a cluster of factors including dyslipidemia, hypertension, and insulin resistance.
They report that for each added 150 calories of sugar available per person per day — equivalent to one 12-oz can of soda — the prevalence of type 2 diabetes increased by about 1% (P < .001), after they controlled for factors such as food types, total calories, overweight and obesity rates, aging, urbanization, income, and time.
Sugar Alone Explains 25% of Increase in Diabetes
It is important to identify risk factors for type 2 diabetes, since nearly 1 in 10 adults worldwide has the condition, and the rates are increasing. "If obesity is a primary driver of diabetes, then measures to reduce calorie consumption and increase physical activity should be prioritized," Basu and colleagues write. "However, if added sugar consumption is a primary driver, then public health policies to reduce sugar consumption warrant investigation as diabetes-prevention proposals — especially for developing countries, where diabetes rates are rising dramatically, irrespective of obesity."
They give examples of several countries with high diabetes prevalence rates but low obesity rates, including the Philippines, Romania, France, Bangladesh, and Georgia.
To determine whether sugar in a country's food supply predicts subsequent spikes in diabetes rates, the researchers used a regression model to correlate the prevalence of diabetes from 2000 to 2010 in adults aged 20 to 79 living in 175 countries with the availability of sugar in a country's food supply.
They found that during this decade, the prevalence of worldwide diabetes rose by about 27%, with one quarter of this increase explained by an increase in the availability of sugary foods. No other food category had any significant effect on diabetes prevalence.
Rates of diabetes increased in a dose-dependent manner the longer a population was exposed to excess sugar and the greater the amount of available sugar in the food supply. Each extra year of living where sugary foods were widely available was linked with an increase in diabetes prevalence of 0.053% (P < .05).
The reverse was also true. Following periods of scarce sugar (typically due to changes in trade agreements), the prevalence of diabetes decreased by 0.074% (P < .05), after correction for confounding variables.
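To get a feel for the size of these estimates, the following sketch simply applies the reported coefficients in an additive, linear way; it is not the authors' regression model, and it takes the published figures at face value:

```python
# Naive linear reading of the coefficients reported above:
#   ~1 percentage point of diabetes prevalence per extra 150 kcal/day of
#   available sugar, and ~0.053 percentage points per year of exposure.
# Illustrative only; this is not the authors' adjusted model.
PER_150_KCAL_PER_DAY = 1.0     # percentage points
PER_YEAR_OF_EXPOSURE = 0.053   # percentage points

def implied_extra_prevalence(extra_sugar_kcal_per_day, years_exposed):
    """Extra diabetes prevalence (percentage points) implied by treating
    the two reported coefficients as simply additive."""
    return (extra_sugar_kcal_per_day / 150.0) * PER_150_KCAL_PER_DAY \
        + years_exposed * PER_YEAR_OF_EXPOSURE

# Example: one extra 12-oz soda's worth of sugar per day, sustained for a decade
print(f"{implied_extra_prevalence(150, 10):.2f} percentage points")  # ~1.53
```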
Although their findings were robust, the study was not designed to establish a causal link between sugar intake and subsequent diabetes diagnosis, the authors admit. Prospective longitudinal cohort studies in international settings, which are now under way, should shed more light on this link.
Source: http://www.medscape.com/viewarticle/779985?nlid=28944_1301&src=wnl_edit_dail&pa=PCYXhnapQ2wZF1d7JGcwb4Ym85nElZ8ukcKRvE7ifDe0gl7tve%2FFWz1pQS47LFuU43mU9jD%2B1DtnxY47OmyybA%3D%3D
Thursday, 21 February 2013
High calcium intake with supplements unhealthy for women
Too much of a good thing may be just that: too much. That is the conclusion of yet another study, this time a prospective, longitudinal, population-based cohort of Swedish women, looking at calcium intake and cardiovascular mortality.
In this study, high rates of calcium intake were associated with higher all-cause and cardiovascular death rates but not with deaths from stroke, Karl Michaëlsson, MD, PhD, professor in medical epidemiology and senior consultant in orthopedic surgery at Uppsala University in Sweden, and colleagues report in an article published online February 13 in BMJ.
The study is the latest in a series of contentious analyses linking calcium intake and cardiovascular events. Earlier this month, a National Institutes of Health–sponsored study suggested that a high intake of supplemental calcium increased the risk for cardiovascular disease (CVD) death in men, but not women.
However, a commentator notes that the study results suggest that supplements, rather than the intake level, are the problem.
The Swedish Mammography Cohort, established between 1987 and 1990, followed 61,433 women born between 1914 and 1948 for a median of 19 years and used registry data to determine outcomes. During that period, there were 11,944 deaths from all causes, of which 3862 were from CVD, 1932 from ischemic heart disease, and 1100 from stroke.
Dietary assessments from food frequency questionnaires at baseline and in 1997 were available for 38,984 women, from which the researchers estimated intakes of dietary and supplemental calcium.
The highest intakes of calcium (>1400 mg/day) were associated with higher all-cause risk for death (after adjustment for age, total energy, vitamin D, and calcium supplement intake, as well as other dietary, physical, and demographic factors) as compared with intakes of 600 to 1000 mg/day (hazard ratio [HR], 1.40; 95% confidence interval [CI], 1.17 - 1.67).
Disease-specific mortality risks were elevated for CVD (HR, 1.49; 95% CI, 1.09 - 2.02) and for ischemic heart disease (HR, 2.14; 95% CI, 1.48 - 3.09) at daily calcium intakes above 1400 mg. At calcium intakes less than 600 mg/day, these same mortality risks were also elevated. None of these patterns was apparent for mortality from stroke.
In an email exchange with Medscape Medical News, Dr. Michaëlsson said the association of calcium intake and all-cause and cardiovascular mortality "was especially strong if a high dietary intake of calcium was combined with calcium supplements."
Women with the highest intake of calcium (>1400 mg/day) and who used supplement tablets had an all-cause risk for death 2.5 times higher than women who had similar total intakes but were not taking a supplement.
The authors explain that serum calcium levels "are under tight homeostatic control" and do not normally correlate with the amount of calcium intake. However, low or very high intakes override this control, "causing changes in blood levels of calcium or calciotropic hormones."
Complex Study Results; Weak Findings?
Dr. Michaëlsson also noted that some previous studies have shown a similar relationship between calcium supplements and a higher risk for CVD but were not powered to look at mortality and did not assess the amount of dietary intake of calcium.
He advised that one should not make recommendations on the basis of a single study, but emerging evidence suggests caution about high calcium intake. He also noted that a meta-analysis of randomized trials has shown that calcium supplementation actually increased the rate of hip fracture. "My present recommendation is to avoid calcium supplement use if you have a normal varied diet," he said.
Commenting to Medscape Medical News by email, John Cleland, MD, PhD, professor of cardiology at Hull York Medical School in Kingston-upon-Hull, United Kingdom, called the study results "extremely complex...with rather weak findings." He pointed out that in the study there were few patients or events in the group with high calcium intake (n = 1241; 2%), and the events were confined to those women taking supplements (total events, n = 23, of which 16 occurred among women taking any form of calcium supplement).
Women with calcium intakes greater than 1400 mg/day who were taking calcium tablets had an adjusted hazard ratio for all-cause mortality of 2.57 (95% CI, 1.19 - 5.55), compared with 1.17 (95% CI, 0.97 - 1.41) among women who had similar daily intakes but were not taking supplements. "So, it's not the diet but the pills that are the problem," Dr. Cleland concluded, which is essentially in agreement with what Dr. Michaëlsson said.
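A general note on reading those numbers (standard hazard-ratio arithmetic, not something taken from the paper): confidence intervals for hazard ratios are roughly symmetric on the log scale, so the point estimate sits near the geometric mean of the interval bounds, and the very wide 1.19 to 5.55 interval reflects how few events the high-intake, supplement-taking group contributed.

```python
# Hazard-ratio confidence intervals are approximately symmetric on the log
# scale, so the point estimate should sit near the geometric mean of the
# bounds. A quick consistency check on the figures quoted above:
import math

for reported_hr, lower, upper in [(2.57, 1.19, 5.55), (1.17, 0.97, 1.41)]:
    geometric_mean = math.sqrt(lower * upper)
    print(f"reported HR {reported_hr}: geometric mean of CI ≈ {geometric_mean:.2f}")
```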
Dr. Cleland raised the issues of what else may have been in the calcium pills and why the women were taking them; for example, if they had chronic kidney disease or osteoporosis. He said the article did not provide such information but just referred to a previous paper.
He also pointed out that calcium tablets "have not been shown to reduce fracture rates or improve any other patient outcome that I know of." He recommended that people stop taking calcium supplements "until efficacy/safety is shown," and that this advice "should definitely include those taking them for osteoporosis and should perhaps include those taking them for [chronic kidney disease]." His recommendation? "Having a healthy balanced diet and avoiding water filters that reduce calcium in drinking water is probably best."
Source: http://www.medscape.com/viewarticle/779541?nlid=28503_1301&src=wnl_edit_dail