We all know exercise is beneficial to our health. Then why is it that ultramarathon runners may generate so many free radicals during a race that they can damage the DNA of a significant percentage of their cells? Researchers have looked at the exercise-induced increase in free radical production as a paradox: why would an apparently healthy act—exercise—lead to detrimental effects through damage to various molecules and tissues? This arises from something of a misunderstanding: exercise in and of itself is not necessarily the healthy act—it’s the recovery after exercise that is so healthy, the whole “that-which-doesn’t-kill-us-makes-us-stronger” notion. For example, exercise training has been shown to enhance antioxidant defenses by increasing the activities of our antioxidant enzymes. So, during the race ultramarathoners may be taking hits to their DNA, but a week later they can experience great benefits, as shown in my video, Enhanced Athletic Recovery Without Undermining Adaptation.
In a recent study, researchers from Oregon State University looked at the level of DNA damage in athletes. Six days after a race, athletes didn’t just go back to the baseline level of DNA damage, but had significantly less, presumably because they had revved up their antioxidant defenses. So, maybe exercise-induced oxidative damage is beneficial, similar to vaccination. By freaking out the body a little, we might induce a response that’s favorable in the long run.
This concept, that low levels of a damaging entity can up-regulate protective mechanisms, is known as hormesis. For example, herbicides kill plants, but in tiny doses may actually boost plant growth, presumably by stressing the plant into rallying its resources to successfully fight back.
Wait a second, though. Could eating anti-inflammatory, antioxidant-rich plant foods undermine this adaptation response? We know that berries may reduce inflammatory muscle damage (see Reducing Muscle Soreness with Berries), and greens may reduce free radical DNA damage (see Preventing Exercise Induced Oxidative Stress with Watercress). Dark chocolate and tomato juice appear to have similar effects. Flavonoid phytonutrients in fruits, vegetables, and beans seem to inhibit the activity of xanthine oxidase, considered the main contributor to free radical production during exercise. And the carbs in plant foods may also decrease stress hormone levels.
So in 1999, a theoretical concern was raised. Maybe all that free radical stress from exercise is a good thing, and increased consumption of some antioxidant nutrients might interfere with these necessary adaptive processes. If we decrease free radical tissue damage, maybe we won’t get that increase in activity of those antioxidant enzymes.
A group of researchers who performed a study on tart cherry juice and recovery following a marathon responded to this antioxidant concern by suggesting that, although it is likely that muscle damage, inflammation, and oxidative stress are important factors in the adaptation process, minimizing these factors may improve recovery so we can train more and perform better. So, there are theories on both sides, but what happens when we actually put it to the test?
While there is a theoretical concern that antioxidant or anti-inflammatory supplements may blunt these adaptive events, researchers found that blackcurrant extract, although packed with antioxidant and anti-inflammatory compounds, actually boosted the health benefits of regular exercise.
If we take antioxidant pills—vitamin C and vitamin E supplements— we can also reduce the stress levels induced by exercise, but in doing so we block that boost in antioxidant enzyme activity caused by exercise. Now maybe we don’t need that boost if we don’t have as much damage, but vitamin C supplements seem to impair physical performance in the first place. With plant foods, though, we appear to get the best of both worlds.
For example, lemon verbena, an antioxidant-rich herbal tea, protects against oxidative damage and decreases the signs of muscular damage and inflammation, without blocking the cellular adaptation to exercise. In a recent study, researchers showed that lemon verbena does not affect the increase of the antioxidant enzyme response promoted by exercise. On the contrary: antioxidant enzyme activity was even higher in the lemon verbena group.
Thursday, 11 September 2014
Low carb better than low fat for heart disease reduction
A low-carbohydrate diet may be better than a low-fat diet for losing weight and reducing risks for cardiovascular disease (CVD), according to an article published in the September 2 issue of Annals of Internal Medicine.
Lydia A. Bazzano, MD, PhD, MPH, from the Department of Epidemiology at Tulane University School of Public Health and Tropical Medicine, New Orleans, Louisiana, and colleagues conducted a randomized trial involving 148 adults who did not have diabetes or CVD.
The researchers randomly assigned 75 participants to the low-carbohydrate group and 73 to the low-fat group, with 59 (79%) low-carbohydrate participants and 60 (82%) low-fat participants completing the 1-year study. The groups were well matched, with a median age of 47.8 years in the low-fat group and 45.8 years in the low-carbohydrate group and baseline body mass indexes of 35.6 and 35.2 kg/m², respectively. The groups were predominantly female (89% and 88%, respectively) but were racially mixed, with approximately equal numbers of blacks (36 and 40, respectively) and whites (33 and 34, respectively) in each group.
At 1 year, the researchers found that the low-carbohydrate group had a significantly greater reduction in body weight (mean difference, −3.5 kg; 95% confidence interval [CI], −5.6 to −1.4 kg; P = .002), significantly greater proportional reductions in fat mass (mean difference, −1.5%; 95% CI, −2.6% to −0.4%; P = .011), and a significantly greater relative increase in lean mass (mean difference, 1.7%; 95% CI, 0.6% to 2.8%; P = .003).
Total and low-density lipoprotein cholesterol levels and plasma glucose levels did not change significantly in either group, nor did blood pressure levels significantly decrease in either group.
However, the low-carbohydrate participants showed significantly greater decreases in C-reactive protein levels (mean difference, −15.2 nmol/L; P = .024); estimated 10-year risk for coronary heart disease, based on the Framingham risk score (mean difference, −1.4%; P < .001); ratio of total to high-density lipoprotein cholesterol (mean difference, −0.44; P = .002); and serum triglyceride levels (mean difference, −0.16 mmol/L; P = .038).
Counseling and Education
Study participants in both groups met with dietitians in weekly individual counseling sessions for the first 4 weeks and then in small groups biweekly for the next 5 months. They met monthly the last 6 months.
Staff members provided equal instructions to both groups in terms of dietary fiber and types of fats. Researchers collected 24-hour dietary recall data at baseline and at 3, 6, and 12 months.
"[T]his 12-month randomized, parallel-group trial showed that a low-carbohydrate diet resulted in greater weight loss and reduction in cardiovascular risk factors than a low-fat diet among obese black and white adults," the researchers conclude.
"Our findings suggest that people who want to lose weight and have risk factors for heart disease could consider a low-carbohydrate diet as an option to both lose weight and improve those risk factors," said coauthor Tian Hu, MD, a doctoral research fellow at Tulane.
Benefits Seen Across Ethnic Groups
The researchers write that their findings are consistent with the findings of other studies as to body weight. "In addition, our findings suggest that the loss of fat mass accounts for most of the reduction in body weight on a low-carbohydrate diet, which is consistent with other study findings," they note.
"Our study has a couple of important differences compared to others. Some previous studies had included only diabetics, or mostly participants with metabolic syndrome or cardiovascular disease. Participants in our study were generally healthy. They might have hypertension or dyslipidemia, but they did not have any clinical cardiovascular disease, kidney disease, or diabetes," said Dr. Hu.
"The diversity of the study group is also an important contribution. Previous studies included mostly white participants or did not test a typical low-carbohydrate diet," he added. "When we evaluated the effects of both diets in African-American and Caucasian participants separately, the results were similar."
What happens after the study is still to be determined. "We could only examine results over the 1-year period of the study. But it's very likely that adherence to the diet may have [an] impact on CVD risk reduction," Dr. Hu noted.
Balance Counts
"I thought [the study] was very well thought out. I'm glad that dieticians and nutrition education were used and the participants had good follow-up," said Kasia Ciaston, MS, RD, LDN, a clinical dietitian at Loyola University Medical Center in Maywood, Illinois.
"My big concern is studying low-carb or low-fat diet, especially with cardiovascular risk, is particularly due to the history of these types of diets," she continued. "For the last 50 to 60 years, fat has been created as the enemy, which caused a significant increase in carbohydrate intake and incidence of diabetes and blood sugar–related disease. I feel that it would be a similar mistake to point the finger of blame at carbohydrates and make another macronutrient the bad guy," she said.
"Although I do agree that decreasing carbohydrate intake from processed foods can have wonderful health benefits, I think the real message to get across here is that getting that right balance in your diet is what really helps to create health benefits, versus cutting out a certain macronutrient," she added.
For the low-carbohydrate participants in this study, "their intake of protein also increased dramatically, which can also create health effects over time. They're also eating a stable amount of fiber in their diet. All of these other factors that aren't necessarily focused on are playing a part in that balance," she said.
"I hope that what clinicians get out of this study is that nutrition is a huge impact factor on decreasing future risks and that [nutrition information] would make a much bigger impact if [patients] weren't just hearing it from me, but if they were hearing it also from their physician and each member of their healthcare team," she added.
Pilots, Crew Have Double the Rate of Melanoma
As in the mythical story of Icarus, human beings in the modern world who fly too close to the sun have a price to pay, new research suggests.
Airline pilots and cabin crews have twice the incidence of melanoma as the general population, according to the largest meta-analysis to date on the subject.
The standardized incidence ratio of melanoma for any flight-based occupation was 2.21 (P < .001).
Furthermore, melanoma was more deadly in these professionals, who had a disease-related mortality rate 42% greater than that of more regularly earthbound beings.
Specifically, the standardized mortality ratio for any flight-based occupation was 1.42 (P = .02).
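For readers unfamiliar with these measures, both the standardized incidence ratio (SIR) and the standardized mortality ratio (SMR) are simply the number of events observed in the cohort divided by the number expected from general-population rates; the notation below is a generic illustration, not something taken from the paper itself.

\[
\mathrm{SIR} = \frac{O_{\text{cases}}}{E_{\text{cases}}}, \qquad
\mathrm{SMR} = \frac{O_{\text{deaths}}}{E_{\text{deaths}}}, \qquad
E = \sum_i n_i \, \lambda_i
\]

Here \(O\) is the count observed among flight crews, \(n_i\) is the person-years the cohort contributes in age and calendar stratum \(i\), and \(\lambda_i\) is the corresponding general-population rate, so an SIR of 2.21 means roughly 2.2 times as many melanomas as would be expected.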
"Further research on the mechanisms [of harm] and optimal occupational protection is needed," conclude the researchers, led by Martina Sanlorenzo, MD, a research fellow in the Department of Dermatology at the University of California, San Francisco.
They suspect that the increase in melanoma risk seen in airline pilots and cabin crews comes from exposure to ultraviolet (UV) light, which becomes more intense the higher a plane flies. "At 9000 meters [30,000 feet], where most commercial aircraft fly, the UV level is approximately twice that of the ground," the researchers report.
Remarkably, UV light exposure is "not a well-recognized occupational risk factor" for flight crews, they note. In contrast, ionizing radiation is a well-known risk, and levels in airline crews are regularly monitored.
This study provides no details on the clinical aspects of the melanomas.
"It would be interesting to see the anatomic distribution of skin cancers in the crew, because only hands and face should be exposed to UV passing through the windows," said Steven Wang, MD, director of dermatologic surgery and dermatology at the Memorial Sloan Kettering Cancer Center in Basking Ridge, New Jersey, who was not involved in the study.
Frequent flier passengers are probably not at an increased risk, he said in an email to Medscape Medical News. But for the worried, pulling down the window shade and sitting in an aisle seat are protective strategies, added Dr. Wang, who is the author of Beating Melanoma: A 5-Step Survival Guide.
The meta-analysis examined 19 studies (15 with data on pilots; 4 with data on cabin crews). Data from the studies, which involved more than 266,000 participants, were collected from 1943 to 2008 in more than 11 countries.
The researchers conducted separate analyses for pilots and for cabin crews (which include flight attendants).
They found that pilots were at greater risk for melanoma than the cabin crew.
In fact, pilots and crews would be expected to have more than double the annual rate of new melanomas, which is 21.3 per 100,000 individuals in the general population.
Pilots as a group were also at greater risk for mortality, and had a higher mortality ratio than cabin crews (1.83 vs 0.90).
UVA Radiation and Glass Might Be the Problems
The researchers discuss the technicalities of the types of UV radiation and windshields.
They cite measurements of UV radiation that passes through the windshields of a variety of aircraft — large (e.g., Boeing 737 and Airbus A320) and small (e.g., Cessna 182) — from a 2007 report by the US Federal Aviation Administration.
The transmission of UVB is less than 1% for both plastic and glass windshields.
However, UVA is a different story, especially in airplanes with glass windows. "While plastic materials blocked almost all UVA radiation, 54% of it came through glass," Dr. Sanlorenzo and colleagues report.
Reducing sedentary time protects DNA
Less time spent sitting may lengthen telomeres and protect DNA from age-related damage, a new study shows. However, the researchers found no significant association in telomere length with increased exercise in this small randomized trial.
"We're excited about this study," said lead author Mai-Lis Hellenius, MD, PhD, a professor of cardiovascular disease prevention in the Department of Medicine, Karolinska Institutet, Karolinska University Hospital, Stockholm, Sweden. "Long telomeres are linked to a longer, healthier life."
Per Sjögren, PhD, from the Unit of Clinical Nutrition and Metabolism, Department of Public Health and Caring Sciences, Uppsala University, Sweden, and colleagues published the study online September 3 in the British Journal of Sports Medicine.
Telomeres are sections of DNA that protect the ends of chromosomes from damage during cell division. Telomeres tend to shorten with age, and previous research has linked shorter telomeres with shorter life spans and increased risk for some types of cancer and cardiovascular disease.
"Our DNA will be damaged during a lifetime," Dr. Hellenius said. "It's damaged by bad diets, smoking, infections, and so on. Our capability to copy and produce new DNA and new cells is so important."
Previous research has shown that people who exercise more tend to have longer telomeres.
In the current study, Dr. Sjögren and colleagues analyzed telomere length in a subset of people who had participated in a randomized exercise trial. The original study included 101 predominantly sedentary, overweight adults aged 68 years at baseline. The investigators randomly assigned half of the participants to an exercise program and the other half to continue their regular behavior.
They asked the participants to keep a 7-day diary and to wear a pedometer to measure the number of steps taken every day. They also asked them to fill out a questionnaire tallying up the amount of time spent sitting down each day.
As the researchers reported previously, the time spent exercising, as well as the number of steps taken daily, increased significantly in the group assigned the exercise program, whereas the amount of time spent seated fell in both groups (Br J Sports Med. 2011;45:158). Various risk factors for heart disease and stroke also improved in both groups, particularly those on the exercise program, who lost more weight than their counterparts left to their own devices.
In the current subset analysis, the researchers examined telomere length from blood samples drawn at baseline and after the 6-month intervention in 49 randomly chosen participants. They found that the number of daily steps taken was not associated with changes in telomere length.
An increase in moderate-intensity physical activity correlated with a shortening in telomere length in both groups, but this finding was not statistically significant. However, a reduction in the amount of time spent sitting in the exercise group was associated with telomere lengthening in blood cells.
"We hypothesise that a reduction in sitting hours is of greater importance than an increase in exercise time for elderly risk individuals," the researchers conclude.
Dr. Hellenius acknowledged that the study was a small one and needed to be replicated before drawing conclusions from it. However, she notes, other research has also shown that people who spend more time sitting have shorter lives, regardless of whether they also exercise regularly.
"So I think you can say to our patients yes, it's important to break up sedentary time and take a break for 1 or 2 minutes every 30 minutes," said Dr. Hellenius.
"We're excited about this study," said lead author Mai-Lis Hellenius, MD, PhD, a professor of cardiovascular disease prevention in the Department of Medicine, Karolinska Institutet, Karolinska University Hospital, Stockholm, Sweden. "Long telomeres are linked to a longer, healthier life."
Per Sjögren, PhD, from the Unit of Clinical Nutrition and Metabolism, Department of Public Health and Caring Sciences, Uppsala University, Sweden, and colleagues published the study online September 3 in the British Journal of Sports Medicine.
Telomeres are sections of DNA that protect the ends of chromosomes from damage during cell division. Telomeres tend to shorten with age, and previous research has linked shorter telomeres with shorter life spans and increased risk for some types of cancer and cardiovascular disease.
"Our DNA will be damaged during a lifetime," Dr. Hellenius said. "It's damaged by bad diets, smoking, infections, and so on. Our capability to copy and produce new DNA and new cells is so important."
Previous research has shown that people who exercise more tend to have longer telomeres.
In the current study, Dr. Sjögren and colleagues analyzed the length of telomeres a subset of people who had participated in a randomized exercise trial. The original study included 101 predominantly sedentary and overweight people adults aged 68 years at baseline. The investigators randomly assigned half of the participants to an exercise program and the other half to their regular behavior.
They asked the participants to keep a 7-day diary and to wear a pedometer to measure the number of steps taken every day. They also asked them to fill out a questionnaire tallying up the amount of time spent sitting down each day.
As the researchers reported previously, the time spent exercising, as well as the number of steps taken daily, increased significantly in the group assigned the exercise program, whereas the amount of time spent seated fell in both groups ( Br J Sports Med. 2011;45:158). Various risk factors for heart disease and stroke also improved in both groups, particularly those on the exercise program, who lost more weight than their counterparts left to their own devices.
In the current subset analysis, the researchers examined telomere length from blood samples drawn at baseline and after the 6-month intervention in 49 randomly chosen participants. They found that the number of daily steps taken was not associated with changes in telomere length.
An increase in moderate-intensity physical activity correlated to a shortening in telomere length in both groups, but this finding was not statistically significant. However, a reduction in the amount of time spent sitting in the exercise group was associated with telomere lengthening in blood cells.
"We hypothesise that a reduction in sitting hours is of greater importance than an increase in exercise time for elderly risk individuals," the researchers conclude.
Dr. Hellenius acknowledged that the study was a small one and needed to be replicated before drawing conclusions from it. However, she notes, other research has also shown that people who spend more time sitting have shorter lives, regardless of whether they also exercise regularly.
"So I think you can say to our patients yes, it's important to break up sedentary time and take a break for 1 or 2 minutes every 30 minutes," said Dr. Hellenius.
Hotter breath linked to lung cancer
MUNICH, Germany — A preliminary study of exhaled breath suggests that temperatures higher than 34°C may be a strong indicator of the presence of lung cancer. In a sample of 82 patients who presented with clinical suspicion of lung cancer, 96% of those whose breath temperature exceeded that threshold turned out to have the disease.
"Airway inflammation and angiogenesis play a key role in the pathogenesis of lung cancer," Giulia Scioscia, MD, a fellow in pulmonary medicine at Hospital Clínic i Provincial de Barcelona, said at a news conference here at the European Respiratory Society (ERS) International Congress 2014. "Exhaled breath temperature has been proven to be an indicator of airway inflammation and increased vascularity. The aim of our research is to determine possible correlations of exhaled breath with progression, metastasis or other clinical outcomes of patients with cancers."
The researchers aimed to find breath temperature values that correlate with lung cancer. They analyzed 82 consecutive participants with a radiologic suspicion of lung cancer using the X-Halo (Delmedica Investments) breath thermometer. Lung cancer was later diagnosed in 40 people. A total of 42 did not have lung cancer and served as controls. All patients underwent standard diagnostic and staging procedures.
The team compared breath temperatures in patients with lung cancer and controls in various categories, including by sex, age, smoking habit, and the presence or absence of chronic obstructive pulmonary disease. In each category, people with cancer had statistically significantly higher breath temperatures than controls.
The researchers then used a receiver operating characteristic curve to determine an ideal temperature threshold for signaling lung cancer. That analysis showed that values above 34°C would be a good cutoff. Most — 96% — who had breath temperatures in that range were found to have lung cancer.
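The report doesn't spell out how the cutoff was chosen, but the general idea of picking a threshold from an ROC curve can be sketched in a few lines of Python; the data and the use of Youden's J statistic below are illustrative assumptions, not the study's actual method.

```python
# Illustrative sketch only: choosing a breath-temperature cutoff from an ROC
# curve. The temperatures and labels are made up; the study's real analysis
# may have used a different criterion.
import numpy as np
from sklearn.metrics import roc_curve

temps  = np.array([33.2, 33.8, 34.1, 34.6, 33.5, 34.9, 35.2, 33.9, 34.4, 33.1])
cancer = np.array([0,    0,    1,    1,    0,    1,    1,    0,    1,    0])

# roc_curve treats higher scores as more indicative of the positive class
fpr, tpr, thresholds = roc_curve(cancer, temps)

# Youden's J (sensitivity + specificity - 1) is one common way to pick the
# threshold that best separates patients with and without cancer.
j = tpr - fpr
best_cutoff = thresholds[np.argmax(j)]
print(f"Suggested cutoff: {best_cutoff:.1f} °C")
```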
Pilot Study
The researchers stressed that this was a pilot study. "It's not sensitive enough," coauthor Giovanna Elisiana Carpagnano, MD, from the University of Foggia in Italy, said at the news conference. However, Dr. Carpagnano added that it could eventually become part of a screening panel. The researchers plan to explore the test in patients with other inflammatory conditions to get a better idea of its performance.
"I think the data are quite interesting," chair of the ERS lung cancer group Fernando Gamarra, MD, from Klinikum St. Elisabeth Straubing in Germany said. It's also novel. "I'm not aware that anybody has done it before," he added. Dr. Gamarra said he thinks that the researchers will have to account for other factors that could affect breath temperature, such as whether the person is currently smoking, the velocity with which a person exhales, or overall size. Large people may have higher breath temperatures, he pointed out.
It also remains to be seen if the approach could work consistently in early-stage cancers, which is the population that would most benefit from screening. The current study drew from patients already presenting at a hospital, and so they tended to have more advanced disease, although the researchers did note higher temperatures even in early cancers. "I would like to see a study in a limited group, say just in patients with stage 1 or 2," Dr. Gamarra said.
"Airway inflammation and angiogenesis play a key role in the pathogenesis of lung cancer," Giulia Scioscia, MD, a fellow in pulmonary medicine at Hospital Clínic i Provincial de Barcelona, said at a news conference here at the European Respiratory Society (ERS) International Congress 2014. "Exhaled breath temperature has been proven to be an indicator of airway inflammation and increased vascularity. The aim of our research is to determine possible correlations of exhaled breath with progression, metastasis or other clinical outcomes of patients with cancers."
The researchers aimed to find breath temperature values that correlate with lung cancer. They analyzed 82 consecutive participants with a radiologic suspicion of lung cancer using the X-Halo (Delmedica Investments) breath thermometer. Lung cancer was later diagnosed in 40 people. A total of 42 did not have lung cancer and served as controls. All patients underwent standard diagnostic and staging procedures.
The team compared breath temperatures in patients with lung cancer and controls in various categories, including by sex, age, smoking habit, and the presence or absence of chronic obstructive pulmonary disease. In each category, people with cancer had statistically significantly higher breath temperatures than controls.
The researchers then used a receiver operating characteristic curve to determine an ideal temperature threshold for signaling lung cancer. That analysis showed that values above 34°C would be a good cutoff. Most — 96% — who had breath temperatures in that range were found to have lung cancer.
Pilot Study
The researchers stressed that this was a pilot study. "It's not sensitive enough," coauthor Giovanna Elisiana Carpagnano, MD, from the University of Foggia in Italy, said at the news conference. However, Dr. Carpagnano added that it could eventually become part of a screening panel. The researchers plan to explore the test in patients with other inflammatory conditions to get a better idea of its performance.
"I think the data are quite interesting," chair of the ERS lung cancer group Fernando Gamarra, MD, from Klinikum St. Elisabeth Straubing in Germany said. It's also novel. "I'm not aware that anybody has done it before," he added. Dr. Gamarra said he thinks that the researchers will have to account for other factors that could affect breath temperature, such as whether the person is currently smoking, the velocity with which a person exhales, or overall size. Large people may have higher breath temperatures, he pointed out.
It also remains to be seen if the approach could work consistently in early-stage cancers, which is the population that would most benefit from screening. The current study drew from patients already presenting at a hospital, and so they tended to have more advanced disease, although the researchers did note higher temperatures even in early cancers. "I would like to see a study in a limited group, say just in patients with stage 1 or 2," Dr. Gamarra said.
Prediabetes Increases Cancer Risk
Prediabetes — even at lower levels of impaired fasting glucose (IFG) as defined by the American Diabetes Association (ADA) — is associated with a significant increase in cancer risk that is independent of the effects of obesity, according to the results of a large new meta-analysis.
Yuli Huang, MD, PhD, from First People's Hospital of Shunde, Foshan, China, and colleagues found that prediabetes was associated with a 15% increased risk for cancer overall, based on data derived from 16 prospective cohort studies, with differing risks depending on the type of cancer. The risk was particularly increased for liver cancer and stomach or colorectal cancer.
"Epidemiological studies have shown that diabetes increases the incidence of cancer and mortality from it, but we thought that maybe not only diabetes but prediabetes may be associated with an elevated cancer risk," Dr. Huang said.
"And considering the high prevalence of prediabetes as well as the robust and significant association between prediabetes and cancer we demonstrated in our study, a successful lifestyle intervention could have a major public-health impact on cancer prevention," he added.
"This information is important to health professionals and those engaged in the prevention of cancer," Dr. Huang and colleagues stress.
The study was published online September 8 in Diabetologia.
Screen for Prediabetes Using ADA Criteria for Cancer Prevention
A total of 891,426 participants from the 16 studies, 4 of which were from Asia, 11 from the United States and Europe, and 1 from Africa, were included in the meta-analysis. Prediabetes included individuals with impaired glucose tolerance (IGT), IFG, or a combination of the 2.
Over 10 years ago, the ADA lowered its definition of prediabetes to that of a fasting plasma glucose concentration ranging from 5.6 to 6.9 mmol/L, but other organizations have maintained their definition at a higher level, as a fasting plasma glucose concentration between 6.1 and 6.9 mmol/L.
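As a rough illustration of how the two definitions differ, a hypothetical classification function might look like the sketch below; the cutoffs come straight from the paragraph above, and everything else is an assumption made for the example.

```python
# Illustrative sketch only: classify a fasting plasma glucose value (mmol/L)
# as prediabetes under the ADA cutoff (5.6-6.9 mmol/L) or the higher
# 6.1-6.9 mmol/L range used by some other organizations, as described above.
def is_prediabetes(fpg_mmol_l: float, ada_criteria: bool = True) -> bool:
    lower = 5.6 if ada_criteria else 6.1
    return lower <= fpg_mmol_l <= 6.9

print(is_prediabetes(5.8))                      # True under the ADA definition
print(is_prediabetes(5.8, ada_criteria=False))  # False under the higher cutoff
```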
The risk of site-specific cancer among individuals with prediabetes was highest for liver cancer (relative risk [RR], 2.01), followed by endometrial cancer (RR, 1.60), stomach and colorectal cancer (RR, 1.55 for both), and breast and pancreatic cancer (RR, 1.19 for both).
But prediabetes was not associated with cancer of the bronchus, lung, prostate, ovary, kidney, or bladder.
The risks were increased even when a lower fasting plasma glucose value of 5.6 to 6.9 mmol/L was used, as well as in participants with IGT — "an interesting finding," investigators note, "and one that reaffirms the importance of screening for prediabetes using the ADA criteria with a view to cancer prevention."
As the authors point out, obesity — in itself a key risk factor for diabetes — has also been linked to cancer.
To rule out obesity as a potential confounder, the authors did a sensitivity analysis that included only studies that adjusted for body mass index (BMI).
After controlling for BMI, "we found that...the presence of prediabetes remained associated with an increased risk of cancer by 22%," they state. "So we believe that the increased risk of cancer in individuals with prediabetes is independent of obesity."
"Our study indicates that, on the basis of a snapshot blood glucose measurement, prediabetes is associated with an increased risk of cancer," they state.
Need for Trials of Metformin to Prevent Cancer
Speculating on the reasons for the association between prediabetes and increased cancer risk, Dr. Huang and colleagues suggest that chronic hyperglycemia and resulting states, including chronic oxidative stress and the accumulation of advanced glycated end products, may serve as carcinogens.
Alternatively, the insulin resistance characteristic of prediabetes increases insulin secretion, and elevated insulin levels may promote the growth of cancer cells.
Certain genetic mutations may also predispose individuals with prediabetes to an elevated cancer risk, they suggest.
As the authors point out, it is noteworthy that metformin is associated with an approximately 30% reduction in the lifetime risk of cancer in diabetic patients.
"Considering the protective anticancer properties of metformin in patients with diabetes, as well as its ability to delay progress to diabetes in patients with prediabetes, we feel studies are urgently needed to explore the effects of metformin on cancer risk in people with prediabetes, and [if shown to be protective], metformin might be recommended in select high-risk individuals, especially those with IGT or a combination of IGT and IFG," they conclude.
Wednesday, 12 March 2014
How Selenium, Vitamin E Increase Prostate Cancer Risk
New data from the much publicized Selenium and Vitamin E Cancer Prevention Trial (SELECT), which sought to determine whether these supplements could protect against the development of prostate cancer, confirm that both antioxidants can be risky business for men.
As previously reported, men received no preventive benefit from either selenium or vitamin E supplements; in fact, for certain men, these supplements actually increased the risk for prostate cancer.
The new study, published online February 22 in the Journal of the National Cancer Institute, explored which men who take these supplements are most at risk for prostate cancer, and why.
However, the ongoing public health message from the trial remains the same, said a trial investigator.
"Men using these supplements should stop, period. Neither selenium nor vitamin E supplementation confer any known [health] benefits — only risks," said lead author Alan Kristal, DrPH, from the Fred Hutchinson Cancer Research Center in Seattle, in a press statement.
"Many people think that dietary supplements are helpful or at the least innocuous. This is not true," he added.
The cohort of 4856 men was culled from SELECT, the larger phase 3 placebo-controlled trial in which more than 35,000 men were randomized to high-dose vitamin E (400 IU/day) and/or selenium (200 µg/day) supplements.
SELECT began in 2001 and was expected to run for 12 years, but it was stopped early, in 2008, after participants had been on the supplements for an average of 5 years. The results demonstrated that there was no protective effect from selenium and suggested that vitamin E increased prostate cancer risk.
Although the use of the supplements stopped, the study actually continued. After 2 years of follow-up, the men who took vitamin E had a statistically significant 17% increased risk for prostate cancer, as previously reported.
Notably, the rate of prostate cancer detection was higher in the groups that received either supplement alone or a combination of the 2 than in the placebo group (but the difference was significant only in the vitamin E group).
Selenium is a nonmetallic trace element found in plant foods such as rice, wheat, and Brazil nuts, as well as in seafood and meat. In a previous large skin cancer prevention trial, it was associated with a reduced risk for prostate cancer. According to the National Cancer Institute, it is an antioxidant that might help control cell damage that can lead to cancer.
Vitamin E is found in a wide range of foods, especially vegetables, vegetable oils, nuts, and egg yolks. Like selenium, vitamin E is considered an antioxidant.
Key: Increased Risk Depends on Baseline Selenium
In this new case–cohort study, 1739 men diagnosed with prostate cancer during SELECT were compared with 3117 men who were not.
Dr. Kristal and colleagues found that baseline selenium status alone, in the absence of supplementation, was not associated with prostate cancer risk.
However, they also found that the effects of the supplements differed substantially between men with low levels at baseline and those with high levels.
Specifically, selenium supplementation increased the risk for prostate cancer in men who already had high selenium levels at baseline.
Before SELECT even began, there was evidence that selenium supplementation would not benefit men who already had an adequate intake of the nutrient.
For this reason, at baseline, the investigators measured the concentration of selenium in the toenails of participants. The plan was to test whether supplementation would benefit only the subset of men with low selenium levels at baseline, they explain.
Instead, they found that men with high selenium levels at baseline who took selenium supplements increased their risk for high-grade cancer by 91% (P = .007). In other words, the levels of selenium in these men became toxic.
The investigators also report that vitamin E increased prostate cancer risk in men, but only in those with low selenium levels at baseline.
Specifically, in the men with low levels of selenium randomized to receive vitamin E alone, the total risk for prostate cancer increased by 63% (P = .02) and the risk for high-grade cancer increased by 111% (P = .01).
This might explain why, in the 2008 SELECT results, only the men randomized to receive vitamin E alone, not those who received both vitamin E and selenium, had an increased risk for prostate cancer.
There is some evidence from basic science to support the idea of a meaningful interaction between the two nutrients. "An interaction between vitamin E and selenium has long been hypothesized because of their activities in preventing lipid peroxidation," Dr. Kristal and colleagues write.
Selenium, whether from dietary sources or supplements, might protect men from the harmful effects of vitamin E, they suggest. So selenium, at low levels, is not necessarily harmful to men.
Nevertheless, these new results are consistent with the medical literature on supplements and cancer, the investigators report. The message is that nothing good is gained in healthy people.
The literature "suggests that effects of supplementation are dependent upon the nutrient status of the target population, such that supplementation of populations with adequate nutrient status, leading to supraphysiological exposure, has either no effect or increases cancer risk," they write.
Adding Flavanols to Dark Chocolate Doesn't Improve Health Benefits
Excuses for regularly indulging in dark chocolate keep accumulating, with the added possibility that its ingredients, thought to promote relaxed arteries, also have biochemical effects that may discourage atherosclerosis. But special flavanol-enriched formulations of dark chocolate may do little to enhance its vascular benefits, suggests a report published in the March 2014 issue of the FASEB Journal.
In a randomized, double-blind study, eating dark chocolate—acutely and over weeks—not only improved objective measures of endothelial function, it also improved biochemical markers that reflect leukocyte activation, inflammation, and other signs of atherogenesis.
The study further compared normal- vs high-flavanol dark-chocolate consumption by its participants, a few dozen overweight and mildly obese middle-aged male volunteers. It saw few important differences in vascular functional or biochemical effects following ingestion of either formulation of chocolate.
Changes in endothelial function were reflected in improved flow-mediated dilation (FMD), blood pressure, and augmentation index (AIX), while "changes in leukocyte-cell counts, plasma cytokines, and leukocyte adherence markers after chocolate consumption point toward a less-activated state of cellular adherence and, hence, a less atherogenic milieu," according to the authors, led by Dr Diederik Esser (Top Institute Food and Nutrition and Wageningen University, the Netherlands).
"Extra flavanols did not augment these effects," they write, "but did affect taste and had a negative effect on the motivation to eat chocolate."
Beneficial vascular changes and even improved clinical outcomes have long been attributed to consumption of flavanol- and polyphenol-rich foods like chocolate, red wine, cinnamon, and tea in both observational and interventional studies. Chocolate itself has been associated with improved blood pressure and cerebral perfusion and a reduction in stroke, heart failure, and other cardiovascular-event outcomes.
Acute Chocolate Effects
In a double-blind crossover randomization, 29 men consumed 70 g of chocolate on each of two days separated by at least one week. On one occasion, the chocolate contained normal levels of flavanols, and on the other it was supplemented with extra flavanols.
Ingestion of both types of chocolate decreased AIX (indicating decreased central aortic pressure) but didn't affect FMD (reflecting endothelial function). Similarly, hematocrit and counts of thrombocytes, lymphocytes, monocytes, and neutrophils went up within two hours. Plasma soluble intercellular adhesion molecule (sICAM) 3, interleukin (IL)-1β, and von Willebrand factor levels went up, and plasma IL-6 levels fell two hours after ingestion of both kinds of chocolate. Insulin also went up after both chocolates, but more so after the high-flavanol kind.
The Four-Week Crossover Trial
The same 29 subjects plus an additional 15 participated in a second double-blind experiment in which four weeks of 70 g per day of either high- or low-flavanol chocolate was followed by a four-week washout period and then crossover to four weeks of the other kind of chocolate. They followed certain dietary restrictions to minimize background flavanol intake.
Four weeks of consuming chocolate of either kind elevated fasting FMD by a percentage point and dropped AIX to a similar extent; both measures reverted to baseline after the washout period. The literature suggests a significant 13% reduction in cardiovascular-event risk for every 1% rise in FMD after chocolate consumption, according to the authors.
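To illustrate that 13%-per-point figure: if the relationship is treated as multiplicative across FMD percentage points (an assumption made here purely for illustration), the roughly one-point rise reported after chocolate would imply a relative cardiovascular-event risk of about 0.87. A minimal Python sketch:

def implied_relative_risk(fmd_rise_points, reduction_per_point=0.13):
    # Relative event risk implied by an FMD rise, assuming the literature's
    # 13%-per-percentage-point figure compounds multiplicatively.
    return (1.0 - reduction_per_point) ** fmd_rise_points

print(f"1-point FMD rise -> implied relative risk {implied_relative_risk(1.0):.2f}")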
In a novel finding, four weeks of chocolate consumption significantly lowered counts of circulating leukocytes, suggesting decreased inflammation; also, "leukocytes can transmigrate through the endothelium and therefore play a crucial role in the formation of atherosclerosis."
Chocolate also lowered protein expression of lymphocytes CD62L and CD11b, monocyte CD62L, and neutrophils CD66b and CD11c. "These cell-surface molecules are involved in leukocyte recruitment and adherence to the endothelium during the initial steps of atherosclerosis," the group writes.
"A minor but significant" 0.1-mL rise in fasting plasma glucose and 0.3-mL increase in plasma free fatty acids was evident after four weeks of chocolate of either kind.
Taking the High-Fat Challenge
"High-fat-challenge" tests conducted after each four-week period of chocolate intake required drinking a 95 fat-gram shake of yogurt, canola oil, and other ingredients followed by endothelial-function and metabolic marker tests at 1.5 and 3.0 hours to assess postprandial responses.
Consumption of the high-fat shake was followed by average declines in FMD by 1.8 percentage points, in AIX by four percentage points, and in diastolic blood pressure by 2 mm Hg, regardless of whether the background chocolate contained normal or high flavanol levels. Also seen at postprandial testing were rises in hematocrit; counts of thrombocytes, lymphocytes, monocytes, and neutrophils; plasma concentrations of sICAM1, soluble vascular cell adhesion molecule (sVCAM) 1, sICAM3, P-selectin, IL-8, and tumor necrosis factor-alpha (TNF-α); and expression of lymphocytes CD11c and CD11b, monocyte CD11c and CD11b, and neutrophils CD62L and CD11b adhesion molecules. At the same time, monocyte CD62L expression and IL-6 plasma levels fell.
"Our results indicate that flavanol-enriched chocolate was not healthier than regular dark chocolate with respect to vascular health markers," according to Esser et al.
Limits on flavanol absorption didn't account for the similar effects, they determined, based on plasma and urine measures of flavanol intake; normal-flavanol–level chocolate may simply attain maximal flavanol effects on vascular health, the group writes.
Wednesday, 12 February 2014
Low-Dose Aspirin Ups Survival in Heart Failure
Daily low-dose aspirin, defined as 75 mg/day, was followed by a 42% mortality reduction over several years in a cohort of patients participating in a heart-failure disease-management program [1]. Aspirin prolonged survival regardless of whether patients had a standard aspirin indication, such as ischemic heart disease, peripheral vascular disease, or stroke, reported investigators in their study, published online February 3, 2014 in Circulation: Heart Failure. There was no such survival benefit, however, at dosages higher than 75 mg/day, although both low- and higher-dose aspirin reduced heart-failure hospitalizations.
The findings contrast with and go beyond prior studies of narrower heart-failure populations treated with aspirin, observe its authors, led by Dr Margaret Bermingham (St Vincent's University Hospital, Dublin, Ireland). Of those randomized trials, they note, WARCEF saw more bleeding and otherwise no clinical advantage for warfarin over aspirin; and in WASH and WATCH, neither aspirin nor warfarin seemed to protect against events.
But the group notes that aspirin in those trials was given at two to four times the low dose of the current analysis, which suggests "that low-dose aspirin may have a continuing role in secondary prevention once patients are diagnosed with HF" and points to a need for "more, preferably prospective, trials of low-dose aspirin use in HF."
Of the 1476 patients in the retrospective analysis, who were followed for a median of 2.15 years (range one day to 12 years), 892 (about 60%) were prescribed aspirin at baseline. Of those, 91% received aspirin at 75 mg/day and the remainder received aspirin at higher dosages.
Another antiplatelet, such as clopidogrel, was also given to 18.4% of the aspirin group, while 27.8% of them took added warfarin, and 2.1% received dual antiplatelets and warfarin.
The findings "add to the controversy on aspirin use in HF by presenting reassuring results on low-dose aspirin use in a clinical-practice population," the group writes. "They challenge the belief that aspirin should be avoided in secondary-prevention patients who go on to develop HF and suggest that patients on higher antiplatelet doses may benefit from dose reduction."
Low fat yoghurt cuts risk of type 2 diabetes
Swapping snacks — such as potato chips — for low-fat yogurt can cut the risk of developing type 2 diabetes by almost half, according to the results of a new observational study.
The findings, involving a randomly selected subgroup from the larger EPIC-Norfolk study, were published online February 5 in Diabetologia by Nita G. Forouhi, MBBS, PhD, from the Medical Research Council (MRC) Epidemiology Unit, University of Cambridge School of Clinical Medicine, United Kingdom, and colleagues.
"A good place to include yogurt in your diet to reap the benefits against onset of diabetes would be instead of a packet of crisps [potato chips].We found that swapping crisps for yogurt offered protection against onset of diabetes. This may form a practical dietary change suggestion for patients at risk of diabetes as well as among the general population," Dr. Forouhi revealed in an email.
In the study — the first to examine the relationship of dairy consumption with incident type 2 diabetes using prospective 7-day food diaries — subjects with the highest low-fat yogurt consumption had a 28% lower risk of developing type 2 diabetes over 11 years of follow-up compared with those who ate very little or no yogurt, after adjustment for a variety of possible confounders.
"Current US dietary guidelines recommend increasing intake of fat-free or low-fat milk and milk products such as milk, yogurt, and cheese… Recommending yogurt intake is therefore in keeping with dietary guidelines for all patients," Dr. Forouhi said.
And, she noted, "At a time when other research has shown that certain foods raise health risks, such as consuming high amounts of added sugar, it is reassuring to have messages about other foods, like yogurt and low-fat fermented dairy products, that could be good for our health."
Dietary Dairy Examined
The study compared a detailed daily record of all the food and drink consumed over a week at the time of study entry among 753 people who developed new-onset type 2 diabetes over 11 years of follow-up, with 3502 randomly selected study participants from EPIC-Norfolk, which comprised more than 25,000 men and women.
At baseline, subjects had a mean age of 59 years and body mass index (BMI) of 26 kg/m²; they filled in detailed diary questionnaires that asked about all foods containing dairy as the main ingredient and the type of dairy product (yogurt, cheese, or milk). These were divided into high-fat and low-fat based on a 3.9% cutoff for total fat content. Fermented dairy products (all yogurt, all cheese, sour cream, and crème fraîche) were also categorized separately into high- and low-fat.
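As a rough sketch of that categorization rule in Python (the food names and fat percentages below are invented examples, not data from the study's diaries):

FAT_CUTOFF_PCT = 3.9  # the study's cutoff for splitting dairy into low- vs high-fat

def categorize_dairy(fat_pct, fermented):
    # Apply the 3.9% total-fat cutoff, with a separate fermented/non-fermented split.
    fat_label = "low-fat" if fat_pct < FAT_CUTOFF_PCT else "high-fat"
    ferment_label = "fermented" if fermented else "non-fermented"
    return f"{fat_label} {ferment_label} dairy"

# Invented diary entries: (name, % total fat, fermented?)
examples = [("skimmed milk", 0.1, False), ("low-fat yogurt", 1.5, True), ("cheddar", 34.0, True)]
for name, fat_pct, fermented in examples:
    print(f"{name}: {categorize_dairy(fat_pct, fermented)}")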
Milk was the most-consumed dairy product, accounting for 82%, followed by cheese (9%) and yogurt (8%). Total average dairy consumption was 269 g/day, of which 65% was low-fat.
Total overall dairy consumption was not associated with the development of diabetes, but low-fat dairy intake was, after adjustment for age and sex. Yet even this became nonsignificant after further adjustment for other confounders including BMI, smoking, alcohol consumption, social class, physical activity, and other dietary components.
Similarly, total fermented dairy-product consumption was associated with a 19% lower risk, but this also became nonsignificant after adjustment for possible confounders.
Low-Fat Fermented Dairy Products Linked to Lower Risk
But after full adjustment, a significant relationship remained between intake of low-fat fermented dairy products and the development of type 2 diabetes, with a hazard ratio of 0.76 for the highest vs lowest tertiles (P for trend = .049).
"In public-health terms this equates to 4.5 standard-size portions (125 g) per week of low-fat fermented dairy products, largely composed of yogurt (all types) and including low-fat unripened cheese such as low-fat cottage cheese and fromage frais," the researchers say.
Low-fat yogurt in particular was associated with a 35% reduced risk after adjustment for age and sex (P for trend < .001), and this relationship remained significant even after adjustment for other potential confounders (HR 0.72, P for trend = .017).
In a separate analysis, the effect of substituting dairy products for snacks — such as cake, pudding, biscuits, or chips — was examined. Eating yogurt instead of such snacks resulted in a 47% lower risk for diabetes, but none of the other substitutions resulted in a significant reduction in diabetes risk.
Several possible mechanisms could explain the relationship between fermented dairy and diabetes, including promotion of the synthesis of menaquinone (vitamin K2), which has been linked to reduced rates of type 2 diabetes, or the actions of probiotic bacteria, which have been found to improve lipid profiles and antioxidant status in patients with type 2 diabetes, the researchers suggest.
Moreover, low-fat fermented dairy products are "naturally low in fat and high in water content and are, therefore, low energy-dense foods. Studies have shown an independent association of low energy-dense foods with lower fasting insulin levels and the metabolic syndrome and a lower risk of type 2 diabetes," they say.
Dr. Forouhi said that the use of 7-day food diaries allows for a much more careful analysis of dairy intake than has been done in prior studies.
"This work gives us important…and more detailed information than our own past research and that of others. We were able to estimate people's diets in much more detail and in real time with all food and drink that was consumed being recorded with a 7-day food diary.
"This gets around an important limitation of the past research, which has relied on participants' recall of foods they have eaten in the past. This gives us greater confidence that low-fat fermented dairy and yogurt specifically within that group can be important for the prevention of diabetes as part of an overall healthy lifestyle," she concluded.
Thursday, 6 February 2014
Coffee Linked to Lower Type 2 Diabetes Risk
Coffee consumption is strongly associated with a reduced risk for type 2 diabetes, regardless of whether the coffee is caffeinated or not, according to a new meta-analysis of 28 prospective studies, published in the February issue of Diabetes Care.
"Compared with no coffee consumption…6 cups/day of coffee was associated with a 33% lower risk of type 2 diabetes," the authors write, adding, "Caffeinated coffee and decaffeinated coffee consumption were both associated with a lower risk of type 2 diabetes."
Drinking coffee has been well-established in previous meta-analyses as being associated with a reduced risk for type 2 diabetes; however, the new review was needed to account for more recent trials evaluating the benefits of caffeinated vs decaffeinated coffee, the authors explain.
"We found that a 1-cup/day increment of regular coffee was associated with a 9% reduction in diabetes, and 1 cup/day of decaf was associated with 6% reduction in diabetes, but the difference in risk reduction between the 2 types of coffee was not statistically different," said senior author Frank B. Hu, MD, PhD, a professor of nutrition and epidemiology with the Harvard School of Public Health in Boston, Massachusetts.
"This study provides strong evidence that regular consumption of coffee is beneficial for prevention of diabetes," Dr. Hu said. "For individuals who already drink coffee, they may enjoy this and other potential health benefits, and for those who are sensitive to the effects of caffeine, decaf may confer similar benefits."
Robust Findings
The 28 studies in the analysis included 1,109,272 participants, and all had the outcome of risk for type 2 diabetes; there were 45,335 cases of diabetes, with follow-up ranging from 10 months to 20 years.
The relative risk for type 2 diabetes with coffee consumption, compared with no or rare consumption, was 0.92 for 1 cup per day, 0.85 for 2 cups, 0.79 for 3 cups, 0.75 for 4 cups, 0.71 for 5 cups, and 0.67 for 6 cups.
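These figures are roughly consistent with a constant proportional effect per daily cup. Below is a back-of-envelope Python sketch (not the authors' method, which modeled caffeinated and decaffeinated coffee separately) that fits a log-linear dose-response through the published relative risks:

import math

cups = [1, 2, 3, 4, 5, 6]
relative_risk = [0.92, 0.85, 0.79, 0.75, 0.71, 0.67]  # vs no or rare consumption

# Least-squares slope of ln(RR) on cups/day, constrained through RR = 1 at zero cups.
slope = sum(c * math.log(rr) for c, rr in zip(cups, relative_risk)) / sum(c * c for c in cups)
per_cup_rr = math.exp(slope)

print(f"Implied relative risk per daily cup: {per_cup_rr:.2f} "
      f"(~{100 * (1 - per_cup_rr):.0f}% lower risk per cup)")

The implied figure of roughly 7% per cup sits between the 9% and 6% per-cup reductions the authors report for caffeinated and decaffeinated coffee, respectively.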
Meanwhile, the relative risk for diabetes associated with 1 cup of caffeinated coffee per day (compared with no or rare coffee consumption) was 0.91, compared with 0.94 for 1 cup of decaffeinated coffee per day (P = .17).
The findings were consistent for men as well as women, and while coffee-brewing methods were not assessed in the studies, the inclusion of diverse populations likely covered a wide array of methods, the authors note.
"Most coffee is likely to be filtered coffee, and the results from studies conducted in various populations, including US, European, and Asian, were similar, indicating consistency of the results despite potentially different preparation and processing methods," they observe.
And while none of the studies assessed levels of sugar and dairy added to coffee, the amounts are "likely to be small compared with other food sources," they add.
While this meta-analysis does provide "strong evidence" that higher consumption of coffee is associated with a significantly lower risk for diabetes, "longer-term randomised controlled trials are needed to establish causality and to elucidate the underlying mechanisms," they conclude.
Coffee Just a Small Part of the Diabetes Equation
One possible explanation for the reduced risk for diabetes with coffee consumption could be the role of chlorogenic acid, a phenolic compound and a major component of coffee, Dr. Hu said.
"Chlorogenic acid has been shown to improve insulin sensitivity and reduce blood sugar absorption," he said.
"Compounds in coffee also include antioxidant polyphenols, minerals such as magnesium and chromium, [and] vitamins; however, it is impossible to tease out the effects of individual compounds, because they don't exist in isolation in coffee and they may have synergistic effects."
He stressed also that coffee consumption remains a small piece of the picture.
"Coffee is only one of many dietary and lifestyle factors that can contribute to diabetes prevention. Clearly, maintaining a healthy weight through diet and exercise is the most important way to reduce risk of diabetes. For those who drink coffee regularly, they should enjoy it, but they still need to watch their weight and be physically active."
"Compared with no coffee consumption…6 cups/day of coffee was associated with a 33% lower risk of type 2 diabetes," the authors write, adding, "Caffeinated coffee and decaffeinated coffee consumption were both associated with a lower risk of type 2 diabetes."
Drinking coffee has been well-established in previous meta-analyses as being associated with a reduced risk for type 2 diabetes; however, the new review was needed to account for more recent trials evaluating the benefits of caffeinated vs decaffeinated coffee, the authors explain.
"We found that a 1-cup/day increment of regular coffee was associated with a 9% reduction in diabetes, and 1 cup/day of decaf was associated with 6% reduction in diabetes, but the difference in risk reduction between the 2 types of coffee was not statistically different," said senior author Frank B. Hu, MD, PhD, a professor of nutrition and epidemiology with the Harvard School of Public Health in Boston, Massachusetts.
"This study provides strong evidence that regular consumption of coffee is beneficial for prevention of diabetes," Dr. Hu said. "For individuals who already drink coffee, they may enjoy this and other potential health benefits, and for those who are sensitive to the effects of caffeine, decaf may confer similar benefits."
Robust Findings
The 28 studies in the analysis included 1,109,272 participants, and all had the outcome of risk for type 2 diabetes; there were 45,335 cases of diabetes, with follow-up ranging from 10 months to 20 years.
The relative risk for type 2 diabetes with coffee consumption, compared with no or rare consumption, ranged from 0.92 for 1 cup per day, 0.85 for 2 cups, 0.79 for 3 cups, 0.75 for 4 cups, 0.71 for 5 cups, and 0.67 for 6 cups.
Meanwhile, the relative risk for diabetes associated with 1 cup of caffeinated coffee per day (compared to no or rare coffee consumption) was 0.91 compared with 0.94 for 1 cup of decaffeinated coffee per day ( P = .17).
The findings were consistent for men as well as women, and while coffee-brewing methods were not assessed in the studies, the inclusion of diverse populations likely covered a wide array of methods, the authors note.
"Most coffee is likely to be filtered coffee, and the results from studies conducted in various populations, including US, European, and Asian, were similar, indicating consistency of the results despite potentially different preparation and processing methods," they observe.
And while none of the studies assessed levels of sugar and dairy added to coffee, the amounts are "likely to be small compared with other food sources," they add.
While this meta-analysis does provide "strong evidence" that higher consumption of coffee is associated with a significantly lower risk for diabetes," longer-term randomised controlled trials are needed to establish causality and to elucidate the underlying mechanisms," they conclude.
Coffee Just a Small Part of the Diabetes Equation
One possible explanation for the reduced risk for diabetes with coffee consumption could be the role of chlorogenic acid, a phenolic compound and a major component of coffee, Dr. Hu said.
"Chlorogenic acid has been shown to improve insulin sensitivity and reduce blood sugar absorption," he said.
"Compounds in coffee also include antioxidant polyphenols, minerals such as magnesium and chromium, [and] vitamins; however, it is impossible to tease out the effects of individual compounds, because they don't exist in isolation in coffee and they may have synergistic effects."
He stressed also that coffee consumption remains a small piece of the picture.
"Coffee is only one of many dietary and lifestyle factors that can contribute to diabetes prevention. Clearly, maintaining a healthy weight through diet and exercise is the most important way to reduce risk of diabetes. For those who drink coffee regularly, they should enjoy it, but they still need to watch their weight and be physically active."
Resistance training boosts testosterone in older men
Resistance training can increase the production of sex steroids in the muscles of older men, a new study shows.
If borne out by further research, the findings could have important implications for a wide range of conditions experienced by many aging individuals and linked to a decline in sex steroid hormones, say Koji Sato, PhD, and colleagues from Ritsumeikan University in Kyoto, Japan, in their report published online January 17 in the FASEB Journal (the Journal of the Federation of American Societies for Experimental Biology).
"Resistance-training–induced increased muscular sex steroid hormone may positively affect age-related concerns such as accidental falls, diabetes, sarcopenia, and osteoporosis and may improve the quality of life for older individuals," they note.
More and more men are using testosterone supplements in the hope of improving their strength, energy, and sexual performance. But this use has become increasingly controversial, because many experts believe such supplementation is not necessary and there is also mounting evidence of concerning side effects associated with the use of testosterone, particularly in older individuals.
Just last week, the US Food and Drug Administration (FDA) announced it is investigating the potential that FDA-approved testosterone products increase the risk for serious adverse cardiovascular outcomes. The agency cited two studies, the most recent of which shows that men treated with testosterone were significantly more likely to have a myocardial infarction (MI) in the first 90 days after starting the medication.
The second study, of Veterans Affairs (VA) patients, published last year, also found that testosterone therapy in men was linked with an increased risk of death, MI, or ischemic stroke.
Sex Hormones in Skeletal Muscles Boosted by Training
In their new paper, Dr. Sato and colleagues explain that while the testes, ovaries, and adrenal cortex produce most steroid sex hormones in humans, recent animal work has shown that skeletal muscle can also synthesize testosterone, estradiol, dehydroepiandrosterone (DHEA), and dihydrotestosterone (DHT). However, a previous study in men and women in their 20s showed no immediate sign of an increase in these hormones in the quadriceps muscle following training (J Appl Physiol. 2008;105:1754-1760).
To see what effect resistance training might have on older men over a longer period of training, the Japanese researchers recruited 13 men with a mean age of 67.2 years and 6 men with a mean age of 24.3.
The subjects were moderately active, but none regularly performed resistance exercises.
The older men performed knee extensions and flexions 3 times a week on alternate days, using starting weights that were 70% of each subject's single-repetition maximum strength. Trainers retested this measurement and adjusted weights for each of these men every 4 weeks. The study lasted 12 weeks.
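A minimal Python sketch of that loading scheme; the 1RM values, and the assumption that they rise at each 4-week retest, are hypothetical and only meant to show how the working weight is derived:

TRAINING_INTENSITY = 0.70  # fraction of the single-repetition maximum used for training sets

def working_weight(one_rep_max_kg):
    # Working weight prescribed at 70% of the most recent 1RM test.
    return round(one_rep_max_kg * TRAINING_INTENSITY, 1)

# Hypothetical 1RM values, retested every 4 weeks over the 12-week program.
one_rep_max_by_block = {"weeks 1-4": 60.0, "weeks 5-8": 66.0, "weeks 9-12": 72.0}  # kg
for block, one_rm in one_rep_max_by_block.items():
    print(f"{block}: 1RM {one_rm} kg -> train at {working_weight(one_rm)} kg, 3 sessions/week")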
Levels of sex steroid hormones and precursors in each man's vastus lateralis (the largest muscle in the quadriceps) were measured by biopsy before the weight training. The same measurement was taken again in the older men after the weight training.
The researchers also sampled the hormones in the men's blood and measured the size of their quadriceps before and after the training.
The older men had much lower levels of steroidogenic enzymes, sex steroid hormones, and insulin-like growth factor 1 (IGF-1) than the younger men, in both their blood and their muscles. The older men were also weaker and had smaller quadriceps.
Nevertheless, the levels of sex steroid hormones increased significantly in the older men's muscles after the weight training. The older men's isokinetic extension strength and quadriceps sizes also increased significantly.
Serum Testosterone Not Significantly Increased
In the blood, however, although the older men had significantly increased serum DHEA and DHT levels after training (P < .01), their serum free testosterone did not rise significantly (P = .052), nor did serum IGF-1 (P = .055).
Levels of steroidogenic enzymes, as well as their androgen-receptor protein expression, did significantly increase, however.
The muscular steroid hormone levels significantly correlated with muscle strength and size, leading the investigators to conclude that "progressive resistance training seems to restore muscle sex steroid hormone levels via enhancement of steroidogenesis-related enzyme expressions in the skeletal muscle and may partly contribute to the increase in muscle strength and [size]."
Asked to comment on the findings, Jakob Vingren, PhD, an associate professor of kinesiology at the University of North Texas in Denton, said that the results reinforce the importance of resistance training. "There are definitely benefits to aerobic exercises and there are definitely benefits to resistance training, and they combine, but there are unique benefits to both," he said.
Dr. Vingren added that men who are on androgen-deprivation therapy, for example, for prostate and other cancers, might particularly benefit from resistance training, since these findings indicate it could help strengthen them without increasing their serum testosterone.
Wednesday, 22 January 2014
Berries, tea, grapes and wine lower diabetes risk
A new study in healthy women suggests that consuming high levels of flavonoids, including compounds found in berries, tea, grapes, and wine, could potentially lower the risk of type 2 diabetes.
The study, published in the February issue of the Journal of Nutrition, indicates that greater intake of these dietary compounds is associated with lower insulin resistance and better blood glucose regulation. The researchers, led by Amy Jennings, PhD, from the department of nutrition, University of East Anglia, Norwich, United Kingdom, say their study is one of the first to examine consumption of different flavonoid subclasses and insulin resistance.
"We found that those who consumed plenty of anthocyanins and flavones had lower insulin resistance. So what we are seeing is that people who eat foods rich in these 2 compounds — such as berries, herbs, red grapes, wine — are less likely to develop the disease," said senior author Aedin Cassidy, PhD, also from the department of nutrition, University of East Anglia, in a statement.
Researchers also found that those who ate the most anthocyanins were least likely to suffer chronic inflammation, which is associated with diabetes, obesity, cardiovascular disease, and cancer. And those who consumed the most flavone compounds had improved levels of adiponectin, which helps regulate a number of metabolic processes, including glucose levels, Dr. Cassidy noted.
Importantly, the difference between the highest and lowest intakes of foods containing these compounds was small, consisting of just one portion of grapes or berries or a couple of oranges, say the authors. Also, the effects on insulin associated with high consumption of such foods were equivalent to those observed for other lifestyle factors, such as an hour's walk a day or a low-fat diet for a year, they noted.
Nevertheless, Dr. Cassidy said it is not yet known exactly how much of these compounds is needed to potentially reduce the risk of type 2 diabetes. "Dose–response trials are needed to ascertain optimal intakes for the potential reduction of type 2 diabetes risk," she and her colleagues stress.
One of the First Large Human Studies of Flavonoid Subclasses
Researchers note that a previous prospective study, published last year in the American Journal of Clinical Nutrition, suggested a 15% reduction in type 2 diabetes risk when comparing the highest and lowest quintiles of anthocyanin intake. However, the researchers emphasize that their current study is one of the first large-scale human studies to examine all subclasses of these powerful bioactive compounds to see how they might affect insulin resistance, blood glucose regulation, and inflammation.
The cross-sectional study was conducted in almost 2000 women aged 18 to 76 years from the Twins UK registry. Women who had high glucose levels were excluded. Participants completed a 131-item food-frequency questionnaire, from which flavonoid intakes were estimated using a United States Department of Agriculture database.
The researchers looked at the self-reported intake of 6 subclasses of flavonoids: flavanones, anthocyanins, flavan-3-ols, polymeric flavonoids, flavonols, and flavones.
In multivariable analyses, higher anthocyanin and flavone intakes were associated with significantly lower peripheral insulin resistance (homeostasis model assessment of insulin resistance; quintile 5 [Q5] vs Q1 = −0.1, P-trend = .04 for anthocyanins and flavones), as a result of a decrease in insulin concentrations (Q5–Q1 = −0.7 mU/mL, P-trend = .02 for anthocyanins; Q5–Q1 = −0.5 mU/mL, P-trend = .02 for flavones).
Tea was the main source of overall flavonoid intake, with 4 foods contributing more than 10% of anthocyanin intake (grapes, pears, berries, and wine) and 3 foods making up more than 10% of flavone consumption (oranges, wine, and peppers).
Higher anthocyanin intake was also associated with lower high-sensitivity C-reactive protein (hs-CRP) levels (Q5–Q1 = −0.3 mg/L, P-trend = .04), whereas those in the highest quintile of flavone intake had improved adiponectin levels (Q5–Q1 = 0.7 mg/L, P-trend = .01).
Higher intakes of both anthocyanins and flavones were associated with improvements in insulin resistance and hs-CRP, the researchers note.
No significant associations were observed for total or other flavonoid subclasses.
Findings Are Clinically Relevant, Easy to Achieve
Although these findings are from cross-sectional data and require confirmation, they are clinically relevant because of the 0.7-mU/mL difference in insulin observed between the top and bottom quintiles of anthocyanin intake, the researchers note.
The difference in anthocyanin intake between the top and bottom quintiles was 35 mg, which can be readily incorporated into the diet by consuming approximately one portion of grapes (78 g) or berries, such as strawberries (105 g), raspberries (90 g), blueberries (21 g), or blackberries (39 g).
Similarly, the difference in flavones between the top and bottom quintiles was 3.6 mg, equivalent to that found in approximately 2.5 oranges.
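Those portion equivalences follow from straightforward arithmetic on the quintile differences. A small Python sketch that back-calculates the flavonoid content implied by the article's own figures:

ANTHOCYANIN_DIFF_MG = 35.0  # difference between top and bottom quintiles
FLAVONE_DIFF_MG = 3.6

# Portion sizes the authors say would supply the 35-mg anthocyanin difference.
anthocyanin_portions_g = {"grapes": 78, "strawberries": 105, "raspberries": 90,
                          "blueberries": 21, "blackberries": 39}

for food, grams in anthocyanin_portions_g.items():
    implied_density = ANTHOCYANIN_DIFF_MG / grams * 100  # mg anthocyanins per 100 g
    print(f"{food}: ~{implied_density:.0f} mg anthocyanins per 100 g implied")

# 3.6 mg of flavones across ~2.5 oranges implies roughly 1.4 mg of flavones per orange.
print(f"flavones per orange: ~{FLAVONE_DIFF_MG / 2.5:.1f} mg")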
These results "are of public-health importance because the intakes associated with these findings are easily achievable through the habitual diet" and make a significant contribution to the knowledge base needed to refine the current fruit and vegetable dietary recommendations, the authors conclude.
Tuesday, 21 January 2014
Chinese herbal formula reduces progression to diabetes by 32%
A combination of 10 Chinese medicinal herbs in a capsule, known as Tianqi, reduced progression to type 2 diabetes in people with impaired glucose tolerance (IGT) in a randomized controlled trial in China.
This is the first study to show that a Chinese herbal medicine can "reduce the progression of prediabetes to diabetes," says study author Chun-Su Yuan, MD, PhD, from the Tang Center for Herbal Medicine Research at the University of Chicago, Illinois. Tianqi "could provide a new option for diabetes management, using herbal medicine alone or as an adjuvant to currently used therapies," he noted.
The results are published online January 16 in the Journal of Clinical Endocrinology & Metabolism, and Dr. Yuan said robust data such as these are needed to help promote the use of Chinese herbal medicine by physicians in different countries.
Most significantly, the researchers found that taking Tianqi reduced the risk for diabetes by almost a third (32.1%) compared with placebo, after adjustment for age and gender.
A Role for Chinese Medicine in Diabetes Prevention?
The findings show that the Chinese herbal medicine was comparable to some pharmaceuticals in reducing progression to type 2 diabetes, say the researchers. For example, the results seen with Tianqi were similar to those found with acarbose, at 25%, and metformin, at 31%.
"Although no direct comparison has been made between Tianqi and antidiabetic prescription drugs, our data indicate that this Chinese herbal medicine had similar effects to metformin," reported Dr. Yuan.
Asked to comment on whether diabetes prevention was regularly practiced in the United States, he remarked that unacceptable adverse effects limited regular use of conventional therapies in the prevention of type 2 diabetes, with reports showing that long-term administration of acarbose or metformin had often been associated with unfavorable gastrointestinal events.
Around 79 million individuals in the United States aged over 20 years have prediabetes, a state in which blood glucose levels are higher than normal but do not meet the diagnostic criteria for type 2 diabetes.
Dr. Yuan added that their data also show that after a period of cessation of the Tianqi treatment, the preventive effects on type 2 diabetes development remained significant. "Moreover, the safety profile of this herbal medicine is very good without obvious adverse effects," he commented.
Chinese Herb Combination and Study Design
The Chinese medicine comprises several herbs that have been shown to lower blood glucose levels after meals. The Tianqi capsule is manufactured by Heilongjiang Baoquan Pharmaceutical and consists of 10 Chinese herbal medicines: Astragali Radix, Coptidis Rhizoma, Trichosanthis Radix, Ligustri Lucidi Fructus, Dendrobii Caulis, Ginseng Radix, Lycii Cortex, Ecliptae Herba, Galla Chinensis, and Corni Fructus. The quality of these herbs and decoction preparation was in accordance with the Chinese Pharmacopoeia, the researchers note.
Dr. Yuan said the key herb in the combination was Huanglian (Coptidis Rhizoma). "The critical component of this herb is berberine, which has been reported to have good antidiabetic effects," he said in an interview. "Huanglian has been used traditionally in Chinese medicine in treating diabetic symptoms."
A total of 420 participants with IGT recruited from 11 research sites in China underwent double-blind randomization to receive either Tianqi (n=210) or a placebo (n=210) for 12 months. Participants had IGT with a 2-hour plasma glucose concentration of 7.8–11.1 mmol/L after a 75-g oral glucose tolerance test and a fasting plasma glucose of less than 7.0 mmol/L.
Oral glucose tolerance tests were conducted every 3 months to assess the development of diabetes or restoration to normal glucose tolerance. In addition, all participants received similar lifestyle education.
The primary end point was the conversion of IGT to type 2 diabetes; body weight, body mass index, and adverse effects were monitored.
Need for More Controlled Trials of Chinese Medicinal Herbs
"Although the results of the present study need to be confirmed in future larger clinical trials, Tianqi holds promising potential as an effective and practical means to prevent type 2 diabetes, particularly in places in which herbal medicines are culturally accepted and widely used," say the authors.
They note that treating diabetes with Chinese herbal medicines is popular in China, particularly in rural areas.
"Our encouraging data should initiate further studies, both in China and in the West, to evaluate the role of Chinese herbal medicine in preventing and treating diabetes," said Dr. Yuan. "We are currently conducting several studies in this field."
Wednesday, 8 January 2014
High Blood Glucose Linked to Dementia Even Without Diabetes
Higher blood glucose levels, shy of the diabetic range, increase the risk for dementia in adults, new data from the longitudinal Adult Changes in Thought (ACT) study suggest.
"We considered blood glucose levels far into the normal (nondiabetic) range, and even there found an association between higher glucose levels and dementia risk," said Paul K. Crane, MD, associate professor of medicine at the University of Washington School of Medicine, Seattle, and affiliate investigator at the Group Health Research Institute in Seattle.
He said the results suggest that the "clinical determination of diabetes/not diabetes may miss important associations still there for people who are categorized as not having diabetes."
Diabetes Already a Risk Factor
Although diabetes is a risk factor for dementia, it's been unclear whether higher glucose levels, short of diabetes, also increase the risk for dementia. "Most studies that have investigated associations between glucose metabolism and the risk of dementia have focused on diabetes itself, and they have yielded inconsistent results," the authors note.
For their study, Dr. Crane and colleagues had access to 35,264 clinical measurements of glucose levels and 10,208 measurements of glycated hemoglobin from 839 men and 1228 women without dementia at baseline. Their mean age at baseline was 76 years.
During a median follow-up of 6.8 years, 524 participants developed dementia, including 74 with diabetes and 450 without.
Among participants without diabetes, higher average glucose levels within the preceding 5 years were associated with increased risk for dementia (P = .01). At a glucose level of 115 mg/dL, as compared with 100 mg/dL, the adjusted hazard ratio for dementia was 1.18 (95% confidence interval [CI], 1.04 - 1.33).
Higher average glucose levels were also related to an increased risk for dementia in those with diabetes (P = .002). At a glucose level of 190 mg/dL, as compared with 160 mg/dL, the adjusted hazard ratio for dementia risk was 1.40 (95% CI, 1.12 - 1.76). The findings were consistent across a variety of sensitivity analyses, the authors say.
The data suggest that higher levels of glucose may have "deleterious effects on the aging brain," the authors say.
The ACT study is "unique in being linked to clinical data from a healthcare delivery system along with research-quality dementia diagnoses," said Dr. Crane.
The findings are based on an average of 17 blood glucose measurements per person, "very rich data," senior author Eric B. Larson, MD, from Group Health Research Institute and University of Washington, added in a statement.
"This work is increasingly relevant because of the worldwide epidemics of dementia, obesity and diabetes," Dr. Crane said.
Modest Effect
Last month, in a publication in JAMA Neurology, data from the Baltimore Longitudinal Study of Aging failed to show a link between glucose intolerance, diabetes, or insulin resistance and Alzheimer's disease or amyloid-β accumulation in the brain.
Richard J. O'Brien, MD, PhD, chair, Department of Neurology, Johns Hopkins Bayview Medical Center, Baltimore, Maryland, was an author on that study. Asked for his thoughts on the ACT data, Dr. O'Brien said it should be noted that "the effect is modest but they have a much larger sample size and we could have missed a small effect like that."
He said it should also be noted that the outcomes of the 2 studies are different. "Their outcome is dementia — they have no way of knowing about the specific Alzheimer's component to that outcome. It is certainly possible that vascular disease related to diabetes is the important factor increasing the rate of dementia. Dementia is often multifactorial — with Alzheimer's and vascular disease being the two biggest contributors," he noted. "The vascular component is difficult to detect clinically because it is often asymptomatic — only pathology can detect it."
Dr. Crane and colleagues say the underlying mechanisms of the association between elevated glucose levels and dementia need to be clarified in future studies.
"This was an observational study, and clinical practice really should not change based on the results of this study," Dr. Crane cautioned. "We do not know whether interventions designed to lower blood glucose would also lower dementia risk. That important question requires a different study design.
"However, from other research, including research from the ACT study, we know that exercise is associated with reduced dementia risk," he added. "Exercises such as walking or aerobic exercises in swimming pools are well tolerated and fun."
"We considered blood glucose levels far into the normal (nondiabetic) range, and even there found an association between higher glucose levels and dementia risk," said Paul K. Crane, MD, associate professor of medicine at the University of Washington School of Medicine, Seattle, and affiliate investigator at the Group Health Research Institute in Seattle.
He said the results suggest that the "clinical determination of diabetes/not diabetes may miss important associations still there for people who are categorized as not having diabetes."
Diabetes Already a Risk Factor
Although diabetes is a risk factor for dementia, it's been unclear whether higher glucose levels, short of diabetes, also increase the risk for dementia. "Most studies that have investigated associations between glucose metabolism and the risk of dementia have focused on diabetes itself, and they have yielded inconsistent results," the authors note.
For their study, Dr. Crane and colleagues had access to 35,264 clinical measurements of glucose levels and 10,208 measurements of glycated hemoglobin from 839 men and 1228 women without dementia at baseline. Their mean age at baseline was 76 years.
During a median follow-up of 6.8 years, 524 participants developed dementia, including 74 with diabetes and 450 without.
Among participants without diabetes, higher average glucose levels within the preceding 5 years were associated with increased risk for dementia (P = .01). At a glucose level of 115 mg/dL, as compared with 100 mg/dL, the adjusted hazard ratio for dementia was 1.18 (95% confidence interval [CI], 1.04 - 1.33).
Higher average glucose levels were also related to an increased risk for dementia in those with diabetes (P = .002). At a glucose level of 190 mg/dL, as compared with 160 mg/dL, the adjusted hazard ratio for dementia risk was 1.40 (95% CI, 1.12 - 1.76). The findings were consistent across of variety of sensitivity analyses, the authors say.
The data suggest that higher levels of glucose may have "deleterious effects on the aging brain," the authors say.
The ACT study is "unique in being linked to clinical data from a healthcare delivery system along with research-quality dementia diagnoses," said Dr. Crane.
The findings are based on an average of 17 blood glucose measurements per person, "very rich data," senior author Eric B. Larson, MD, from Group Health Research Institute and University of Washington, added in a statement.
"This work is increasingly relevant because of the worldwide epidemics of dementia, obesity and diabetes," Dr. Crane said.
Modest Effect
Last month in a publication in JAMA Neurology, data from the Baltimore Longitudinal Study of Aging failed to show a link between glucose intolerance, diabetes or insulin resistance Alzheimer's disease or amyloid-β accumulation in the brain.
Richard J. O'Brien, MD, PhD, chair, Department of Neurology, Johns Hopkins Bayview Medical Center, Baltimore, Maryland, was an author on that study. Asked for his thoughts on the ACT data, Dr. O'Brien said it should be noted that "the effect is modest but they have a much larger sample size and we could have missed a small effect like that."
He said it should also be noted that the outcomes of the 2 studies are different. "Their outcome is dementia — they have no way of knowing about the specific Alzheimer's component to that outcome. It is certainly possible that vascular disease related to diabetes is the important factor increasing the rate of dementia. Dementia is often multifactorial — with Alzheimer's and vascular disease being the two biggest contributors," he noted. "The vascular component is difficult to detect clinically because it is often asymptomatic — only pathology can detect it."
Dr. Crane and colleagues say the underlying mechanisms of the association between elevated glucose levels and dementia need to be clarified in future studies.
"This was an observational study, and clinical practice really should not change based on the results of this study," Dr. Crane cautioned. "We do not know whether interventions designed to lower blood glucose would also lower dementia risk. That important question requires a different study design.
"However, from other research, including research from the ACT study, we know that exercise is associated with reduced dementia risk," he added. "Exercises such as walking or aerobic exercises in swimming pools are well tolerated and fun."
Chocolate reduces stroke risk in men
More evidence supports a link between chocolate consumption and reduced stroke risk.
A new study shows that men who consume the most chocolate have a 17% lower risk for stroke than those who consume the least. A meta-analysis included in the study showed an overall 19% decreased risk for stroke for the highest consumers of chocolate — male and female — compared with those who ate the least.
Although results "suggest that chocolate consumption is inversely associated with risk of stroke," further research is needed to confirm these findings before any recommendations about chocolate consumption can be given, according to the authors, led by Susanna C. Larsson, PhD, of the Division of Nutritional Epidemiology, National Institute for Health and Welfare, in Helsinki, Finland.
Chocolate Categories
The study included 37,103 men from the Cohort of Swedish Men, a prospective study that began in 1997, when all men aged 45 to 79 years residing in 2 counties in central Sweden were asked to provide detailed information on diet and other lifestyle factors. Researchers assessed chocolate consumption using a self-administered food frequency questionnaire that included 96 foods and beverages.
For chocolate consumption, respondents could choose from 8 categories, ranging from never to eating this confection 3 or more times per day. To determine consumption in grams, researchers multiplied the frequency of consumption by age-specific portion sizes (defined for 4 age groups) and grouped men into quartiles of chocolate consumption: less than 12.0 g/week (median, 0 g); 12.0 to 19.5 g/week (median, 12.5 g); 19.8 to 51.3 g/week (median, 38.4 g); and at least 51.8 g/week (median, 62.9 g).
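The grams-per-week figure described above is essentially questionnaire frequency multiplied by an age-specific portion size, followed by binning into the quoted categories; below is a minimal sketch under that reading (the function names and the example respondent are hypothetical, and the cut points are the article's quoted category boundaries).

```python
def weekly_chocolate_grams(servings_per_week: float, portion_size_g: float) -> float:
    """Convert questionnaire frequency to grams/week: frequency x age-specific portion size."""
    return servings_per_week * portion_size_g

def chocolate_category(grams_per_week: float) -> int:
    """Assign one of the 4 consumption categories using the article's quoted boundaries."""
    lower_bounds = [12.0, 19.8, 51.8]  # lower bounds of categories 2, 3, and 4 (g/week)
    category = 1
    for bound in lower_bounds:
        if grams_per_week >= bound:
            category += 1
    return category

# Hypothetical respondent: 2 servings/week with a 30-g age-specific portion -> 60 g/week -> category 4
print(chocolate_category(weekly_chocolate_grams(2, 30)))
```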
By linking the study population to the Swedish Hospital Discharge Register, researchers identified incident cases of stroke. They classified stroke events as cerebral infarction, intracerebral hemorrhage, subarachnoid hemorrhage, and unspecified stroke.
From 1998 to 2008, 1995 cases of first stroke were reported, including 1511 cerebral infarctions, 321 hemorrhagic strokes, and 163 unspecified strokes. Compared with men who had the lowest chocolate consumption, those who ate the most chocolate were younger on average but were less likely to be current smokers. They tended to be slightly leaner, and to consume more alcohol, red meat, fruits, and vegetables, but less fish.
Adjusted Risk
After adjustment for several factors, including age, education, smoking status, body mass index, physical activity, aspirin use, history of hypertension, and atrial fibrillation, high chocolate consumption was associated with a statistically significant lower risk for total stroke.
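Adjusted associations like the one above typically come from a multivariable Cox proportional hazards model; the sketch below shows that general approach using the lifelines package and entirely synthetic data, with hypothetical covariate names rather than the cohort's actual variables or results.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 1000
data = pd.DataFrame({
    "chocolate_quartile": rng.integers(1, 5, n),   # exposure of interest (1 = lowest, 4 = highest)
    "age": rng.normal(60, 8, n),                   # example covariates adjusted for
    "current_smoker": rng.integers(0, 2, n),
    "bmi": rng.normal(26, 4, n),
    "followup_years": rng.exponential(10, n),      # time to stroke or censoring
    "stroke": rng.integers(0, 2, n),               # event indicator
})

cph = CoxPHFitter()
cph.fit(data, duration_col="followup_years", event_col="stroke")
print(cph.hazard_ratios_)  # adjusted hazard ratios, one per covariate
```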
Compared with men in the lowest quartile of consumption (0 g/week), those in the highest category (62.9 g/week) had a reduced risk for stroke of 17%. Results were similar for cerebral infarction and for hemorrhagic stroke.
Age-standardized incidence rates of stroke were 85 per 100,000 person-years for those consuming the least chocolate, and 73 per 100,000 person-years for those in the highest quartile.
In an analysis stratified by history of hypertension, an inverse relation between chocolate consumption and risk for total stroke was observed in men without hypertension, but not in men with a history of hypertension.
It is possible that the blood pressure–lowering effect of chocolate consumption helps explain the association in men without a history of hypertension; men with a history of hypertension may already have had normal blood pressure at baseline as a result of antihypertensive treatment, which could mask any additional benefit, say the authors.
The researchers also carried out a meta-analysis that included results from the current study as well as from 4 other prospective studies assessing the association between chocolate consumption and stroke risk. In this analysis, which included 4260 stroke cases, the overall RR for stroke for the highest vs lowest category of chocolate consumption was 0.81, or a 19% reduction in risk.
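Pooling relative risks across studies, as in the meta-analysis above, is conventionally done on the log scale with inverse-variance weights; a minimal fixed-effect sketch follows, with hypothetical study-level estimates rather than the five studies actually pooled.

```python
import math

def pooled_relative_risk(rrs, ci_lowers, ci_uppers):
    """Fixed-effect inverse-variance pooling of relative risks on the log scale.
    Each study's standard error is recovered from its 95% CI:
    SE = (ln(upper) - ln(lower)) / (2 * 1.96)."""
    log_rrs = [math.log(r) for r in rrs]
    ses = [(math.log(u) - math.log(l)) / (2 * 1.96) for l, u in zip(ci_lowers, ci_uppers)]
    weights = [1.0 / se ** 2 for se in ses]
    pooled_log = sum(w * lr for w, lr in zip(weights, log_rrs)) / sum(weights)
    return math.exp(pooled_log)

# Hypothetical inputs for three studies (illustrative only)
print(round(pooled_relative_risk([0.78, 0.83, 0.90], [0.64, 0.68, 0.74], [0.95, 1.02, 1.10]), 2))
```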
Chocolate is a rich source of flavonoids that may protect against cardiovascular disease through antioxidant, antiplatelet, and anti-inflammatory effects. Flavonoids may also lower blood pressure, increase high-density lipoprotein (HDL) cholesterol, and improve endothelial function. Chocolate also contains caffeine, albeit in low amounts.
One of the limitations of the study is that chocolate consumption was self-reported and was measured only at a single time point. Another is that information on the type of chocolate consumed was not available. As a result, any association with dark chocolate, previously linked to health benefits, could not be examined, but in general, about 90% of chocolate consumed in Sweden is milk chocolate, containing about 30% cocoa solids, they write.
The authors stress also that chocolate is high in sugar, saturated fat, and calories and should be consumed in moderation.
Confirms Observations
Reached for a comment, Gustavo Saposnik, MD, director of the Stroke Outcomes Research Center, and associate professor and clinician scientist in the Departments of Medicine (Neurology) and Health Policy, Management, and Evaluation, at St. Michael's Hospital, University of Toronto, in Ontario, Canada, said the present study confirms previous observations, including those of his own systematic review, that eating chocolate is associated with lower risk for stroke.
Results of his and his colleagues' review, presented at the American Academy of Neurology meeting in 2010, showed that in one study a 22% reduction in stroke risk was noted for participants who had a single serving of chocolate per week, and in another, a 46% reduction in stroke mortality was seen with weekly consumption of the flavonoids in 50 g of chocolate vs no consumption. However, the number of studies examining the relationship between chocolate consumption and stroke risk that could be included in the analysis was small.
Dr. Saposnik found it interesting that in the current study, even the relatively modest amounts eaten by the heaviest consumers (median, 62.9 g/week) were associated with a 17% reduction in stroke risk.
He noted that the study was limited by the use of questionnaires that were not specifically intended to target the association between chocolate and risk for stroke. This, he said, could introduce potential risk for misclassification. As well, Dr. Saposnik said that the study provided limited information about the composition of the chocolate consumed.
For Dr. Saposnik, the strength of the study was the meta-analysis, which included results from 5 studies.
Cocoa, Even With Few Flavonoids, Boosts Cognition
Drinking cocoa, whether rich in flavonoids or not, appears to boost the blood flow response to neuronal activity in the brain, a relationship known as neurovascular coupling (NVC).
A new study shows not only that drinking flavonoid-rich or flavonoid-poor cocoa improves NVC but also that higher NVC is associated with better cognitive performance and greater cerebral white matter structural integrity in elderly patients with vascular risk factors.
As researchers search for ways to detect dementia at the earliest possible stage, the study results could pave the way for using NVC as a biomarker for vascular function in those at high risk for dementia, said lead author Farzaneh A. Sorond, MD, PhD, Department of Neurology, Stroke Division, Brigham and Women's Hospital, Boston, Massachusetts.
"Our study shows that NVC is modifiable and can be enhanced with cocoa consumption," said Dr. Sorond.
The study is published online August 7 in Neurology.
Tight Correlation
The double-blind proof-of-concept study included 60 community-dwelling participants, mean age 72.9 years. About 90% of the participants were hypertensive, but with well-controlled blood pressure, and half had diabetes mellitus type 2 with reasonably good control. Three quarters were overweight or obese.
Participants were randomly assigned to 2 cups a day of cocoa rich in flavonoids (609 mg per serving) or cocoa with little flavonoids (13 mg per serving). Diets were adjusted to incorporate the cocoa, each cup of which contained 100 calories. Participants were also asked to abstain from eating chocolate.
Researchers measured cerebral blood flow in these participants using transcranial Doppler ultrasonography. Among other things, they documented changes in the middle cerebral artery and blood flow velocity at rest and in response to cognitive tasks (NVC).
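Studies of this kind commonly express NVC as the percentage change in middle cerebral artery mean flow velocity from rest to the cognitive task, which also matches the <5% impairment threshold mentioned below; here is a minimal sketch under that assumption (the authors' exact computation is not detailed in this summary).

```python
def nvc_percent_change(rest_velocity_cm_s: float, task_velocity_cm_s: float) -> float:
    """Percentage change in mean flow velocity from rest to a cognitive task."""
    return 100.0 * (task_velocity_cm_s - rest_velocity_cm_s) / rest_velocity_cm_s

# Hypothetical readings: 55 cm/s at rest, 58 cm/s during the task -> ~5.5%,
# just above the <5% threshold used here to define impaired coupling
print(round(nvc_percent_change(55.0, 58.0), 1))
```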
The study showed that NVC was tightly correlated with cognition; scores on the Trail Making Test B, a test of executive function, were significantly better in those with intact NVC (89 seconds vs 167 seconds; P = .002). Participants with intact NVC also performed significantly better on the 2-Back Task, a test of both attention and memory (82% vs 75%; P = .02).
"The higher you increase your blood flow during a cognitive task, the better your cognitive performance," commented Dr. Sorond, adding that this is something that has never been shown before.
NVC was also correlated with cerebral white matter structural integrity: higher NVC was associated with less white matter macro- and microstructural damage overall. In general, those with intact NVC had a greater volume of normal white matter, a smaller volume of white matter hyperintensities (WMH), higher fractional anisotropy, and lower mean diffusivity in both the normal white matter and WMH.
Therapeutic Target
These results suggest that NVC could be an important therapeutic target. But before NVC can be considered a biomarker, it must be shown to be modifiable, and the clinical importance of that modification must be demonstrated.
To that end, the study authors opted to use cocoa. They could have chosen many other potential modifiers but chose cocoa because the literature has shown the beneficial effects of cocoa on brain health and also because it's something that many people enjoy, said Dr. Sorond.
The study found that blood pressure, blood flow, and change in NVC were not significantly different between the 2 cocoa groups. In the combined cocoa groups, 30-day blood pressures were not significantly different from baseline (P > .5).
In contrast, response to cocoa differed significantly depending on NVC status. Cocoa had a significant effect on NVC in those with impaired (<5%) coupling at baseline. Of those with impaired NVC, 89% responded to 30 days of cocoa consumption and increased NVC compared with only 36% of those with intact NVC (P = .0002). In those with impaired baseline coupling, cocoa consumption was associated with an 8.3% (P < .0001) increase in NVC at 30 days.
Gut Bacteria Affect Colorectal Cancer Risk
The gut microbiome of patients with colorectal cancer (CRC) was less diverse than that of matched patients without cancer, and the presence of some taxa was associated with increased CRC risk, according to research published online December 6 in the Journal of the National Cancer Institute.
"Because of the potentially modifiable nature of the gut bacteria, our findings may have implications for CRC prevention," write Jiyoung Ahn, PhD, assistant professor of population health and a member of the NYU Cancer Institute, New York University School of Medicine, New York City, and colleagues.
The results came from an analysis of fecal bacterial DNA from 47 patients with CRC and 94 control participants matched to the CRC group by sex and body mass index. The investigators sequenced the bacterial 16S rRNA gene, obtaining 794,217 sequences, and classified them taxonomically.
They note that this is the first study to compare the gut microbiomes of people with and without CRC that included multiple comparisons of bacteria while controlling for possible confounders.
The researchers found decreased microbiome community diversity in patients with CRC compared with healthy participants (P = .02). In an analysis by taxa, patients with CRC had a lower relative abundance of Clostridia, at 68.6% compared with 77.8% in people without CRC. In contrast, patients with CRC carried a higher relative abundance of Fusobacterium (31.9% vs 11.7% for control participants).
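Relative abundances like those above are obtained by normalizing each taxon's sequence count to the sample total, and community diversity is commonly summarized with an index such as Shannon's H (the specific diversity metric used by the investigators is not stated in this summary); below is a minimal sketch with hypothetical read counts.

```python
import math

def relative_abundances(counts: dict) -> dict:
    """Fraction of a sample's 16S reads assigned to each taxon."""
    total = sum(counts.values())
    return {taxon: n / total for taxon, n in counts.items()}

def shannon_diversity(counts: dict) -> float:
    """Shannon index H = -sum(p_i * ln(p_i)) over taxa with nonzero counts."""
    return -sum(p * math.log(p) for p in relative_abundances(counts).values() if p > 0)

# Hypothetical read counts for one sample (not data from the study)
sample = {"Clostridia": 700, "Fusobacterium": 120, "Porphyromonas": 30, "other": 150}
print({taxon: round(p, 3) for taxon, p in relative_abundances(sample).items()})
print(round(shannon_diversity(sample), 2))
```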
A higher relative abundance of Fusobacterium was associated with increased CRC risk (multivariable odds ratio [OR], 4.11; 95% confidence interval [CI], 1.62 - 10.47), after adjusting for age, sex, body mass index, race, smoking, and sequencing batch.
The Actinobacteria genus Atopobium (OR, 14.36; 95% CI, 2.78 - 74.30; P < .001) and the Bacteroidetes genus Porphyromonas (OR, 5.17; 95% CI, 1.75 - 15.25; P = .001) were also associated with CRC risk. The Gram-positive Atopobium is associated with Crohn's disease and has been reported to inhibit colon cancer cell apoptosis in vitro. Porphyromonas, which is often found in the mouth and gastrointestinal tract, is associated with periodontal disease.
Patients with CRC tended to have more Bacteroidetes phylum bacteria (16.2% relative abundance vs 9.9% for control participants) and fewer Firmicutes (74.0% for patients with CRC compared with 80.3% for control participants). The depletion of Firmicutes was greatest for the class Clostridia (68.6% for patients with CRC vs 77.8% for control participants; P = .005; false discovery rate–adjusted P < .05). Among the depleted taxa was Coprococcus, in the Clostridia family Lachnospiraceae. Coprococcus efficiently ferments dietary fiber and other complex carbohydrates into butyrate, a major colonic metabolite that may inhibit colonic inflammation and carcinogenesis, the authors write.
In an additional assay using quantitative polymerase chain reaction for Porphyromonas and Fusobacterium, the associations between CRC and these taxa remained significant: the OR for CRC with the presence of Porphyromonas was 1.44 (P = .05), and the OR for CRC with the presence of Fusobacterium was 1.44 (P = .01).
"Because of the potentially modifiable nature of the gut bacteria, our findings may have implications for CRC prevention," write Jiyoung Ahn, PhD, assistant professor of population health and a member of the NYU Cancer Institute, New York University School of Medicine, New York City, and colleagues.
The results came from an analysis of fecal bacterial DNA from 47 patients with CRC and 94 control participants matched to the CRC group by sex and body mass index. The investigators amplified 794,217 16S rRNA genes and then classified the sequences taxonomically.
They note that this is the first study to compare the gut microbiomes of people with and without CRC that included multiple comparisons of bacteria while controlling for possible confounders.
The researchers found decreased microbiome community diversity in patients with CRC , compared with that of healthy participants(P= .02). In an analysis by taxa, patients with CRC had lower relative abundances of Clostridia, at 68.6% compared with 77.8% in people without CRC. In contrast, patients with CRC carried a higher relative abundance of Fusobacterium (31.9% vs 11.7% for control patients).
A higher relative abundance of Fusobacterium was associated with increased CRC risk (multivariable odds ratio [OR], 4.11; 95% confidence interval [CI], 1.62 - 10.47), after adjusting for age, sex, body mass index, race, smoking, and sequencing batch.
Actinobacteria Atopobium (OR, 14.36; 95% CI, 2.78 - 74.30; P < .001) and the Bacteriodetes Porphyromonas(OR, 5.17; 95% CI, 1.75 - 15.25; P = .001) were also associated with CRC risk. The Gram-positiveAtopobium is associated with Crohn's disease and is reported to inhibit colon cancer apoptosis in vitro.Polyphyromonas, which is often found in the mouth and gastrointestinal tract, is associated with periodontal disease.
Patients with CRC tended to have more Bacteroidetes phylum bacteria (16.2% relative abundance vs 9.9% for control participants) and fewer Firmicutes (74.0% for patients with CRC compared with 80.3% for control participants). The depletion of Firmicutes was highest for the class Clostridia (68.6% for patients with CRC vs 77.8% for control participants; P = .005; false discovery rate–adjusted P < .05). Among the depleted taxa was Coprococcus in the Clostridia family Lachnospiracea. Coprococcus is responsible for the efficient fermentation of dietary fiber and other complex carbohydrates into butyrate, a major colonic metabolite that may inhibit colonic inflammation and carcinogenesis, the authors write.
In an additional assay using quantitative polymerase chain reaction for Porphyromonas and Fusobacterium, the association between CRC and these taxa remained significant. In that study, the OR for CRC with the presence of Porphyromonas was 1.44 (P = .05), and the OR for CRC with the presence of Fusobacteriumwas 1.44 (P = 0.01).
Daily Multivitamin Lowers Risk for Cataracts
Taking a multivitamin supplement daily may stave off development of cataracts in middle-aged and older men, according to an article published online November 21 in Ophthalmology. However, there appeared to be no association between daily vitamins and visually significant age-related macular degeneration (AMD).
William G. Christen, ScD, from the Division of Cardiovascular Disease, Brigham and Women's Hospital and Harvard Medical School, Boston, Massachusetts, and colleagues analyzed results of the Physicians' Health Study (PHS) II, in which cataract and AMD were secondary endpoints. PHS II results for cardiovascular and cancer risk were published earlier.
The eye disease component of PHS II involved 11,497 male physicians randomly assigned to take a daily multivitamin (Centrum Silver, Pfizer) (n = 5736) or placebo (n = 5761) in the cataract part of the study, and 14,233 men randomly assigned to a multivitamin (n = 7111) or placebo (n = 7122) in the AMD part of the study. Study participants reported no cataract or AMD incidence at baseline.
The physicians completed annual questionnaires between 1997 and 2011, providing information about pill-taking adherence, possible adverse events, risk factors, and the occurrence of any new cataract or AMD diagnoses. Researchers followed the participants for a mean of 11.2 years, confirming 1817 cases of cataract and 1337 cataract extractions, along with 538 cases of AMD. Of the AMD cases, 281 were visually significant and 144 were advanced.
The researchers found a 9% lower risk for cataract among participants in the multivitamin group compared with the placebo group (hazard ratio [HR], 0.91; 95% confidence interval [CI], 0.83 - 0.99; P = .04). The benefit was greater in older men.
They found no statistically significant difference for visually significant AMD, although there was a trend for increased risk among the multivitamin group (152 vs 129 cases; HR, 1.19; 95% CI, 0.94 - 1.50; P = .15). However, the increased risk associated with vitamin use reached statistical significance when the investigators analyzed any type of AMD, with or without visual loss (294 vs 244 cases; HR, 1.22; 95% CI, 1.03 - 1.44; P = .02).
Small Percentage, Large Impact
"Cataract is a very common source of ocular morbidity in US adults. If our findings are real, that is, if multivitamins really do reduce the risk of cataract, even by a modest 10%, this rather small reduction would nonetheless have a large public health impact," Dr. Christen wrote.
Emily Y. Chew, MD, deputy clinical director at the National Eye Institute in Bethesda, Maryland, agrees. "It's akin to cardiovascular disease. Cardiovascular disease is very common, so even a 10% reduction is a huge relief of the burden of disease," she said. "Cataract is the most common surgery that Medicare pays for, and it's going to get worse. Anything we can do to reduce it is a major feat."
Dr. Chew was a leader in another prominent research project, the Age-Related Eye Disease Study (AREDS), which found similar results for multivitamin use. "We found that Centrum was protective for cataracts, just like they did here, but not for macular degeneration," she said. She is now a leader of AREDS2.
The new data "at least suggests to the general clinician or the general ophthalmologist, maybe even the primary care physician, that the notion of taking a multivitamin is probably something that we should encourage in more patients than not," Matthew J. Welch, MD, a vitreoretinal surgeon at the Loyola University of Chicago Stritch School of Medicine in Maywood, Illinois, said. "You're probably doing yourself some good — you're definitely not doing yourself any harm — by taking a multivitamin, and maybe that does stave off your cataract progression."
Why Not AMD?
Researchers still have to tease out why no significant benefit was shown for AMD in PHS II.
"A true lack of effect is one possibility," Dr. Christen said. "Another possibility is that our finding of no significant benefit or harm may simply be due to the play of chance. Other possibilities include inadequate dosages or inadequate follow-time. It may also be that a different combination of nutrients is required."
The newly published results differ from those of AREDS, in which participants taking a high-dose antioxidant combination of vitamin C, vitamin E, β-carotene, and zinc showed slower progression to AMD. But in PHS II, the doses were only at recommended daily allowances, and AREDS participants were higher-risk individuals.
One issue with the PHS II study, according to Dr. Welch, is that it was designed primarily for finding cardiovascular and cancer risks and that it was based on self-reported information from physicians.
"You're dealing with a group of individuals that view their time as highly valuable and aren't necessarily going to be running to the doctor every time they notice something," he explained. "I think probably if this had been designed specifically to evaluate the eye, they may have found more of a difference [for AMD]. There may be more cataracts and AMD found if individuals were systematically screened."
Dr. Welch said he thinks unknown benefits from dietary supplements probably exist: "It's just that it's hard to tease out. It's not as easy as it is in a drug trial."
Depression Linked to Low Zinc Levels in Blood
People who are depressed have lower concentrations of zinc in their peripheral blood compared with nondepressed individuals, a meta-analysis suggests.
"The pathophysiological relationships between zinc status and depression, and the potential benefits of zinc supplementation in depressed patients, warrant further investigation," write Walter Swardfager, PhD, from the Sunnybrook Research Institute, University of Toronto, Canada, and colleagues.
Their findings were published in the December 15 issue of Biological Psychiatry.
Link to Zinc Deficiency
"A growing body of evidence demonstrates that experimental zinc deficiency can induce depressive-like behavior in animals, which can be effectively reversed by zinc supplementation," the authors write. Furthermore, preliminary clinical trials have suggested that zinc added to antidepressant therapy may produce more rapid or more effective improvement in depressive symptoms.
Many, although not all, studies that have measured zinc concentrations of peripheral blood in depressed and nondepressed individuals during the past several decades have suggested that depression might be associated with lower zinc concentrations in various population samples, the researchers note.
The aim of this meta-analysis was to determine whether the clinical evidence collectively supports lower zinc concentrations in depressed patients compared with nondepressed individuals.
The analysis included 17 studies that measured peripheral blood–zinc concentrations in 1643 depressed patients and 804 control participants. Ten of these studies reported on psychiatric inpatients, and 7 reported on community samples. Of the participants, 34.4% were male; the mean age was 37.7 years.
The researchers found that mean peripheral blood–zinc concentrations were lower by approximately 1.85 µmol/L in depressed individuals compared with control participants (95% confidence interval [CI], -2.51 to -1.19; P < .00001).
More severe depression was associated with greater differences in zinc levels between depressed and control participants (P = .026).
Further, the authors report that effect sizes were larger in studies of inpatients (weighted mean difference [WMD], -2.543; 95% CI, -3.522 to -1.564; P < .0001) vs community samples (WMD, -.943; 95% CI, -1.563 to -.323; P = .003) and in studies with higher methodologic quality (WMD, -2.354; 95% CI, -2.901 to -1.807; P < .0001).
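The weighted mean differences reported above reflect inverse-variance pooling of per-study differences in blood zinc; below is a minimal fixed-effect sketch with hypothetical study-level values (the summary here does not list the individual study estimates, and whether the authors used a fixed- or random-effects model is not stated).

```python
def pooled_mean_difference(mean_diffs, std_errors):
    """Fixed-effect inverse-variance pooling of mean differences (here, micromol/L of zinc)."""
    weights = [1.0 / se ** 2 for se in std_errors]
    pooled = sum(w * d for w, d in zip(weights, mean_diffs)) / sum(weights)
    se_pooled = (1.0 / sum(weights)) ** 0.5
    return pooled, (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)

# Hypothetical per-study differences (depressed minus control), micromol/L, with standard errors
estimate, (ci_low, ci_high) = pooled_mean_difference([-2.1, -1.4, -2.6], [0.6, 0.5, 0.9])
print(round(estimate, 2), (round(ci_low, 2), round(ci_high, 2)))
```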
Biologically Plausible
"Although association studies cannot determine the direction of causation, a causal association between zinc status and depression is biologically plausible," the authors write.
"Zinc has antioxidant properties, helps to maintain endocrine homeostasis and immune function, and plays multiple roles in regulating the hippocampal and cortical glutamatergic circuits that subserve affective regulation and cognitive function. Thus, changes in zinc homeostasis might compromise neuroplasticity and contribute to long-term neuropsychological and psychiatric decline," they write.
Zinc also plays an important role in immune function, and lower serum zinc levels have been associated with disturbances in fatty acid metabolism and in serum lipid levels, which might affect brain function and vascular health. Lower zinc levels have also been associated with cardiovascular disease, a common comorbidity of major depressive disorder (MDD), the authors note.
The investigators point out limitations of their study. The quality and risk of bias "were uneven among studies included in this meta-analysis," they write.
They also note that not all studies reported demographic data "sufficiently to be included in investigations of heterogeneity," that "the use of antidepressants and other concomitant medications were not consistently reported," and that "data on diet and alcohol use were often not reported."
Limitations notwithstanding, the authors conclude that their results suggest that depression is associated with reduced concentrations of zinc in peripheral blood.
"The findings suggest the need to further investigate potential roles of zinc in the pathophysiology of depression, the potential utility of zinc and related biomarkers in monitoring MDD and its clinical sequelae, and potential benefits of zinc supplementation in MDD patients," they conclude.
The study was supported by the Ontario Mental Health Foundation. Dr. Swardfager was supported by fellowships from the Heart and Stroke Foundation Center for Stroke Recovery and the Toronto Rehabilitation Institute. Dr. Swardfager and the other authors report no relevant financial relationships.
"The pathophysiological relationships between zinc status and depression, and the potential benefits of zinc supplementation in depressed patients, warrant further investigation," write Walter Swardfager, PhD, from the Sunnybrook Research Institute, University of Toronto, Canada, and colleagues.
Their findings were published in the December 15 issue of Biological Psychiatry.
Link to Zinc Deficiency
"A growing body of evidence demonstrates that experimental zinc deficiency can induce depressive-like behavior in animals, which can be effectively reversed by zinc supplementation," the authors write. Furthermore, preliminary clinical trials have suggested that zinc added to antidepressant therapy may produce more rapid or more effective improvement in depressive symptoms.
Many, although not all, studies that have measured zinc concentrations of peripheral blood in depressed and nondepressed individuals during the past several decades have suggested that depression might be associated with lower zinc concentrations in various population samples, the researchers note.
The aim of this meta-analysis was to determine whether the clinical evidence collectively supports lower zinc concentrations in depressed patients compared with nondepressed individuals.
The analysis included 17 studies that measured peripheral blood–zinc concentrations in 1643 depressed patients and 804 control participants. Ten of these studies reported on psychiatric inpatients, and 7 reported on community samples. Of the participants, 34.4% were male; the mean age was 37.7 years.
The researchers found that mean peripheral blood–zinc concentrations were lower by approximately 1.85 µmol/L in depressed individuals compared with control participants (95% confidence interval [CI], -2.51 to -1.19; P < .00001).
More severe depression was associated with greater differences in zinc levels between depressed and control participants (P = .026).
Further, the authors report that effect sizes were larger in studies of inpatients (weighted mean difference [WMD], -2.543; 95% CI, -3.522 to -1.564; P < .0001) vs community samples (WMD, -.943; 95% CI, -1.563 to -.323; P = .003) and in studies with higher methodologic quality (WMD, -2.354; 95% CI, -2.901 to -1.807; P < .0001).
Biologically Plausible
"Although association studies cannot determine the direction of causation, a causal association between zinc status and depression is biologically plausible," the authors write.
"Zinc has antioxidant properties, helps to maintain endocrine homeostasis and immune function, and plays multiple roles in regulating the hippocampal and cortical glutamatergic circuits that subserve affective regulation and cognitive function. Thus, changes in zinc homeostasis might compromise neuroplasticity and contribute to long-term neuropsychological and psychiatric decline," they write.
Zinc also plays an important role in immune function. Lower serum levels of zinc have been associated with disturbances in fatty acid metabolism and in serum lipid levels, which might affect brain function and vascular health. Lower zinc levels have also been associated cardiovascular disease, a common comorbidity of major depressive disorder (MDD), the authors note.
The investigators point out limitations of their study. The quality and risk of bias "were uneven among studies included in this meta-analysis," they write.
Not all studies reported demographic data "sufficiently to be included in investigations of heterogeneity, the use of antidepressants and other concomitant medications were not consistently reported, and data on diet and alcohol use were often not reported," the authors write.
Limitations notwithstanding, the authors conclude that their results suggest that depression is associated with reduced concentrations of zinc in peripheral blood.
"The findings suggest the need to further investigate potential roles of zinc in the pathophysiology of depression, the potential utility of zinc and related biomarkers in monitoring MDD and its clinical sequelae, and potential benefits of zinc supplementation in MDD patients," they conclude.
The study was supported by the Ontario Mental Health Foundation. Dr. Swardfager was supported by fellowships from the Heart and Stroke Foundation Center for Stroke Recovery and the Toronto Rehabilitation Institute. Dr. Swardfager and the other authors report no relevant financial relationships.