Medical News

Social care reforms announced

Medical News - Tue, 01/19/2038 - 06:14

Most of the UK media is covering the announcement made in Parliament by Jeremy Hunt, Secretary of State for Health, about proposed changes to social care.

The two confirmed points to have garnered the most media attention in the run-up to the announcement are:

  • a ‘cost cap’ of £75,000 on care costs – beyond this point, the state would step in to meet further care costs
  • raising the means-testing threshold for eligibility for state-funded social care from the current £23,250 to £123,000

The government expects these changes will lead to fewer people having to sell their homes in order to pay for their long-term care needs.
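
Taken together, the two proposed rules amount to a simple eligibility check, sketched below. This is a deliberately simplified model for illustration only: the actual proposals involve needs assessments and exclude general living costs from the cap, and the function name and structure here are invented for the example.

```python
# Simplified sketch of the proposed means test and cost cap.
# Illustrative only: the real rules also involve a needs assessment,
# and general living ("hotel") costs would not count towards the cap.

MEANS_TEST_THRESHOLD = 123_000  # proposed upper asset threshold
COST_CAP = 75_000               # proposed lifetime cap on eligible care costs

def state_funds_care(assets: float, eligible_costs_paid: float) -> bool:
    """Would the state meet further care costs under this simplified model?"""
    if assets <= MEANS_TEST_THRESHOLD:
        return True  # passes the means test
    return eligible_costs_paid >= COST_CAP  # cap reached

# A homeowner with £200,000 in assets who has already paid £80,000
# towards eligible care costs would have further costs met by the state;
# one who has paid only £10,000 would not (yet).
print(state_funds_care(200_000, 80_000))  # True
print(state_funds_care(200_000, 10_000))  # False
```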

Speaking in Parliament, Mr Hunt said that the current system was ‘desperately unfair’ as many older people face ‘limitless, often ruinous’ costs. The minister stated that he wants the country to be ‘one of the best places in the world to grow old’.

 

What is social care?

The term social care covers a range of services provided to help vulnerable people improve their quality of life and assist them with their day-to-day living.

People often requiring social care include:

  • people with chronic (long-term) diseases
  • people with disability
  • the elderly – particularly those with age-related conditions, such as dementia

Social care services can include:

  • healthcare
  • equipment
  • help in your home or in a care home
  • community support and activities
  • day centres

 

How does the current adult social care system work?

Currently, state funding for social care is based on two criteria:

  • means – people with assets of more than £23,250 do not qualify for funding
  • needs – most local authorities will only fund care for people assessed to have substantial or critical needs

The majority of people currently requiring social care pay for it privately. These are known as ‘self-funders’.

 

What prompted these reforms to adult social care?

Put simply, on average, the UK population is getting older.

When the welfare state was created in the mid-20th century, it was not expected that people would someday routinely live into their 70s, 80s, and even 90s.

The increase in life expectancy is a good thing; however, it brings a new set of challenges.

While people are living longer, they are also spending more of their lives in ill health. Older people are more likely to have potentially complex care needs that can be expensive to manage.

Many people are currently ineligible for state-funded social care under the existing laws. To meet the costs of these care needs, these ‘self-funders’ have, in many cases, had to sell or remortgage their home, or sell other assets to pay for the costs of their care.

Without reforms, experts agree that the cost of social care, both to the state (through taxes) and to ‘self-funders’, is likely to become increasingly problematic.

To try to find the best way of fairly funding adult social care, the Department of Health set up an independent commission, which reported its findings to ministers in July 2011. The government considered these findings in its white paper on care and support, published in July 2012, and in drafting the proposed new legislation.

 

What happens next?

The government has introduced a Social Care Bill which will need to be passed by the Houses of Parliament.

If the bill is successfully passed it is expected the amendments will come into force by 2017.

 

Edited by NHS Choices. Follow Behind the Headlines on Twitter.

Links To The Headlines

Social care: Jeremy Hunt hails 'fully-funded solution'. BBC News, February 11 2013

Social care reforms: Almost 2 million pensioners will be denied state help. The Daily Telegraph, February 11 2013

Social care reform: how your family may be affected. The Daily Telegraph, February 11 2013

Dilnot 'regrets' decision to set social care cap at £75,000. The Guardian, February 11 2013

Hunt statement on adult social care cap: Politics live blog. The Guardian, February 11 2013

Categories: Medical News

Fit middle-aged men have lower cancer risk

Medical News - 9 hours 6 min ago

"Very fit men in their late 40s are less likely to get lung cancer and colorectal cancer than unfit men," says BBC News as it reports on a new US study.

The study involved a comprehensive fitness test of 13,949 US men. They were split into three fitness groups: lowest 20%, middle 40% and top 40%, and followed for an average of 6.5 years to see if fitness affected their chance of developing certain cancers.

Men in the fittest group were 55% less likely to develop lung cancer and 46% less likely to develop colorectal cancer compared with men in the lowest fitness group.

Perhaps surprisingly, men in the top group actually had a 22% higher risk of prostate cancer.

One obvious point is that men who exercise to stay fit are usually healthy in other ways too, such as eating a healthy diet and abstaining from alcohol. This could have influenced the results.

Still, there is evidence that exercise alone can reduce your cancer risk. Information provided by Cancer Research UK explains how exercise can reduce inflammation and prevent bowel damage, which may reduce cancer risk.

With its proven role in preventing heart disease, regular exercise is always a good idea, whatever your age or sex. Read more about the benefits of exercise.

Where did the story come from?

The study was carried out by researchers from the University of Vermont, the University of Texas Southwestern Medical Center in Dallas, Duke University Medical Center, and the Memorial Sloan Kettering Cancer Center in New York.

It was funded by the US National Institute of General Medical Sciences, the National Institutes of Health, and the National Cancer Institute.

The study was published in the peer-reviewed science journal JAMA Oncology. It was published as an open-access article, meaning it is free to read and download online.

Generally, the UK media reported the story accurately, but none mentioned the possibility that diet could be accounting for some of the improvements seen, not just fitness.

What kind of research was this?

This was a longitudinal study looking at whether cardiorespiratory fitness (having both a healthy heart and lungs) prevents or improves outcomes in cancer.

It used data already collected as part of the long-running Cooper Center Longitudinal study.

There are many risk factors for cancer, including age, diet and physical activity. This study focused on fitness and whether this helped men develop fewer cancers, and survive better if they did develop cancer.  

What did the research involve?

The research analysed fitness data on 13,949 US men collected as part of the Cooper Center Longitudinal study between 1971 and 2009.

The men were split into three fitness groups: lowest 20%, middle 40% and top 40%, and followed for an average of 6.5 years to see if fitness levels affected their chance of developing lung, colorectal or prostate cancer.

Fitness was assessed using an incremental treadmill test, which tests a person's ability to run to exhaustion.
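
Splitting a cohort into a bottom 20%, middle 40% and top 40% by a continuous fitness measure can be sketched with percentile cut-offs. The data and group labels below are invented for illustration; the study used treadmill performance as its measure.

```python
# Assign each man to a fitness group using percentile cut-offs:
# lowest 20%, middle 40% (20th-60th percentile), top 40%.
# Treadmill times here are randomly generated, purely for illustration.
import random

random.seed(1)
treadmill_minutes = [random.uniform(5, 25) for _ in range(1000)]

cutoffs = sorted(treadmill_minutes)
n = len(cutoffs)
p20 = cutoffs[n * 20 // 100]  # 20th percentile cut-off
p60 = cutoffs[n * 60 // 100]  # 60th percentile cut-off

def fitness_group(minutes: float) -> str:
    if minutes < p20:
        return "low (bottom 20%)"
    if minutes < p60:
        return "middle (40%)"
    return "high (top 40%)"

groups = [fitness_group(m) for m in treadmill_minutes]
print(groups.count("low (bottom 20%)"),
      groups.count("middle (40%)"),
      groups.count("high (top 40%)"))  # 200 400 400
```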

The outcomes researchers were most interested in studying were:

  • new cases of prostate, lung and colorectal cancer 
  • death from any cause for men developing cancer over the age of 65
  • cause-specific death, such as cardiovascular disease, for men developing cancer over the age of 65

Cancer diagnoses and notifications of death came from Medicare claims data; Medicare is the US government health insurance system covering people over 65.

The statistical analysis took account of many common cancer risk factors, but not diet or the stage of cancer at diagnosis.

The confounding factors adjusted for included:

  • age
  • examination year
  • body mass index (BMI)
  • smoking
  • total cholesterol level
  • systolic blood pressure
  • diabetes mellitus
  • fasting glucose level

What were the basic results?

Over the study period, 181 men were diagnosed with colon cancers, 200 with lung cancers, and 1,310 with prostate cancers.

The main message from the results is that higher fitness was associated with a markedly reduced risk of developing lung and colorectal cancer, as well as a reduced risk of dying from cancer or cardiovascular disease. The pattern of risk for prostate cancer was less clear. 

Men in the fittest group were 55% less likely to develop lung cancer (hazard ratio [HR] 0.45, 95% confidence interval [CI] 0.29 to 0.68) and 46% less likely to develop colorectal cancer (HR 0.56, 95% CI 0.36 to 0.87) compared with men in the lowest fitness group. The risk of prostate cancer was actually 22% higher (HR 1.22, 95% CI 1.02 to 1.46).
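
The "% more/less likely" figures quoted in this article are simple transformations of the hazard ratios. Strictly, a hazard ratio compares event rates over time rather than overall risk, but for reporting purposes it is read as in this illustrative sketch (function name invented for the example):

```python
# How "% more/less likely" figures are derived from a hazard ratio (HR):
# (1 - HR) * 100 for a reduction, (HR - 1) * 100 for an increase.

def percent_change(hazard_ratio: float) -> str:
    if hazard_ratio < 1:
        return f"{round((1 - hazard_ratio) * 100)}% lower risk"
    return f"{round((hazard_ratio - 1) * 100)}% higher risk"

print(percent_change(0.45))  # 55% lower risk  (lung cancer, fittest group)
print(percent_change(1.22))  # 22% higher risk (prostate cancer, fittest group)
print(percent_change(0.68))  # 32% lower risk  (cancer death after 65)
```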

Similar benefits were seen comparing the middle exercise group with the lowest exercise group, but the risk differences were slightly smaller.

For example, risks were 43% lower for lung cancer and 33% lower for colon cancer compared with the lowest fitness group. This time there was no difference for prostate cancer. This analysis covered cancers diagnosed at any age.

Looking only at cancers diagnosed after the age of 65, the fittest group were 32% less likely to die from cancer compared with men in the lowest fitness group (HR 0.68, 95% CI 0.47 to 0.98) – this included prostate cancer.

They were also 68% less likely to die from cardiovascular disease after a cancer diagnosis (HR 0.32, 95% CI 0.16 to 0.64) compared with the least fit men. 

How did the researchers interpret the results?

The authors concluded that, "There is an inverse association between midlife CRF [cardiorespiratory fitness] and incident lung and colorectal cancer, but not prostate cancer. High midlife CRF is associated with lower risk of cause-specific mortality in those diagnosed as having cancer at Medicare age [over 65]." 

Conclusion

This study suggests that cardiovascular fitness may reduce men's chances of developing lung and colorectal cancer, and appears to boost survival from cancer or cardiovascular disease in those diagnosed after the age of 65. This was based on comparing the fittest 40% of men with the least fit 20%.

The study focused on fitness and took account of major risk factors for cancer, such as smoking and blood pressure. However, it left out one important risk factor: diet. What people eat and drink is known to affect cancer risk.

The fittest group may also have been the healthiest in terms of eating well and drinking alcohol within safe limits. This probably accounted for some of the risk reductions seen in this study. What proportion? We don't know. 

This, in effect, makes it a study of overall healthiness, incorporating both fitness and diet. The evidence that eating well and being active reduces the risk of cancer, heart disease, stroke and diabetes is already well established. Studies have shown that regular physical activity also benefits our mental health.

Read more about reducing your cancer risk.

Although fitter men over the age of 65 diagnosed with cancer had better survival rates, there are other unmeasured factors that could have contributed. It is not known whether the fitter people were diagnosed with cancer at an earlier stage, which would have increased their chance of survival.

There was also a counterintuitive finding worth noting. The fittest group were more likely to be diagnosed with prostate cancer than the least fit. This is important, as prostate cancer was much more common than lung or colorectal cancer in this sample.

The study authors thought this might be because fitter men in the US go for more cancer tests than unfit men, so cancer is discovered and diagnosed more often in that group.

It could also be that men in the fittest group simply live longer, and prostate cancer is an age-related disease.

But we don't know this for sure, and there could be other explanations worth investigating.

Would you know if you had prostate cancer? Read more about prostate cancer symptoms.

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

Fit middle-aged men 'at lower risk for some cancers'. BBC News, March 27 2015

Keep fit to beat cancer: Looking after yourself in middle age boosts men's chances. Daily Mirror, March 27 2015

Keeping fit helps men with cancer to boost survival chances by a third. The Times, March 27 2015

Links To Science

Lakoski SG, Willis BL, Barlow CE, et al. Midlife Cardiorespiratory Fitness, Incident Cancer, and Survival After Cancer in Men - The Cooper Center Longitudinal Study. JAMA Oncology. Published March 26 2015

Categories: Medical News

Crossing your fingers may help reduce pain

Medical News - 11 hours 6 min ago

"Crossing your fingers might reduce pain," says The Guardian. The study behind the news found crossing your fingers may confuse the way your brain processes feelings of hot and cold – and, in some cases, reduce painful sensations.

Rather than subjecting the participants to "normal" pain, the authors used a trick known as the thermal grill illusion. The thermal grill illusion is not the latest in BBQ technology, but an unusual – and well validated – phantom pain effect.

When the skin is subjected to an alternating pattern of harmless coldness followed by heat, it creates a sensation of "burning coldness", but does no damage to the skin. It is something akin to the burning sensation felt by anyone placing cold hands under warm water after a snowball fight.

The researchers applied hot and cold sensations to the ring, middle and index fingers to create phantom pain sensations in volunteers. The phantom pain reduced in some people when they crossed their fingers.

This artificial phantom set-up means the findings probably don't apply to most real-life experiences of pain. Would a woman crossing her fingers during childbirth feel some benefit, or would someone who has just hit their thumb with a hammer? Probably not.

We shouldn't get too hung up on the crossed finger idea, though. The concept behind it is more interesting: the study tentatively showed that pain might be influenced by how our bodies are organised in space and by the relative inputs from different parts of the body.

If it is found to be a regular and real occurrence through more research, this may have potential for use in pain management in healthcare.

Where did the story come from?

The study was carried out by researchers from University College London (UCL) and the University of Verona (Italy).

It was funded by the CooperInt Program from the University of Verona, the European Union Seventh Framework Programme, the Economic and Social Research Council, and the European Research Council.

The study was published in the peer-reviewed science journal Current Biology.

The Guardian reported the story accurately, making it clear it was not real-world pain, but phantom pain from the thermal grill illusion.

The paper interviewed Elisa Ferrè of UCL, a co-author of the study, who said: "There might be applications for treating people with chronic pain … the position of your limbs or digits is something that would be very easy to manipulate."

Adding a welcome note of caution, The Guardian wrote: "The findings did not establish whether crossing your fingers would be as soothing with a real painful stimulus, rather than an illusory one, but Ferrè said her hunch is that it would help."

What kind of research was this?

This was a study of human volunteers investigating whether pain perception is influenced by the position of their fingers.

Rather than subjecting the participants to conventional pain, the team used a trick known as the thermal grill illusion to create a phantom pain sensation.

Controlled experiments such as these are useful for developing new ideas and testing them in the early stages. But testing pain in an indirect manner like this isn't ideal. It would be more useful to devise a test using actual pain, but this has ethical dimensions to consider.

What did the research involve?

The researchers used three heat pads under the index, middle and ring fingers of participants to test different combinations of the thermal grill illusion, and whether crossing fingers reduced the phantom pain.

Participants also adjusted a temperature delivered to the other hand until it matched their perception of the cold target finger (index or middle).

The thermal grill illusion works by applying a warm sensation to the index and ring fingers, and a cold sensation to the middle finger. The grill-like pattern of warm-cold-warm creates a burning sensation in the middle finger, even though it is in fact exposed to cold.

About half of people go as far as describing the feeling as painful. The sensation is much more intense than the hot or cold on their own.

According to the researchers, the illusion might work because the hot sensation in the outer two fingers blocks the activity in a certain cooling receptor under the skin. With this pathway blocked, the hot signals from the nearby hot areas are felt more intensely.

What were the basic results?

The study found significant temperature overestimation when the target finger was in the middle (warm-cold-warm) compared with on the end (cold-warm-warm).

The effect depended on the target finger being in the middle of thermal inputs, but it didn't matter whether this was the index or middle target fingers.

The thermal grill effect for the middle finger was abolished when it was crossed over the index. The same effect was generated for the index finger when it was crossed with the middle.

How did the researchers interpret the results?

The team concluded that, "Our results suggest that the locations of multiple stimuli are remapped into external space as a group; nociceptively mediated sensations [pain perception] depended not on the body posture, but rather on the external spatial configuration formed by the pattern of thermal stimuli in each posture."

Conclusion

This study investigated pain using a thermal grill trick, which applies hot and cold in different combinations to the index, middle and ring fingers to induce a phantom burning sensation.

This showed that crossing your fingers may confuse the way your brain processes feelings of hot and cold, and in some cases stopped the phantom pain.

The biggest limitation of this study is that it looked at phantom pain using the thermal grill trick, rather than actual pain. Phantom pain may be different from "normal" pain, so the results may not relate to a regular pain situation.

We shouldn't get too hung up on the crossed finger idea, though. The concept behind it is more interesting: the study tentatively showed that pain might be influenced by how our bodies are organised in space and by the relative inputs from different parts of the body.

If found to be a regular and real occurrence through more research, this may have potential for use in pain management in healthcare.

For example, The Guardian says: "Scientists believe the phenomenon could ultimately be harnessed to help treat chronic pain patients, who suffer from painful sensations, often long after a physical injury has healed."

At present, this is largely speculative. The study only showed a reduction in phantom pain, and only under a very specific and artificial set of circumstances. The logical next step for this field would be research that is more relevant and applicable to real-life pain. 

Still, how we think about pain can sometimes alter how much it affects us. Many people find cognitive behavioural therapy (CBT) techniques useful in coping with chronic pain.

Read more about coping with pain.

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

Crossing your fingers might reduce pain, says study. The Guardian, March 26 2015

Crossing your fingers actually DOES help – just not the way you expect. Metro, March 26 2015

Links To Science

Marotta A, Ferrè ER, Haggard P. Transforming the Thermal Grill Effect by Crossing the Fingers. Current Biology. Published online March 26 2015

Categories: Medical News

Do antibiotics in pregnancy cause cerebral palsy and epilepsy?

Medical News - Thu, 03/26/2015 - 15:00

"Antibiotic used in pregnancy linked to risk of epilepsy and cerebral palsy," The Guardian reports.

The results of a new study suggest that women who took macrolide antibiotics were slightly more likely to give birth to a child with one (or both) of these conditions, compared with women who took penicillin.

But no association was found between taking antibiotics in general during pregnancy and cerebral palsy (a condition that causes movement disorders) or epilepsy (a condition that causes seizures).

However, a direct comparison between these groups of women is not entirely reliable. Other confounding factors, such as the type and severity of infection, could account for the difference seen.

The study does not prove that macrolides cause either cerebral palsy or epilepsy. It is possible an underlying infection in pregnancy increased the risk of these conditions, rather than the treatment itself.

There is no such thing as an entirely risk-free medical intervention. This means we need to use the best evidence available to make an informed decision about the trade-off between the benefits and risks of various choices.

Most experts would agree that the benefits of treating bacterial infections in pregnancy far outweigh the potential risks of antibiotics – if infections are left untreated, the infection could be passed on to the baby or, most seriously, result in miscarriage or stillbirth.

Where did the story come from?

The study was carried out by researchers from University College London and the Farr Institute of Health Informatics Research, London, and was funded by the Medical Research Council.

It was published in the peer-reviewed journal PLOS One on an open-access basis, so it is free to read online.

The Guardian, The Daily Telegraph and the Mail Online's reporting was accurate and responsible. All three papers pointed out that the increased risk from macrolides found in the study was small and could be explained by other factors (confounders).

The papers also included advice from experts that women should continue to take antibiotics prescribed for infection.

It is a shame, then, that all three papers chose to run with alarmist headlines that failed to put the increase in risk into any useful context.

The papers also singled out a common antibiotic called erythromycin. This belongs to the macrolide group, but the study did not focus on it specifically.

What kind of research was this?

This was a retrospective cohort study involving 195,909 women. It looked at whether antibiotics prescribed during pregnancy were linked to a higher risk of cerebral palsy or epilepsy in their children.

Cohort studies are often used to look at whether particular events are linked to certain health outcomes. The advantage of this type of study is that it can follow large groups of people over long periods of time, but it cannot prove cause and effect.

Retrospective cohort studies, which look back over time, may also be less reliable than those that follow people forward in time, called prospective cohort studies.

The authors say antibiotics are one of the most frequently prescribed drugs during pregnancy.

However, they say one large randomised controlled trial (RCT) found certain antibiotics given to women who had gone into premature labour were associated with an increased risk of cerebral palsy or epilepsy in their children at seven years of age.

The two antibiotics used in this previous trial were erythromycin, a macrolide, and co-amoxiclav, which is a type of penicillin.

What did the research involve?

The researchers used data on 195,909 women who had registered at their GP surgeries before pregnancy and had a baby born at or after term (37 weeks).

For women with more than one pregnancy during the study period (about one-quarter of the cohort), one pregnancy was selected at random for analysis. Women whose children were born preterm were excluded, because premature babies already have an increased risk of cerebral palsy and epilepsy.

They looked at whether the women had been treated with any oral antibiotics during pregnancy, and, if so, which class of antibiotics, the number of courses they had, and the timing of treatment during pregnancy.

The women's children were followed until seven years of age for any diagnosis of cerebral palsy or epilepsy, as recorded in the children's primary care records.

The researchers analysed the data using standard statistical methods. They adjusted their results for a wide range of maternal risk factors.

These included maternal age at delivery; pregnancy complications; chronic conditions such as obesity; treatment for chronic medical conditions during pregnancy; tobacco and alcohol use; social deprivation; and maternal infections that could potentially cause damage to the foetal brain.

What were the basic results?

A total of 64,623 women (33.0%) were prescribed antibiotics in pregnancy, and 1,170 children (0.60%) had records indicating they had cerebral palsy, epilepsy, or both.

Once the researchers adjusted their results for confounders, they found:

  • no association between antibiotics and cerebral palsy or epilepsy (hazard ratio [HR] 1.04, 95% confidence interval [CI] 0.91-1.19)
  • compared with penicillins, macrolide antibiotics were associated with a 78% increased risk of cerebral palsy or epilepsy (HR 1.78, 95% CI 1.18-2.69; number needed to harm 153, 95% CI 71-671)
  • children whose mothers received more than three antibiotic prescriptions during pregnancy had a 40% increased risk (HR 1.40; 95% CI 1.07-1.83) compared with those with no prescriptions
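
The "number needed to harm" (NNH) quoted above is the reciprocal of the absolute risk difference between the two groups. The sketch below uses hypothetical risk figures purely for illustration; the study's NNH of 153 was calculated from its own underlying data.

```python
# Number needed to harm: how many people would need the exposure
# for one additional harmful outcome to occur.
# NNH = 1 / (risk in exposed group - risk in unexposed group)

def number_needed_to_harm(risk_exposed: float, risk_unexposed: float) -> float:
    return 1 / (risk_exposed - risk_unexposed)

# Hypothetical example: baseline risk 0.50%, risk with exposure 1.15%
print(round(number_needed_to_harm(0.0115, 0.0050)))  # 154
```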

How did the researchers interpret the results?

The researchers say that their findings indicate the prescribing of macrolides in pregnancy is linked to an increased risk of cerebral palsy or epilepsy in childhood.

They speculated about why macrolides might be associated with harm – arguing, for example, that if women stopped taking the drugs because of side effects, the partially treated infection might prolong the foetal brain's exposure to inflammation.

They add that there is growing evidence that taking macrolides during pregnancy is associated with harm, and that these drugs may have specific adverse effects on the foetus.

Conclusion

The findings from this large study indicate that antibiotic use in pregnancy in general was not associated with an increased risk of cerebral palsy or epilepsy. The apparent increased risk with macrolides compared with penicillins is not reliable.

A direct comparison between the women taking each type of antibiotic is inaccurate, as it does not take into account potential confounding factors. These include:

  • the type and severity of the infections, which could have affected the baby, rather than the antibiotic
  • whether the women completed the course of antibiotics or stopped early because of side effects; if stopped early, the infection may not have been fully cleared and could then have harmed the baby
  • other unmeasured maternal factors that influenced the type of antibiotic the women were given, such as other medications or health conditions

Additionally, the analysis for macrolides was based on small numbers of women, so the results could also have occurred by chance. It is important to stress that the risk to individual pregnancies is small.

Doctors will only prescribe antibiotics in pregnancy if they think there is a clear clinical need, where mother and baby are potentially at risk. Any risk to your pregnancy posed by antibiotics will probably be far outweighed by the benefits of treatment.

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

Antibiotic used in pregnancy linked to risk of epilepsy and cerebral palsy. The Guardian, March 25 2015

Antibiotic pills 'can hurt unborn baby': Common drug linked to cerebral palsy and epilepsy. Mail Online, March 26 2015

Common antibiotic may double the risk of epilepsy or cerebral palsy in unborn babies. The Daily Telegraph, March 26 2015

Links To Science

Meeraus WH, Petersen I, Gilbert R. Association between Antibiotic Prescribing in Pregnancy and Cerebral Palsy or Epilepsy in Children Born at Term: A Cohort Study Using The Health Improvement Network. PLOS One. Published online March 25 2015

Categories: Medical News

Milk and dairy 'good for the brain' claim unproven

Medical News - Thu, 03/26/2015 - 03:00

"Three glasses of milk every day ‘helps prevent Alzheimer's and Parkinson's’," is the misleading headline in The Daily Telegraph. The study it reports on only found that a high-dairy diet was linked to increased levels of an antioxidant called glutathione.

It is also unclear what, if any, protective effects higher levels of glutathione would have against Alzheimer’s or Parkinson’s disease.

The study, funded by the US Dairy Research Institute, looked at brain MRI scans of 60 adults aged between 60 and 85 using a new technique that could measure levels of glutathione.

This antioxidant is said to "neutralise" potentially harmful chemicals in the brain. Lower levels are found in conditions such as Parkinson’s disease and Alzheimer’s disease, but it is not known whether this is part of the cause of the conditions or a consequence of them.

The level of glutathione was determined once, at the same time as participants were asked about their diets. This study therefore cannot tell us that a high-dairy diet caused the increased levels of glutathione. It is also unable to show what happens to glutathione levels over time or whether the higher levels are protective.

So, all in all, this study proves little. Dairy products are important for bone health and are recommended in moderation as part of a healthy diet, but we just don’t know if they are good for the brain.

 

Where did the story come from?

The study was carried out by researchers from the University of Kansas Medical Center. It was funded by the US Dairy Research Institute, with further funding provided by the National Institutes of Health and the Hoglund Family Foundation. The funding organisations did not have a role in study design, implementation, analysis, or interpretation of data.

The study was published in the peer-reviewed American Journal of Clinical Nutrition.

The Daily Telegraph’s reporting on the story was poor and its headline was inaccurate. It says that people "who guzzled the white stuff were more likely to have healthy brains", when in fact all of the people in the study were healthy. It is also not known whether increased levels of glutathione prevent neurodegenerative disorders, so we can’t say that people with higher levels definitely have "healthier" brains.

The Mail Online’s coverage was slightly more restrained, opting to say that it "may help protect" rather than "will help protect".

 

What kind of research was this?

This was a cross-sectional study, which measured the level of glutathione in the brain using a new MRI scanning technique. Glutathione is an antioxidant which helps prevent damage to cells. Reduced levels of glutathione have been found in the early stages of Parkinson’s disease, though it is unclear if this might contribute to the development of Parkinson’s or is the result of Parkinson’s.

The researchers wanted to see if drinking milk was associated with higher levels of glutathione in the brain. As it was a cross-sectional study, it only measured the level of glutathione at one time point, and did not follow people up over time to find out what happened to them. This means it was not able to show whether dietary consumption might directly affect glutathione levels in the brain, or indeed whether higher levels were protective against brain diseases such as Parkinson’s disease or Alzheimer’s.

 

What did the research involve?

The researchers recruited 60 healthy older adults, assessed their dairy intake and measured their level of glutathione in the brain using an MRI scan. They then analysed whether increased consumption of milk was associated with higher levels of glutathione.

The participants were adults aged between 60 and 85, who were healthy and did not have a history of:

  • neurologic (brain and nervous system) disorders
  • head injury
  • claustrophobia (which would make them unsuitable for MRI scanning, as having a scan involves lying in a narrow tunnel)
  • diabetes
  • unstable medical conditions
  • lactose or gluten intolerance
  • taking glutathione or N-acetylcysteine supplements

The participants completed three 24-hour dietary recalls by telephone with a dietician, and a seven-day diet record was filled in before the MRI scan. From these assessments, the researchers categorised the participants into the following three groups, according to their daily consumption of dairy products:

  • low dairy intake, less than one serving per day
  • moderate dairy intake, one to two servings per day
  • "recommended" dairy intake, three or more servings per day (this was based on US recommendations)

They also had other measurements taken, including body mass index (BMI), waist circumference and body composition of fat and muscle. Finally, they had a brain MRI scan using a new process (known as chemical shift imaging) that had been developed by the researchers to measure the level of glutathione.

The results were then analysed to see if increased dairy consumption was associated with higher levels of glutathione.

 

What were the basic results?

The participants’ characteristics were similar across the three groups in terms of age, BMI, educational level and quality of their diet.

Glutathione levels in the front (frontal region) and upper sides (parietal region) of the brain were higher in people who consumed more dairy products, milk and calcium.

The study did not assess whether this difference would affect a person’s health in any way, or how levels fluctuate over time.

 

How did the researchers interpret the results?

The researchers concluded that "glutathione concentrations were significantly related to adults’ reported consumption of dairy foods and calcium". They say that further research is required to see if increased levels of glutathione prove to be effective in "strengthening cerebral antioxidant defenses [sic] and, thereby, improving brain health in the aging population".

 

Conclusion

This small study found people with higher dairy, milk and calcium consumption had higher levels of glutathione in the frontal and parietal regions of the brain. Glutathione is an antioxidant that helps to "neutralise" potentially harmful chemicals in the brain.

Research into glutathione and its role in neurodegenerative diseases is in the early stages. It is known that the levels reduce with age and in certain conditions such as Alzheimer’s disease and Parkinson’s disease, but it is not known whether this is part of what leads to the disease or a consequence of the disease. This study does not show whether increasing the level of glutathione would protect against these types of conditions.

This study was cross-sectional, so measured the level of glutathione at one time point in older adults who were healthy. It therefore does not answer the question of whether people with more glutathione in their brains are less likely to develop neurodegenerative disorders.

In addition, previous research has found that in Parkinson’s disease, glutathione levels are only reduced in an area of the brain called the substantia nigra, which is located in the middle of the brain. This study did not look at levels in this part of the brain.

This was a relatively small study, which found a relatively wide range of glutathione levels across different areas of the brain. A much larger study would be required to understand what the normal range is in the population, and how this differs in various disease states. The study also relied on self-reporting of dietary intake, which can be inaccurate. There is also little information about other factors that could influence the results, such as socioeconomic status, ethnicity, family history of Alzheimer’s disease or Parkinson’s disease, other conditions or medication use.

In conclusion, this study has found that increased reported consumption of dairy and milk products was associated with increased levels of the antioxidant glutathione in the brain, but it cannot prove that this was due to the diet or that this will prevent brain disease.

Larger studies into the role of both dairy products and glutathione on neurodegenerative diseases would be useful.

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

Three glasses of milk every day 'helps prevent Alzheimer's and Parkinson's'. The Daily Telegraph, March 25 2015

Three glasses of milk a day 'to beat dementia': People who drink large amounts of the white stuff have higher levels of antioxidant that helps protects brain cells. Mail Online, March 26 2015

Links To Science

Choi I, Lee P, Denney DR, et al. Dairy intake is associated with brain glutathione concentration in older adults. The American Journal of Clinical Nutrition. Published online December 10 2014

Categories: Medical News

Frequent antibiotic use linked to higher type 2 diabetes risk

Medical News - Wed, 03/25/2015 - 16:30

"Repeated antibiotic use linked to diabetes," BBC News reports.

New research has studied over 200,000 people from the UK who were diagnosed with diabetes between 1995 and 2013. Researchers counted the number of antibiotic prescriptions they had during an average five-year period before they were diagnosed. They compared the number of prescriptions given to an age- and gender-matched control group of over 800,000 people.

They found that people taking antibiotics were more likely to develop diabetes, and those taking more were at a higher risk. For example, people who took five or more antibiotic courses in the five-year period before diagnosis had around a third higher risk of developing type 2 diabetes than those taking no antibiotics.

We should not assume that the results mean antibiotics definitely cause diabetes. It could be the other way round. 

Diabetes is known to increase the risk of infection, especially skin and urinary infections, so it could be diabetes leading to antibiotic use, and not vice versa.

Researchers attempted to adjust for this by only looking at antibiotic use for more than one year before a diagnosis of diabetes was made. However, this may not have been long enough.

It should also be noted the researchers did not take into account other factors that could have caused the results, such as the use of other medications known to increase the risk of diabetes and infections, such as steroids.

If you do find yourself having recurring infections, you should discuss the issue with your GP. There may be an underlying cause that needs investigating.

 

Where did the story come from?

The study was carried out by researchers from the University of Pennsylvania, and Tel-Aviv Sourasky Medical Center and Tel-Aviv University in Israel. It was funded by the US National Institutes of Health.

The study was published in the peer-reviewed European Journal of Endocrinology.

BBC News explained the study well, saying that since people with type 2 diabetes were at a higher risk of developing infections anyway, it was hard to find out which caused which. It quoted Professor Jodi Lindsay from St George's, University of London, who explained: "This is a very large and helpful study linking diabetes with antibiotic consumption in the UK, but at this stage we don't know which is the chicken and which is the egg."

While appropriate use of antibiotics is a pressing issue, the study did not look at whether the prescriptions were appropriate or not; it simply counted how many were made.

Read about how a new blood test could help prevent antibiotic misuse – a news story that we published last week.   

 

What kind of research was this?

This was a case-control study looking at whether antibiotic use raised the risk of developing diabetes.

This type of study matches people with a condition, in this case type 2 diabetes, with a control group without the condition who are the same age and sex. They compare many risk factors, in this case antibiotic use, to see if any might be linked with the disease. This type of study can show links between risk factors and disease, but cannot prove cause and effect. This is mainly because it cannot completely control for confounding factors (confounders).

 

What did the research involve?

Using a UK database of medical records, the researchers picked people diagnosed with diabetes and compared their exposure to antibiotics with people of the same age and sex who did not have a diagnosis of diabetes.

The researchers used medical records from 1995 to 2013 from a UK population-based database called The Health Improvement Network (THIN).

They identified 208,002 people who were diagnosed with diabetes during this time period, excluding people who already had a diagnosis of diabetes and those diagnosed within the first six months of the study.

The control group consisted of 815,576 people who were matched for age and sex with the cases. Importantly, they did not have diabetes at the date the case was diagnosed – called the index date.

Both groups were, on average, 60 years old and there was an even gender split.

Using the medical records, the researchers documented how many outpatient antibiotic prescriptions people had been given more than a year before the index date. They collected information on seven commonly used antibiotics, as well as antiviral and antifungal medications.

They analysed differences in antibiotic use, taking into account the following potential confounders, where available:

  • body mass index (BMI)
  • smoking
  • coronary artery disease
  • hyperlipidaemia (high cholesterol) that needed to be treated with statins
  • glucose level before the date of diabetes diagnosis
  • number of urinary tract, skin and respiratory infections before date of diabetes diagnosis

 

What were the basic results?

People with diabetes had a higher rate of infection before the diagnosis index date compared to the controls. Urinary infections, for example, occurred in 19.3% of cases, compared with 15.1% of controls.

Analysis not accounting for confounders showed antibiotic use was linked with higher diabetes risk for all seven antibiotics documented, and for both diabetes types. However, this is a simple analysis, and potentially misleading. Analysis taking account of the confounders is more reliable. This showed higher risks only in those taking more than one course of penicillin, cephalosporins, macrolides and quinolones, and showed almost no change in risk for participants with type 1 diabetes. The increase in risk in type 2 diabetes was higher the more antibiotics people had taken.

Treatment with two to five courses of the following antibiotics was associated with an increased risk of diabetes compared to no use of antibiotics, after adjusting the results for the confounders listed above:

  • 8% increase in risk for penicillin (odds ratio (OR) 1.08, 95% confidence interval (CI) 1.05 to 1.11)
  • 11% increase in risk for cephalosporins, such as cefalexin (OR 1.11, 95% CI 1.06 to 1.17)
  • 11% increase in risk for macrolides, such as erythromycin (OR 1.11, 95% CI 1.07 to 1.16)
  • 15% increase in risk for quinolones, such as ciprofloxacin (OR 1.15, 95% CI 1.08 to 1.23)

Taking more than five courses of antibiotics was associated with a 23% increased risk for penicillin and a 37% increased risk for quinolones, compared to taking none.
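The percentage figures above are derived directly from the odds ratios: an OR of 1.08 corresponds to an 8% increase in the odds of diabetes. A minimal sketch of that conversion, using only the figures quoted above (note that, strictly, an odds ratio only approximates a relative risk when the outcome is fairly uncommon):

```python
# Convert an odds ratio (OR) and its 95% confidence interval into the
# "percentage increase in risk" phrasing used in the article.
# The figures are those reported above, not a re-analysis of the study data.

def percent_increase(odds_ratio):
    """An OR of 1.08 corresponds to an ~8% increase in the odds."""
    return (odds_ratio - 1) * 100

# (OR, lower 95% CI, upper 95% CI) for 2-5 courses vs no antibiotics
antibiotics = {
    "penicillin":     (1.08, 1.05, 1.11),
    "cephalosporins": (1.11, 1.06, 1.17),
    "macrolides":     (1.11, 1.07, 1.16),
    "quinolones":     (1.15, 1.08, 1.23),
}

for name, (or_, lo, hi) in antibiotics.items():
    print(f"{name}: {percent_increase(or_):.0f}% increase "
          f"(95% CI {percent_increase(lo):.0f}% to {percent_increase(hi):.0f}%)")
```

Because each confidence interval sits entirely above 0% (an OR of 1), the increases are statistically significant at the 5% level, which is why the article can quote them as real associations rather than chance findings.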

There was no increase in risk for antivirals or antifungals.

 

How did the researchers interpret the results?

The researchers concluded that there is "a higher adjusted risk for type 2 diabetes among individuals with recurrent exposures to penicillin, cephalosporins, macrolides and quinolones". They also found "no increase in adjusted risk for exposure to antiviral or antifungal medications".

 

Conclusion

This large population-based study found a higher risk of diabetes in people taking between two and five courses of antibiotics over a year before diagnosis. This risk was even higher after more than five courses.

Strengths of the study include its large sample size, direct relevance to the UK, and the accuracy of the data.

Despite the strengths, the study does not prove that antibiotics cause diabetes, as its design cannot prove cause and effect. There are plausible explanations both for how antibiotic use may cause diabetes and for how the development of diabetes may lead to more antibiotic use.

For example, people with diabetes are more prone to contracting bacterial infections. It could be that some of the study’s participants were in a prediabetes or undiagnosed diabetes stage when they started taking antibiotics. The researchers tried to take this into account by not including any antibiotic prescription given in the year before diabetes diagnosis, but it is possible that the diagnosis was delayed by more than a year, or signs appeared more than a year before diagnosis.

The second option is that antibiotics contribute to diabetes by altering a person’s microbiota – our internal stock of "good" bacteria and other micro-organisms present in our digestive system.

Other confounders could have accounted for the increased risk found:

  • Increased use of antibiotics is also common in people who take steroids, such as prednisolone. Steroids are known to increase the risk of diabetes.
  • Obesity increases the risk of diabetes, but BMI was not available for 30% of the study's participants.
  • The number of prescriptions of antibiotics was only recorded from 1995 up until the date of diagnosis of diabetes.
  • As the average age of the participants was 60 at the time of diagnosis, this means that, at best, the study did not capture antibiotic use before the age of around 40.
  • The study only recorded outpatient prescriptions; it did not include antibiotics given during hospital admissions.

A further limitation of the study was that the main analysis included people with either type 1 or type 2 diabetes. This muddies the water, as they have different causes. Type 1 diabetes is autoimmune and typically starts in childhood or adolescence, and no clear risk factors have been identified (although a viral cause has been suggested). However, type 2 diabetes has a number of risk factors, including family history, ethnic background and obesity.

The study does provide more of an incentive to only take antibiotics when strictly required. You can reduce your risk of type 2 diabetes by trimming your waistline, maintaining a healthy weight, reducing high blood pressure, eating healthily and taking regular physical exercise.

Read more about how to reduce your diabetes risk.

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

Repeated antibiotic use linked to diabetes. BBC News, March 25 2015

Antibiotics linked to a higher risk of Type 2 diabetes: Repeated use of some types of treatment can increase chance by 23%. Mail Online, March 25 2015

Unnecessary prescriptions of antibiotics such as penicillin ‘raise risk of diabetes’. Daily Express, March 25 2015

 

Links To Science

Boursi B, Mamtani R, Haynes K, Yang Y. The effect of past antibiotic exposure on diabetes risk. European Journal of Endocrinology. Published online March 24 2015

Categories: Medical News

Study finds link between air pollution and stroke risk

Medical News - Wed, 03/25/2015 - 14:30

"Air pollution is linked to an increased risk of stroke," BBC News reports, prompted by a large global study in The BMJ. Researchers found an association even with brief upsurges in air pollution levels.

Previous research has shown a strong link between air pollution and heart attacks, but until now the research looking at air pollution and stroke has had mixed results.

In this study, the researchers summarised all the relevant research published on the topic worldwide. This showed stroke risk was higher on the day of an increase in air pollution and the days immediately after. They also found the effect of pollution was stronger in lower-income countries such as China.

While this type of study cannot prove air pollution is responsible for some strokes, it shows people are more likely to have strokes in the immediate aftermath of episodes of raised air pollution.

The researchers speculated that the association could be the result of a number of possible factors, such as pollution raising blood pressure or constricting blood vessels.

They concluded that governments around the world need to continue to take efforts to reduce the public health burden caused by air pollution.

Where did the story come from?

The study was carried out by researchers from the University of Edinburgh and was funded by the British Heart Foundation. No conflicts of interest were reported.

It was published in the peer-reviewed BMJ on an open-access basis, so it is free to read online or download as a PDF.

Generally, the study was reported accurately in the media. The Mail Online and BBC News gave more detail about the types of pollution causing the problems, and asked questions about why some regions of the UK breach EU pollution limits.

The study has been published in tandem with a related study looking at the links between air pollution and anxiety, which is also open access.

Some media sources have combined a report on both studies into a single story. We have not analysed this second study, so we cannot comment on whether the coverage is accurate.

What kind of research was this?

This was a systematic review of 103 observational studies that looked at the link between air pollution levels and risk of stroke.

The studies included two types of observational study: case-crossover studies and time series studies. A meta-analysis was performed on 94 of the studies, which pooled the results.

Observational studies cannot show that a risk factor such as pollution directly causes an event such as a stroke, although this type of study can show if there is a likely link between the two. The difficulty is adjusting the figures to take account of anything else that might have affected the chances of having a stroke (confounders).  

What did the research involve?

Researchers trawled the scientific literature for studies that included measures of air pollution, deaths from stroke, or admissions to hospital for stroke. They then pooled the estimates of risk of stroke from the individual studies to come up with an overall risk figure for each type of pollutant studied.

The researchers specified the types of studies they would include at the start of their work, and explained in the paper how they excluded research that did not meet quality requirements or did not give the data in a way that they could use.

They included research published in any language, which increased their chances of including research from low- and middle-income countries.

They assessed 2,748 articles and included 103 in the review. Of these, 94 provided data that they were able to include in their analysis. The papers provided information about 6.2 million stroke hospital admissions or deaths from 28 countries.

The researchers used standard analytical techniques to show the increase in stroke risk for each incremental increase in pollution levels for the gases sulphur dioxide, nitrogen dioxide and ozone (all assessed per additional 10 parts per billion), as well as carbon monoxide (assessed per additional 1 part per million).

They also analysed the increase in stroke risk for each incremental increase in fine particles or coarse particles. In addition, they looked at the time lag between stroke and raised pollution levels, and the nation's income status.  

What were the basic results?

Researchers found a "robust and clear" link between gas and particle air pollution levels and admission to hospital for stroke or death from stroke. The link was weakest for ozone and strongest for sulphur dioxide.

Fine particles were more strongly linked to stroke risk than coarse particles, and the link with higher stroke risk lasted longer for high levels of polluting particles than high levels of polluting gases. The increase in relative risk of stroke for each additional increment of pollutant ranged from around 1% to 2%.

To give one example, average (median) pollution levels measured in high-income countries were around 22.6 parts per billion for nitrogen dioxide (the most commonly measured polluting gas).

The increase in stroke risk for each additional 10 parts per billion for nitrogen dioxide was 1.4% (relative risk [RR] 1.014, 95% confidence interval [CI] 1.009 to 1.019).  

How did the researchers interpret the results?

The researchers say they have demonstrated a "marked and close" association between air pollution exposure and risk of stroke. They point out that the study shows low- and middle-income countries have the highest levels of air pollution, and also a "disproportionate burden" of the numbers of strokes worldwide.

They concluded that their study provides sufficient evidence to think that environmental policies intended to reduce air pollution "might reduce the burden of stroke", considering some potential ways that pollution might affect the risk of stroke.

They say air pollution can affect the linings of the blood vessels and the nervous system. This can lead the blood vessels to constrict, blood pressure to rise and blood clots to form – all of these things might increase the chances of having a stroke.  

Conclusion

This study showed a clear link between rises in gas and particle pollution and the chances of being admitted to hospital or dying because of a stroke. The researchers showed the link was strongest on the day of exposure to raised pollution levels.

But this study has some limitations. While systematic reviews are a good way to summarise all the research that has been published on a topic, they are only as good as the individual studies they include.

About two-thirds of the studies used a time series design, which the researchers say is less effective at taking account of trends, such as the season of the year, than the more reliable case-crossover design.

It's also possible that stroke was not diagnosed correctly in some studies. In addition, the air pollution data in some studies came from monitoring sites away from the city centres where most people live. As pollution levels are higher in city centres, this would be likely to underestimate the effect of pollution.

The increase in the chances of having a stroke for any one individual, as demonstrated in this study, is small. However, people cannot usually choose to avoid exposure to air pollution, and many thousands of people are affected when pollution levels rise. According to the Stroke Association, there are around 152,000 strokes a year in the UK.

While there is little people can do to avoid air pollution on an individual level, the study provides new information that governments need to consider when setting policies likely to affect pollution.

Observational studies cannot prove beyond doubt that factors such as pollution directly cause events such as stroke. But this was a comprehensive and careful analysis where the evidence pointed in one direction.

We already know that pollution is likely to increase the risk of heart attacks, and a similar increase in risk now seems to exist for strokes.

It seems implausible that air pollution alone would trigger a stroke in a healthy individual. But a particularly heavy upsurge in pollution could be the tipping point in people with pre-existing risk factors for stroke, such as obesity and atherosclerosis (hardening of the arteries).

While much has been done to reduce levels of air pollution, it would appear there is much more we could be doing.

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

Air pollution 'link to stroke risk'. BBC News, March 25 2015

Air pollution 'linked to stroke risk'. ITV News, March 25 2015

Air pollution linked to raised stroke risk: Studies reveal smog connected to death, anxiety and hospital admissions. Mail Online, March 25 2015

Hundreds of thousands of people in stroke risk because of air pollution. Daily Mirror, March 24 2015

Air pollution can cause strokes and raise anxiety, researchers say. Daily Express, March 25 2015

Links To Science

Shah ASV, Lee KK, McAllister DA, et al. Short term exposure to air pollution and stroke: systematic review and meta-analysis. BMJ. Published online March 24 2015

Categories: Medical News

Are power naps a 'five-fold' memory booster?

Medical News - Tue, 03/24/2015 - 18:00

"A 45-minute power nap can boost your memory five-fold," reports The Independent.

This headline is based on a study that looked at the impact of napping on healthy volunteers’ ability to remember single words or word pairs in a memory test.

After being shown the words for the first time and then being tested on them, volunteers were split into two groups. The first group was allowed a 90-minute nap and the second group were made to stay awake.

It found that those who had a nap remembered a similar number of word pairs after their nap as they had before it, while those who stayed awake tended not to remember as many.

The students tended to forget some of the single words between the two tests, regardless of whether they had a nap.

There are a number of limitations to this study – particularly its small size, with just 41 participants being analysed. This may be why the researchers were not quite able to rule out the idea that the differences between the groups occurred by chance. The limitations mean that we can’t conclusively say that napping is better for memory than not napping based on this study, particularly in real-world situations.

Sleep is known to be important for memory, and there is increasing interest in the effects of napping. For example, a study we discussed earlier this year suggested that napping improves memory retention in infants.

Where did the story come from?

The study was carried out by researchers from Saarland University in Germany. Funding was provided by the German Research Foundation.

The study was published in the peer-reviewed journal Neurobiology of Learning and Memory.

The UK media tended to overplay the findings of this small study. Most of them refer to a "five-fold" improvement in memory, which appears to come from a quote from one of the study authors. The author is also quoted as saying that: "A short nap at the office or in school is enough to significantly improve learning success."

This five-fold figure does not appear to be specifically mentioned in the research paper, and the differences between the groups at the end of the study were not quite large enough to rule out the idea that they occurred by chance.

Although the headlines talk about memory "improvement", what actually happened was that performance on the memory test stayed about the same after a nap, but got worse without one. We also can’t be sure whether the simple tests used in this study are representative of routine office or school tasks.

 

What kind of research was this?

This was a randomised controlled trial (RCT) looking at the effect of a nap on specific aspects of memory.

Sleep is thought to be important for "consolidating" our memories – essentially strengthening them and making it more likely that we remember. The researchers reported that a number of studies have shown that people perform better in certain memory tasks after sleeping than after staying awake for a similar period. However, they say that the effect of naps on different aspects of memory has been studied to a lesser degree.

The researchers wanted to look at the impact of naps on "associative memory" – the ability to learn and remember the relationship between two items – such as a person’s name, which relies on a part of the brain called the hippocampus. They also assessed "item memory" – the ability to remember whether we have seen or heard things before – which does not rely on the hippocampus.

An RCT is the best way to compare the effects of different treatments or interventions – in this case, a nap and a control (watching a DVD). This is because the groups being compared should be well balanced in terms of their characteristics, meaning only the intervention should differ between them and therefore be responsible for the differences in outcome. However, in small studies such as this, even randomly assigning people may not be able to achieve balanced groups.

 

What did the research involve?

The researchers enrolled healthy young university students and tested their memory for word pairs or single words they had been shown. They then randomly allocated them to either have up to a 90-minute nap and then watch a 30-minute DVD, or just watch DVDs for two hours. After this, they tested their memories for the words again, and compared the performance of those who napped and those who stayed awake.

There were 73 students who agreed to take part in the study, but 17 were excluded because the results of their initial memory test suggested that they were just guessing. An additional 15 were excluded after the test because they performed particularly badly, had not napped when they were meant to, or napped when they were not meant to. None of the students had sleep disorders or neurological problems, and they were all paid to take part in the study.

The memory test involved showing the students 120 unrelated word pairs (for the associative memory test) and 90 single words (for the item memory test), each appearing briefly on a screen, and asking them to remember them. About half an hour later, the students were shown 60 single words and 60 word pairs, and asked if these were words or pairs they had seen before.

The students then had their nap or watched the DVDs, depending on which group they had been assigned to. The DVDs only had music and images, and not words. Those who had a nap had their brainwaves monitored. They also watched about 30 minutes of one of the DVDs after they woke up to give them a bit of time to get over any residual sleepiness. The groups then did the word test again, this time with 120 word pairs and 120 single words.

The researchers compared the performance of those who napped and those who didn’t, both before and after the nap. They also looked at whether brainwave activity during the nap predicted a person’s performance on the memory test.

 

What were the basic results?

The napping group slept for about 64 minutes, on average.

The researchers found that both those who napped and those who didn't performed worse on their second single word (item) memory test than they had on the first test, taken shortly after they first saw the words.

The group who did not nap also performed worse in their second word pair (associative) memory task than they had at the start of the study. However, those who had a nap performed similarly on the word pair memory task at the start of the study and after their nap. This suggested that the nap had helped them to retain their memories of the words. The difference between the groups in their performance on the second word pair test was near to, but not quite reaching, what would be considered statistically significant (that is, enough to have a high level of certainty that it did not occur by chance).

 

How did the researchers interpret the results?

The researchers concluded that "these results speak for a selective beneficial impact of naps on hippocampus-dependent memories".

 

Conclusion

This small study has suggested that in healthy adults, a nap of about an hour might help to retain one type of newly formed memory – associative memory of unrelated pairs of words – but not item memory of single words.

While the study’s random allocation of participants is a strength, there are limitations:

  • The study was small and only included healthy young adults. The results may not apply to other groups of people, and ideally would be confirmed in larger studies.
  • While the reduction in associative memory in the group that stayed awake was statistically significant, the difference between the napping and non-napping groups in the word pair test at the end of the study was almost, but not quite, large enough to reach this level. That is, it was not quite enough to give a high level of certainty that it did not occur by chance. This may be due to the relatively small size of the study, and again suggests that larger studies are needed.
  • Some students were excluded after they had been randomly allocated to their groups; this can lead to imbalance between the groups and affect results. Ideally, results would have been shown both with and without those students included, to see if it made a difference. Analysing all the participants in the groups to which they were assigned, regardless of what happens to them, is an approach known as "intention to treat".
  • We don’t know how long the effect of the nap would last, as participants were only assessed a short time after their nap – with the tests all happening on one day.
  • The tests were simple word-based memory tests, and naps only affected one aspect of memory. We don’t know whether the naps might make a difference in remembering more complex information or different types of memory not tested in this study.

Overall, the study by itself does not conclusively show the benefits of naps on memory in our day-to-day lives.

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

A 45-minute power nap can boost your memory five-fold, study finds. The Independent, March 23 2015

A power nap of just 45 minutes can boost the memory five-fold, according to new research. Mail Online, March 23 2015

How Power Naps Can Turbocharge Your Memory. The Huffington Post, March 23 2015

How a short nap can improve your memory five-fold. Daily Express, March 23 2015

Links To Science

Studte S, Bridger E, Mecklinger A. Nap sleep preserves associative but not item memory performance. Neurobiology of Learning and Memory. Published online February 23 2015

Categories: Medical News

'4D' ultrasound shows effects of smoking on unborn babies

Medical News - Tue, 03/24/2015 - 16:00

"Unborn baby shown grimacing in womb as mother smokes," is the somewhat misleading headline in The Daily Telegraph.

The news comes after researchers released dramatic images of babies in the womb taken using 4D ultrasound scanners. 4D scanners provide real-time moving images of babies in the womb.

Some newspapers have interpreted these images as showing distress caused by smoking. While smoking is certainly known to be harmful in pregnancy, the researchers may be reading too much into these images by claiming they show "grimaces" or expressions of pain in response to smoking.

The scans came from a small pilot study that showed differences between the movements of the unborn babies of four mothers who smoked, compared with the unborn babies of 16 non-smokers.

The paper says unborn babies touch their faces and move their mouths more in the earlier stages of their development, making these movements less often as they mature.

This study scanned babies between weeks 24 and 36 of pregnancy, and showed that babies carried by women who smoked appeared to move their mouths and touch their faces more than non-smokers' babies.

The implication is that this is a sign of slower development as a direct result of maternal smoking. But this has not been proven.

This study had a very small sample size, including just four smokers. And we don't know whether these observed differences in movement actually have any meaning in terms of the ongoing development of the unborn baby, or during infancy or childhood.

Still, we don't need any new research to tell us that smoking in pregnancy is harmful. Every cigarette you smoke contains more than 4,000 chemicals that will harm your baby.

Protecting your baby from tobacco smoke is one of the best things you can do to give your child a healthy start in life. It's never too late to stop smoking.  

Where did the story come from?

The study was carried out by researchers from Durham and Lancaster universities and James Cook University Hospital in Middlesbrough. We do not know who funded the study.

It was published in the peer-reviewed medical journal Acta Paediatrica.

Emotive images from the study were widely reproduced in the media. The Daily Telegraph suggested that the baby in the images was "grimacing" in response to cigarette smoke, while the Daily Mirror says the "dramatic pictures" show babies "suffering in the womb".

But the mother would not have been smoking in the hospital while being scanned, and we don't know the significance of the facial movements shown in the images, much less whether they represent suffering.

A case could be made that using the images to frighten mothers into quitting would be justifiable for the greater good, but it also wouldn't be entirely truthful or transparent.  

What kind of research was this?

This was a pilot observational study of a small number of pregnant women. It intended to see whether ultrasound scans could provide reliable information on subtle foetal movements (rather than asking mothers to count movements), and also see whether there are differences seen in the unborn babies of mothers who smoke.

This type of study can point to differences between different groups, but it cannot show what caused the differences. The researchers say a bigger study is needed to see if their findings are reliable and investigate them further. This study also can't tell us what the differences in movements mean for the development of the babies.

What did the research involve?

The researchers used 4D ultrasound to scan the unborn babies of 20 mothers, four of whom smoked. The babies were scanned four times from 24 to 36 weeks of pregnancy for 15 to 20 minutes at a time.

The scan is known as 4D because it provides detailed 3D-like moving images, time being the fourth dimension. 

Recordings were analysed for the numbers of times the babies moved their mouths and touched their faces.

The women filled in questionnaires at each scan to say how stressed they were feeling. They also completed a widely used depression questionnaire called the Hospital Anxiety and Depression Scale.

The ultrasound scans took images every half a second, producing detailed pictures of the babies' faces through time. Some scans were double-checked independently to confirm that the movement counts were accurate.

The researchers used standard methods to analyse the differences between smoking and non-smoking groups and how they changed over the four scans. They adjusted their results to take account of the babies' gestational age and sex, and the mothers' age, stress level and depression symptoms.

What were the basic results?

Babies whose mothers smoked (14 cigarettes a day on average) moved their mouths more often than babies whose mothers were non-smokers. This was true at the start of the study, and the gap widened as the study went on.

Babies whose mothers did not smoke reduced their number of mouth movements from their first to last scan by about 3% a week. This happened more slowly for babies whose mothers smoked, at 1.5% a week.

The results were less clear-cut for the number of times babies touched their faces. Researchers said the difference between the two groups was "borderline [statistically] significant", meaning they cannot be sure this was not down to chance.

However, the direction of the effect was similar – babies whose mothers smoked tended to touch their faces more often, and there was a decline in movement in both groups as the babies grew.

Mothers' stress levels also affected the baby's movements. Babies moved their mouths and touched their faces more often when their mothers reported higher stress levels.

All babies were born healthy, and there were no significant differences recorded between the babies born to smokers and non-smokers.

How did the researchers interpret the results?

The researchers say their study shows that the scanning technique they used provides a more sensitive way of assessing differences in babies' movements before they are born, compared with methods such as asking mothers to record how often they feel the baby move.

They say the comparison between smoking and stress levels shows that "smoking appears to be more important than stress" in terms of how it affects the baby's movements.

While they cannot be sure why the differences in face touching arise, the researchers suggest the babies may touch their faces to soothe themselves in a way that young infants have been seen to do after birth.

They also suggest the differences in mouth movements and face touching could be down to the rate at which the baby's central nervous system (brain and spinal cord) matures. Babies whose mothers smoke are thought to have slower-maturing nervous systems.

Conclusion

This pilot study looked at whether ultrasound scans could be a reliable way of assessing foetal movements. It then looked at whether movements differ between babies whose mothers smoke and those who do not.

The study found babies whose mothers smoked moved their mouths more often, and the rate at which they reduced their mouth movements was slow compared with babies whose mothers didn't smoke.

The main limitation of this study was its size – only four smokers and 16 non-smokers were included. This means the results are more likely to be down to chance than in a bigger study. We can't be sure that these results apply to all babies of smokers and non-smokers, and a bigger study is needed to confirm the results.

A further point is that if there are real differences between the movements of babies whose mothers smoke or don't smoke, we can't say exactly why these differences arise or what they mean for the baby.

The researchers have suggested reasons for the differences in movements, but this type of study is not designed to look at the reason behind the differences.

We need more research to investigate whether the differences seen in this study do represent slower development of the baby's nervous system, and whether they could have any meaning for the continued growth and development of the infant or child.

Another limitation is the potential influence of confounding – that is, any differences may not necessarily be a direct effect of smoking, but could be the result of the influence of other factors. The study did take into account the baby's gestational age and sex, and the mother's age, stress levels and depression symptoms, for instance.

However, other factors could have influenced the results, such as socioeconomic factors, whether the father smoked, or other health and lifestyle factors in the mother, such as diet, physical activity, BMI and alcohol intake. 

The photographs released to the media are from two 10-second scans of 32-week-old babies, one taken from a woman who smoked and one from a non-smoker. They are described as "illustrative". The first shows the baby covering its face with both hands, while the second shows the baby with a hand to its mouth.

The images are powerful and provoke an emotional response in most people, as the baby appears to be in distress. But it's important to bear in mind that these images may not be representative of the approximately 10 to 13 hours of scans taken. We cannot tell whether the babies pictured were distressed, contented, or showing another emotion.

Despite the limitations of this study, it is already well established that smoking during pregnancy has various harmful effects, both to mother and baby.

This small study found there may be differences in the movement of unborn babies of smoking and non-smoking mothers. Whether there are true differences, and whether they have any meaning or implications in terms of the baby or child's ongoing development, is something that needs to be examined in further larger studies.

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

Unborn baby shown grimacing in womb as mother smokes. The Daily Telegraph, March 23 2015

4D ultrasound study shows harmful effects of smoking on unborn babies. The Independent, March 23 2015

4D ultrasound scans show the harmful effects of smoking during pregnancy on babies. Daily Express, March 23 2015

Revealed, what smoking does to an unborn baby: Remarkable images of foetus holding its face show how babies exposed to cigarettes may have delayed development. Mail Online, March 23 2015

Links To Science

Reissland N, Francis B, Kumarendran K, Mason J. Ultrasound observations of subtle movements: a pilot study comparing fetuses of smoking and non-smoking mothers. Acta Paediatrica. Published online March 12 2015

Categories: Medical News

News analysis: Angelina Jolie's surgery to 'cut ovarian cancer risk'

Medical News - Tue, 03/24/2015 - 14:00

Writing in the New York Times, actress Angelina Jolie has announced she recently had her ovaries and fallopian tubes removed as tests showed she had an estimated 50% chance of developing ovarian cancer. This is because previous testing found she was carrying high-risk genes linked with ovarian as well as breast cancer.

This follows her announcement in 2013 that she had undergone a double mastectomy (where both breasts are surgically removed) followed by breast reconstruction surgery. This was because the same high-risk genes gave her an 87% chance of developing breast cancer.

Jolie explained: "I had been planning this for some time. It is a less complex surgery than the mastectomy, but its effects are more severe. It puts a woman into forced menopause. So I was readying myself physically and emotionally, discussing options with doctors, researching alternative medicine, and mapping my hormones for estrogen or progesterone replacement.

"Regardless of the hormone replacements I’m taking, I am now in menopause. I will not be able to have any more children, and I expect some physical changes. But I feel at ease with whatever will come, not because I am strong but because this is a part of life. It is nothing to be feared."

 

What genes contribute to ovarian cancer risk?

BRCA1 and BRCA2 are faulty genes linked to ovarian cancer. They're also known to increase the risk of breast cancer.

Having a family history of ovarian cancer, especially if the cancer developed before the age of 50, could mean the faulty genes run in your family.

You may be at a high risk of having a faulty gene if you have:

  • one relative diagnosed with ovarian cancer at any age and at least two close relatives with breast cancer whose average age is under 60; all of these relatives should be on the same side of your family (either your mother's OR father's side)
  • one relative diagnosed with ovarian cancer at any age and at least one close relative diagnosed with breast cancer under the age of 50; both of these relatives should come from the same side of your family
  • two relatives from the same side of the family diagnosed with ovarian cancer at any age 

If you're at a higher risk of having a faulty gene, your GP can refer you for tests to check for faulty BRCA1 and BRCA2 genes.

Ovarian Cancer Action has developed a tool to help you check whether your family history puts you at risk of ovarian cancer.

 

What does preventative surgery involve?

If testing suggests you have a high risk of developing ovarian cancer, your doctor may recommend a type of surgery called bilateral salpingo-oophorectomy. This is where both of your ovaries as well as your fallopian tubes are surgically removed.

This should significantly reduce your chance of developing ovarian cancer, though it will trigger the menopause if you have not already gone through it.

This can cause symptoms such as hot flushes and night sweats. Symptoms usually respond well to hormone replacement therapy (HRT). 

Alternative treatments are also available for women who cannot or do not want to use HRT.

What other steps can I take to reduce my risk of ovarian cancer?

Stopping ovulation and the contraceptive pill

Each time you ovulate, your ovaries are damaged by the egg as it breaks through the surface of the ovary and is released into your reproductive system.

The cells that make up the surface of your ovaries divide and multiply rapidly to repair the damage caused by the egg. It's this rapid cell growth that can occasionally go wrong and result in ovarian cancer.

Anything that stops the process of ovulation can help to minimise your chances of developing ovarian cancer. This includes:

  • taking the contraceptive pill
  • pregnancy
  • breastfeeding

Diet and lifestyle

Research into ovarian cancer has found that the condition may be linked to being overweight or obese. Losing weight through regular exercise and a healthy, balanced diet may help lower your risk of getting ovarian cancer. Aside from this, regular exercise and a healthy, low-fat diet are extremely beneficial to your overall health, and can help reduce the risk of many forms of cancer, as well as heart disease.

 

Edited by NHS Choices. Follow Behind the Headlines on Twitter.

Links To The Headlines

Angelina Jolie Pitt: Diary of a Surgery. New York Times, March 24 2015

Angelina Jolie reveals she had ovaries removed after cancer scare. The Guardian, March 24 2015

Jolie Has Ovaries Removed After Cancer Scare. Sky News, March 24 2015

Angelina Jolie has ovaries and fallopian tubes removed. BBC News, March 24 2015

Categories: Medical News

Blood test could provide an early arthritis warning

Medical News - Mon, 03/23/2015 - 17:50

"Arthritis breakthrough as new test diagnoses condition up to a decade earlier," the Mail Online reports. The test measures proteins linked with arthritis.

The study aimed to see whether a blood test could be developed that could distinguish between different types of early stage arthritis.

The study included groups of people with established diagnoses, including those diagnosed with early-stage osteoarthritis (so-called "wear and tear arthritis") and rheumatoid arthritis (caused by the immune system).

It then measured and compared levels of different proteins in their blood.

Overall, it found that looking at a combination of the levels of three proteins in the blood could distinguish between the different types of early-stage arthritis. This suggested such a test could have promise.

This is still early-stage research. Further study needs to look at whether this test is reliable for identifying and distinguishing between the different forms of early-stage arthritis in practice.

Most importantly, it needs to be seen whether use of the test leads to earlier treatment, and whether this leads to an improvement in patient outcomes.

 

Where did the story come from?

The study was carried out by researchers from the University of Warwick and other institutions in the UK. No sources of funding were reported. Some of the authors have a patent based on this work.

The study was published in the peer-reviewed scientific journal Scientific Reports.

The Mail’s headline is premature, as we do not know how accurate this test will prove to be on further study or whether it would be introduced. The subheading stating “There is currently no test, meaning some patients are only diagnosed when disease is so progressed that surgery is the only option” is also a little overdramatic and inaccurate. This reporting makes it sound as though osteoarthritis currently has no diagnosis and management pathways in place, which is not the case. Osteoarthritis is usually diagnosed based on a person’s symptoms, examination findings and X-ray findings.

 

What kind of research was this?

This was laboratory research, which aimed to develop a blood test to allow the detection and differentiation between different types of early-stage arthritis.

Blood tests are already used to help diagnose or exclude certain types of arthritis, such as rheumatoid arthritis, which is linked to having particular proteins and inflammatory markers in the blood. However, osteoarthritis (OA) has no diagnostic blood test. OA is a degenerative joint condition, where the cartilage covering the ends of the bones becomes worn and thin, causing symptoms including pain, stiffness, swelling and crunching feelings in the joints.

It is currently diagnosed based on a combination of a person’s symptoms and findings from a clinical examination. X-rays can also detect characteristic changes to the joints, though these are often not present in early stages of the disease.

This study aimed to look at if there were any biochemical markers that could be detected in the blood that would help diagnose early-stage OA and distinguish it from other types of arthritis. Ideally, a diagnosis could be made before any of the more advanced joint changes set in, which could be detected by X-ray.

 

What did the research involve?

This study included groups of people (181 people in all) with different established diagnoses:

  • advanced OA
  • early OA
  • advanced rheumatoid arthritis (RA)
  • early RA
  • early non-RA inflammatory arthritis – people with early symptoms of an inflammatory arthritis, but not having the diagnostic features of RA
  • a healthy control group with no joint problems

The researchers took blood samples from these people and samples of the fluid in the joints (synovial fluid) from those with early-stage arthritis. They used advanced laboratory techniques to measure the amount of different proteins in these fluids. They particularly looked at the amount of:

  • anti–cyclic citrullinated peptide (CCP) antibodies – a marker for RA
  • citrullinated protein – a marker for inflammation
  • hydroxyproline – a building block that is part of the protein collagen – a structural protein found in cartilage and bone

They compared the levels of these markers in people from the different groups. They also assessed whether looking for a particular combination of levels of these markers would allow them to tell the different groups apart.

 

What were the basic results?

The researchers found that compared to healthy controls, blood levels of citrullinated proteins were increased in people with early OA and early RA. Generally, people with early arthritis tended to have higher levels of these proteins in the blood, while in advanced disease, levels were lower in the blood and higher in the joint fluid.

Levels of citrullinated proteins were not increased in people with other non-RA early-stage inflammatory arthritis.

Anti–CCP antibodies were found mainly in the blood of people with early RA.

Compared to healthy controls, increased levels of hydroxyproline were found in people with early OA and early non-RA inflammatory arthritis, but not in people with early RA.

The researchers found that looking at the levels of all three proteins enabled them to discriminate between people with early OA, early RA, other non-RA early inflammatory arthritis, and healthy joints. This combination test correctly identified:

  • 73% of people with early OA
  • 57% of people with early RA
  • 25% of people with non-RA early inflammatory arthritis
  • 41% of people with healthy joints

The test also correctly identified:

  • 87% of people who did not have early OA
  • 91% of people who did not have early RA
  • 76% of people who did not have non-RA early inflammatory arthritis
  • 75% of people who did not have healthy joints

 

How did the researchers interpret the results?

The researchers say their study provides a novel biochemical blood test that could be used for the diagnosis and discrimination of early-stage arthritis. They say that this could help to support improved treatment and patient outcomes.

 

Conclusion

This laboratory study suggests that for people presenting with early joint symptoms, examining blood levels of a combination of proteins could help to distinguish people who have early-stage OA from those who have early-stage RA or other inflammatory arthritis. 

However, this study is in the early stages and so far has only looked at relatively small samples of people with confirmed diagnoses of these different conditions. A lot of further work needs to be done to examine the accuracy of such a blood test, and to see whether it could reliably identify and distinguish between people with these conditions presenting to doctors in real world practice. These studies should assess whether it offers an improvement on the current approach to diagnosis based on symptoms, clinical examination, imaging findings and other blood tests currently used – such as measurement of inflammatory markers, rheumatoid factor, or anti-CCP antibodies.

Even if such studies find that the test performs well, it is likely that it would not replace all other diagnostic tests, instead being used in combination with other methods, especially as it performed better at detecting some forms of arthritis than others.

Most importantly, it also needs to be seen whether using this blood test as a diagnostic method would actually lead to improved disease outcomes for people with arthritis, as suggested in the news reports.

While several of the risk factors associated with OA are unavoidable (e.g. increasing age, female gender, previous joint damage or abnormalities), maintaining a healthy weight and staying active could help prevent onset of the disease. RA is an autoimmune disease (where the body’s own immune cells attack the joints) with no established cause. However, smoking is associated with the development of the condition.

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

Arthritis breakthrough as new test diagnoses condition up to a decade earlier - with just a single drop of blood. Mail Online, March 22 2015

DISCOVERY of proteins could lead to diagnosis of arthritis up to ten years before symptoms. Daily Express, March 22 2015

Links To Science

Ahmed U, Anwar A, Savage RS, et al. Biomarkers of early stage osteoarthritis, rheumatoid arthritis and musculoskeletal health. Scientific Reports. Published online March 19 2015

Categories: Medical News

Climate change 'might bring rise in UK mosquito-borne diseases'

Medical News - Mon, 03/23/2015 - 14:00

"Mosquitoes heading for warmer UK," Sky News reports after a new review predicted climate change will make the UK a more hospitable environment for disease-carrying mosquitoes and ticks, leading to an outbreak of conditions normally seen in more tropical climates.

In the review, two authors searched the literature to identify evidence looking at the effect climate change in Europe could have on diseases carried by mosquitoes or other insects, such as ticks.

Mosquitoes thrive in warm and wet environments, so a rise in the average temperature could make the UK a more attractive destination. This could then lead to an increase in three diseases – malaria, dengue fever and chikungunya (a viral infection with symptoms similar to malaria) – in the UK by as early as 2030.

A review of this kind can only provide an estimate and cannot predict the future with 100% accuracy. However, it does show the potential public health dangers that could arise from climate change: a rise in the average temperature by just a few degrees centigrade could have a range of unpredictable effects on our environment.

Where did the story come from?

This study was written by two researchers from the Emergency Response Department, Public Health England (PHE), Porton Down. PHE is the NHS body responsible for protecting and improving public health in England.

One of the researchers received partial funding from the National Institute for Health Research's Health Protection Research Unit.

The study was published in the peer-reviewed medical journal Lancet Infectious Diseases.

What kind of research was this?

This was a literature review, where the researchers identified and discussed research on the effect climate change could have on the risk of vector-borne disease in the UK. Vector-borne disease is disease carried by a non-human organism (such as mosquitoes or ticks) that is then transmitted to humans.

The researchers searched literature databases to identify any published papers that had looked at vector-borne disease in Europe, and focused on reports that were potentially relevant to the UK.

They present a discussion of the issue and the evidence they identified. They also make various recommendations on monitoring and studying these vector-borne diseases, including how they are impacted by weather and climate.

The researchers clearly state that the views expressed in the article are theirs and "not necessarily those of the National Health Service, the NIHR, the Department of Health, or PHE".

What do the researchers say about mosquito-borne disease and climate change?

Insects regulate their body temperature by taking in heat from the environment. This means that increases in temperature could help them survive and incubate, thereby spreading any disease-causing organisms they carry, such as parasites, bacteria and viruses.

The researchers present evidence that has looked at the effects that 2C, 4C or 6C rises in average temperature could have on the vectors carrying the following pathogens:

  • the malaria parasite
  • the dengue fever virus
  • the chikungunya virus

Of these pathogens, malaria is the one that some (but not all) of the most extreme modelling scenarios suggest could be present in the UK as early as 2030.

Climate assessment has suggested one type of mosquito that spreads dengue fever and chikungunya could theoretically live in warmer parts of the UK, and that by 2030 the climate could be even better suited to this mosquito.

What do they say about malaria?

The researchers explain how malaria was regularly found in certain parts of the UK in the 1800s. The UK still has several species of mosquito capable of carrying the malaria parasite, albeit the less severe kind (Plasmodium vivax).

However, the researchers say rising summer temperatures could also support the development of the more severe malaria parasite (Plasmodium falciparum).

One group of researchers has modelled the effect climate change could have on P. falciparum. They estimate there will be increases in temperature of between 1.5C and 5C between 2030 and 2100. Sustained transmission of the malaria parasite is still unlikely at these temperatures.

However, one of the most extreme model scenarios they looked at suggested there could be sustained transmission of the parasite (lasting at least one month of the year) in southern England by 2080 or, to a lesser extent, even as early as 2030.  

But, as the researchers say, antimalarial drugs and the UK health system should be able to minimise transmission.

What do they say about dengue fever and chikungunya?

The researchers say that since 1990, five different tropical species of mosquito have become adapted to the temperate climate of Europe. These species are potential vectors of the tropical diseases dengue, chikungunya and yellow fever.

In the past decade, there have been cases where one of these tropical mosquito species has been implicated in outbreaks of chikungunya and dengue in southern France, Italy and Croatia.

Climate change is predicted to permit the expansion of this species across Europe, including the south of the UK.

If these mosquitoes do become established in the UK, people with dengue or chikungunya who travel to the UK would then be a source of infection for the established mosquitoes.

Ongoing transmission would then depend on the local climate conditions controlling mosquito populations.

Two models suggested that by 2030-50, the climate in southern England could be more suitable for one species of mosquito that carries chikungunya and dengue.

Models also predicted transmission periods of one month to be possible in London by 2041, and one to three months of activity possible in southern England by 2071-2100.

What do the researchers conclude?

The researchers make the following recommendations about how the potential threat from vector-borne disease could be managed:

  • Continue to enhance UK surveillance of endemic and non-native vectors.
  • Improve understanding of the effect of climate change and develop strategies to deal with changing public health risks in a changing environment (such as wetland management).
  • Better understand the effect of extreme weather events (such as flooding and drought) on the risk of infectious disease, and work with environmental organisations to develop management plans to prepare for a disease outbreak resulting from an extreme event.
  • Develop improved models that incorporate the many drivers for change (such as climate and land use) for a range of vector-borne diseases.
  • Continue to work collaboratively across Europe, sharing data on vector-borne diseases and risk assessment.

Conclusion

Overall, this review provides insights into how climate change might lead to the transmission of tropical diseases in what are currently temperate parts of the world, such as the UK. Predicting what may happen in the future can help countries make sure they are prepared for such an eventuality.

This review was informed by a search for relevant literature, but may not have captured or included all relevant studies. Most of the studies were modelling studies, which are reliant on various assumptions that may or may not turn out to be correct.

It's not possible to say with any certainty what will occur in the future. The authors also note that climate change is not the only factor affecting vector-borne diseases.

Many other factors are equally important, such as socioeconomic development and changes in how land is used. This adds to the difficulty in predicting exactly how much climate change might impact these diseases.

Analysis by Bazian. Edited by NHS Choices.

Links To The Headlines

Mosquitoes Heading For Warmer UK - Experts. Sky News, March 23 2015

Mosquitoes 'could bring exotic diseases to UK'. BBC News, March 23 2015

Plague of mosquitos carrying deadly diseases is headed for Britain, scientists warn. The Daily Telegraph, March 23 2015

Warmer weather means mosquitoes carrying killer diseases including dengue fever 'could be in the UK by 2030'. Mail Online, March 22 2015

Links To Science

Medlock JM, Leach SA. Effect of climate change on vector-borne disease risk in the UK. The Lancet Infectious Diseases. March 23 2015. (Article not yet online)

Categories: Medical News

Research casts doubt on aspartame sensitivity

Medical News - Fri, 03/20/2015 - 13:31

"Sweetener linked to cancer is safe to use," reports the Mail Online.

Aspartame – a commonly used artificial sweetener – has been dogged by controversy, despite being deemed safe by food regulators in the UK, EU and US.

Some believe they are sensitive to the sweetener. Anecdotal reports suggest it can cause headaches and stomach upsets.

This study recruited 48 "aspartame-sensitive" individuals and tested whether giving them a cereal bar with or without aspartame would elicit the suspect symptoms. The study was a gold-standard double blind randomised controlled trial (RCT), meaning neither the participants nor those analysing the results knew which bar they had eaten. This made it a fairer and more rigorous test.

It showed that there was no difference in the symptoms reported after eating the aspartame-laced bar compared with the normal bar.

This provides evidence that aspartame fears may not be warranted in some people who believe they are sensitive to the ingredient. However, the study may have failed to recruit those most fearful of the sweetener, so we can’t rule out aspartame-related symptoms in this group.

This study also can’t tell us whether regular aspartame consumption may have any health effects in the longer term.

To find out more, read "The truth about aspartame".

 

Where did the story come from?

The study was carried out by researchers from the University of Hull, the Food Standards Agency (FSA), Imperial College London, University College Dublin, Institute of Food Research (UK) and Weill Cornell Medical College, Qatar.

It was funded by the Food Standards Agency.

The study was published open-access in the peer-reviewed medical journal PLOS One, meaning it is free to view and download.

The Mail Online reported the story accurately. However, in stating that aspartame does not cause harm, it would have been better to make clear that this study only looked at short-term effects. The study also had nothing to do with verifying aspartame's safety with regard to cancer, despite what the headline may lead you to believe.

 

What kind of research was this?

This was a double-blind randomised controlled crossover study looking at whether aspartame causes any harmful symptoms in people who report sensitivity to it.

Aspartame is a commonly used artificial sweetener that is around 200 times sweeter than normal sugar. Since its introduction in the 1980s, there have been concerns over whether aspartame is safe. There are many anecdotal reports of it causing stomach upsets, headaches and other problems. However, this concern doesn’t match the evidence.

Aspartame has been approved as a safe food ingredient after assessment of the evidence by regulators in the UK, EU and US, all of which have independently assessed the best available evidence. Despite the regulatory assurance, some people report they are sensitive to aspartame and are convinced that it causes them problems. The current study wanted to investigate this "aspartame-sensitive" group, to see if the claims were true.

A double-blind RCT like this is the gold standard of single-study research. It is one of the best ways to investigate whether aspartame is affecting people who report being sensitive to it. Neither the study participants nor those analysing the results knew whether they were consuming aspartame. This helps to eliminate bias caused by preconceived ideas of whether it is harmful or not. The only thing more convincing in the evidence stakes than an RCT like this is a meta-analysis of many of them.

 

What did the research involve?

Researchers gave 48 UK adults who said they were sensitive to aspartame two cereal bars, at least one week apart. One of the bars was laced with 100mg aspartame. This is equivalent, the researchers say, to the amount found in a can of diet fizzy drink. The other was a normal cereal bar. After eating each bar, standard questionnaires were used to assess psychological condition, and 14 symptoms were rated repeatedly over the next four hours. Blood samples were also taken immediately after eating and four hours later – the same was done for urine samples, but at four-, 12-, and 24-hour intervals.

One of the cereal bars was laced with aspartame and one was not. However, neither the participant nor the person analysing the results knew which was which, making the test more objective and eliminating many sources of bias.

Individuals volunteering were classified as "aspartame-sensitive" if they reported suffering one or more symptoms on multiple occasions, and as a consequence were actively avoiding consumption of any aspartame in their diet.

A further 48 people who didn’t report aspartame sensitivity (controls) repeated the same experiment under the same conditions. This group was chosen to match the characteristics of the aspartame-sensitive group in terms of age and gender. The aspartame-sensitive group had 21 men and 31 women; the control group had 23 men and 26 women. The groups did not differ significantly for age (around 50), weight, BMI, waist or hip circumference.

The 14 aspartame sensitivity symptoms assessed were:

  • headache
  • mood swings 
  • hot or flushed 
  • nausea 
  • tiredness 
  • dizziness
  • nasal congestion 
  • visual problems 
  • tingling
  • bloating 
  • hunger 
  • thirst 
  • happiness
  • arousal

The researchers’ main analysis looked for differences in symptoms after eating the aspartame-laced bar in those reporting aspartame sensitivity, compared with those reporting no sensitivity.

 

What were the basic results?

The main finding was that none of the rated symptoms differed between aspartame and control bars, or between sensitive and control participants.

They also found aspartame and control bars affected levels of chemicals in the blood (GLP-1, GIP, tyrosine and phenylalanine levels) equally in both aspartame-sensitive and non-sensitive subjects.

However, there were intriguing differences between the aspartame-sensitive group and the aspartame non-sensitive group. For example, the aspartame-sensitive people rated more symptoms, particularly in the first test session, whether this was after eating the placebo bar or the aspartame bar.

The two groups also differed psychologically in how they handled feelings and perceived stress.

 

How did the researchers interpret the results?

The authors’ conclusion was firm: "Using a comprehensive battery of psychological tests, biochemistry and state of the art metabonomics, there was no evidence of any acute adverse responses to aspartame.

"This independent study gives reassurance to both regulatory bodies and the public that acute ingestion of aspartame does not have any detectable psychological or metabolic effects in humans."

 

Conclusion

This study shows that an aspartame-laced cereal bar caused no more adverse symptoms than a bar without aspartame in a group of people who said they were sensitive to aspartame. It also caused no more adverse symptoms in a control group of people who did not consider themselves sensitive to aspartame.

The effects were monitored up to four hours after eating. This provides compelling evidence that aspartame doesn’t cause any short-term symptoms, even in people who think they are particularly susceptible to it, and report avoiding it as a result.

Limitations of the study include some missing symptom data, as not everyone completed the rating scales after eating the bars. However, you might expect someone with symptoms to fill them in, so missing data may signal a lack of symptoms. The sample size of around 90 participants was also relatively small; a larger sample would have increased confidence in the results.

The study authors reported problems recruiting participants, which brings us to the biggest limitation to consider. They anticipated recruiting 48 aspartame-sensitive people within a year, but it took 2.5 years, despite high-profile media coverage. Far more non-aspartame-sensitive people (147 individuals) volunteered before even one aspartame-sensitive individual took part. The researchers say this may reflect a genuine fear of aspartame consumption among sensitive individuals. Consequently, the 48 who participated may not be representative of people who believe they are aspartame-sensitive, as it proved impossible to recruit those most fearful, who avoided taking part.

A further limitation is that the study looked only at short-term effects and cannot exclude the possibility of long-term, cumulative effects of aspartame on biological parameters and on a person’s psychological state. The dose given was also reported to be smaller than the daily intake of many individuals, but was greater than the intake at which the people reporting aspartame sensitivity believe they suffer symptoms.

Overall, this study provides evidence that aspartame fears may not be warranted in some people who believe they are sensitive to the ingredient. However, the study probably failed to recruit those most fearful of the sweetener. We don’t know if this group have symptoms caused by aspartame.

The conclusions of this study, and aspartame’s approval by food safety agencies in the US, UK and EU, provide quite robust reassurance that aspartame is safe for the vast majority of people. As with any ingredient, you can’t say for sure that some individuals won’t react badly to it. However, the findings from this study suggest this may be a perception of harm that is not necessarily borne out when tested rigorously.

The FSA website says that in December 2013, the European Food Safety Authority (EFSA) published an opinion on aspartame: "following a full risk assessment after undertaking a rigorous review of all available scientific research on aspartame and its breakdown products, including both animal and human studies. The EFSA opinion concluded that aspartame and its breakdown products are safe for human consumption at current levels of exposure".


Links To The Headlines

Controversial sweetener aspartame found in fizzy drinks and diet products 'does NOT cause harm', report declares. Mail Online, March 20 2015

Links To Science

Sathyapalan T, et al. Aspartame Sensitivity? A Double Blind Randomised Crossover Study. PLOS One. Published March 18 2015


Are half of all children's teeth really rotten?

Medical News - Fri, 03/20/2015 - 13:30

"Rotten teeth are secret reason why teens don't smile," revealed The Times today.

The Daily Mirror expressed shock over revelations that, "More than a quarter of British children are afraid to smile because they have such bad tooth decay".

It explained how "poverty and sugar" were to blame after evidence emerged that the poorest in British society are "twice as likely" to suffer from oral disease.

Public Health England's director of dental public health, Dr Sandra White, said the survey highlighted "the need to urgently reduce the amount of sugary snacks and drinks in our children's diets".

She went on: "Fluoride is indisputable in preventing tooth decay, and by brushing teeth using fluoride toothpaste and also introducing water fluoridation where needed, we can significantly improve our children's dental health."

Quoted in The Independent, Professor Damien Walmsley, scientific adviser to the British Dental Association, said: "These inequalities [between rich and poor] are persistent, but avoidable, and both parents and government must accept their share of responsibility."

What is the basis for these current reports?

These figures were revealed in the latest of five large-scale Children's Dental Health Surveys covering England, Wales and Northern Ireland. These surveys have been carried out every 10 years since 1973 to monitor the oral health of the nation.

The new survey report outlines changes in oral health since the last survey in 2003, and provides information on the distribution and severity of oral diseases and conditions in 2013.

The latest survey gives estimates of the dental health of 5-, 8-, 12- and 15-year-olds using data collected on a random sample of children by NHS dentists and nurses at dental examinations in schools.

Information on the children's experiences, perceptions and behaviours relevant to their oral health was collected from parents and 12- and 15-year-old children using self-completion questionnaires.

What did it find?

There was some good news. There were reductions in the extent and severity of tooth decay present in the permanent teeth of 12- and 15-year-olds overall in England, Wales and Northern Ireland in the last 10 years (2003 to 2013).

Decay was found in around a third of 12-year-olds (down from 43% in 2003) and almost half of 15-year-olds (46%, down from 56% in 2003). Around a third of 5-year-olds and almost half of 8-year-olds were found to have decay in their milk teeth.

However, large proportions of children continue to be affected by poor oral health. This directly affects their lives in significant and serious ways, such as not wanting to smile or problems eating food.

Children from poor families (judged by eligibility for free school meals) were more likely to have oral disease than other children of the same age:

  • a fifth (21%) of the 5-year-olds from poor backgrounds had severe or extensive tooth decay, compared with 11% of 5-year-olds whose families were richer
  • a quarter (26%) of the 15-year-olds from poor backgrounds had severe or extensive tooth decay, compared with 12% of 15-year-olds who were not eligible for free school meals

Oral health affected the health and wellbeing of older children and their families, too:

  • a fifth of 12- and 15-year-olds (22% and 19% respectively) reported experiencing difficulty eating in the past three months
  • more than a third (35%) of 12-year-olds and more than a quarter (28%) of 15-year-olds reported being embarrassed to smile or laugh because of the condition of their teeth
  • 58% of 12-year-olds and 45% of 15-year-olds reported that their daily life had been affected by problems with their teeth and mouth in the past three months

The majority of older children were positive about their oral health. About half of 12-year-olds (51%) and 60% of 15-year-olds were satisfied with the appearance of their teeth.

Problems with oral health also had an impact on families:

  • nearly a quarter (23%) of the parents of 15-year-olds said they had taken time off work in the last six months because of their child's oral health, as did 7% of parents of 5-year-olds

Severe or extensive decay was seen in around one in seven of the 5- and 15-year-old children (13% and 15% respectively).

How does this news affect me?

This study is a stark reminder to children, young people and parents of the importance of good oral health from the time a baby gets their first milk teeth. Poor oral health can have a significant impact on a person's life.

Today's report does not suggest or endorse ways to combat the issues raised, but we have covered stories in the past that offer potential solutions for discussion.

These include adding fluoride to water to prevent tooth decay, more consistent advice on how to brush teeth, and teaching and supervising tooth brushing in schools.

Read more about preventing tooth decay.


Links To The Headlines

Half of children have tooth decay - with poverty and sugar blamed for epidemic. Daily Mirror, March 19 2015

Nearly half British children and teenagers have tooth decay. The Independent, March 19 2015

Almost HALF of eight year olds and a third of five year olds have fillings or missing teeth because of decay. Daily Mail, March 19 2015

'Half of eight-year-olds have tooth decay'. BBC News, March 19 2015

Half of children have tooth decay, national survey finds. The Times, March 19 2015


Following UK dietary advice may cut heart disease risk

Medical News - Thu, 03/19/2015 - 16:00

"Sensible diet cuts heart attack risk in months," The Times reports after a randomised controlled trial found evidence that following current UK diet guidelines can reduce cardiovascular disease risk factors such as blood pressure and cholesterol levels.

We know that being a healthy weight and not smoking can help lower the risk of cardiovascular diseases such as heart attacks and strokes, but the evidence that healthy people benefit from low-salt, low-fat diets is weaker.

One of this study's strengths is its randomised design, which is actually uncommon with diet studies. This helps reduce the possibility that other factors influenced the results. Limitations include the fact this was a small study (165 participants) carried out over a relatively short time period.

Blood pressure and cholesterol measures are good indicators of the chances of someone having a heart attack or stroke, but not as reliable as waiting to see whether people in the study actually did so. It would be difficult (and possibly unethical) to do a dietary study that lasted long enough to show these outcomes.

The study suggests that if healthy middle-aged people follow current UK dietary recommendations, there may well be benefits, but we can't be sure of the size of the protective effect.

Where did the story come from?

The study was carried out by researchers from King's College London and was funded by the UK Food Standards Agency, the Department of Health, and the National Institute for Health Research.

It was published in the peer-reviewed medical journal The American Journal of Clinical Nutrition. It has been published on an open access basis, so it is free to read online.

A number of the authors have worked, or are currently working, for food manufacturers and medical companies, which could represent a conflict of interest.

The UK media reported the study with enthusiasm, with the Daily Mirror describing fruit and veg as a "lifesaver".

The Guardian did a good job of reporting the different outcomes, but did not report that some of the key measures did not show any improvement.

But none of the papers questioned how the overall "one-third" reduction figure mentioned by the researchers was calculated.

What kind of research was this?

This was a randomised controlled trial that compared the effects of following two types of diet.

One was based on a nutritionally balanced standard UK diet. The other followed current UK dietary guidelines, which recommend reduced salt, saturated fat and sugar intake, and increased consumption of oily fish, fruit, wholegrains and vegetables.

Randomised controlled trials are a good way of comparing the true effects of a treatment or diet. However, the 12-week study could only look at the effects the diets had on markers such as blood pressure and cholesterol levels, but not long-term outcomes such as heart disease and stroke.

What did the research involve?

The researchers recruited 165 healthy non-smoking UK volunteers aged 40 to 70. They all had health checks at the start of the study, and were then split randomly into two groups. One group was asked to follow a standard UK diet while the others followed a diet based on healthy eating guidelines.

After 12 weeks, the health checks were repeated and the researchers looked for differences in blood pressure, cholesterol and other measures of heart attack risk that could have been caused by the different diets.

People selected to be in the study had an average risk of having a heart attack or stroke in the next 10 years.

The researchers ensured health check measurements, such as blood pressure, were reliable by using 24-hour blood pressure monitors rather than just taking one-off measurements.

Volunteers also had urine tests throughout the study so the researchers could estimate how well they were sticking to their allotted diets by checking their nutrient levels.

The dietary guidelines group were given dietary advice to help them reach the salt, fat, sugar and other targets in the healthy guidelines, and were advised to choose low-fat dairy products and lean cuts of meat.

The standard group were advised to eat a balanced "British" diet with no salt or sugar restrictions, based on bread, pasta and rice, potatoes with meat, limited oily fish, and wholegrain cereals. They were asked to eat full-fat dairy products.

Both groups were asked to limit their intake of sweets, cakes, biscuits and crisps, and to drink alcohol within safe limits.

Before the study began, the researchers agreed to look for three main outcomes, which they said would indicate an important change in people's heart attack risk. These were:

  • a reduction in daytime systolic blood pressure of 4mmHg (the higher blood pressure figure, which shows the pressure of the blood when it is pumped out of the heart)
  • a 5% change in the ratio of total cholesterol to HDL (or "good") cholesterol
  • a 1% reduction in blood vessel stiffness (flow mediated dilation)

While they reported on many other outcomes in the study, these are the key ones to look at. The researchers reported the effects of treatment as the comparison between the diet groups at the end of the study, adjusted to take account of differences between the participants before the study began.

However, it is not clear from the study how the researchers calculated the overall reduction in risk of having a heart attack or stroke from all the changes combined.

What were the basic results?

The main result was that people who followed healthy dietary recommendations reduced their daytime blood pressure measurement by an average of 4.2mmHg compared with the standard diet group, which was more than the researchers had hoped.

However, the average change in cholesterol ratio was less than expected – 4%, below the hoped-for 5%. There was no significant difference in change in blood vessel stiffness.

The people following the dietary recommendations lost weight compared with the standard diet group (average difference 1.9kg), even though that was not the intention of the study.

How did the researchers interpret the results?

The researchers said the change in blood pressure was "remarkable" and would "suggest" a reduction in the risk of having a fatal stroke of 54%, as well as a 39% drop in the risk of getting heart disease, for people following the healthy diet, depending on age.

They attribute about half of the drop in blood pressure to the effect of eating less salt. They say the change in cholesterol levels for the dietary guidelines group, although "modest compared with drugs such as statins", would still reduce the risk of heart disease by about 6%.

They concluded that, "selecting a diet consistent with current dietary guidelines compared with a traditional United Kingdom dietary pattern" would be likely to cut the chances of having a heart attack or stroke for people in the general population by 30% based on previous research.

Conclusion

This study showed that following dietary recommendations closely for 12 weeks can reduce blood pressure by a significant amount, which is likely to cut the chances of having a heart attack or stroke for an average healthy middle-aged person. The diet also affects cholesterol levels, but the overall effect of this may be modest.

The study appears to have been carefully conducted to avoid biasing the results. The researchers gave butter or margarine spread and cooking oil to people in both groups, for example, and asked everyone to fill out food diaries, as well as taking urine samples for nutrient analysis.

This may have improved the chances of people sticking to the diet they were allocated to. The methods used to analyse blood pressure and other health checks were rigorous and likely to produce reliable results.

However, it is disappointing that the study report is not clear about how the researchers reached the headline figure of a one-third drop in the risk of a heart attack or stroke.

The report includes much detail about changes to individual risk factors, such as different ways to measure cholesterol, but does not explain how the researchers calculated the overall risk reduction.

That said, this is a well-conducted study that offers good-quality evidence of the effects of following the current UK dietary recommendations.


Links To The Headlines

Sensible diet cuts heart attack risk in months. The Times, March 19 2015

Diet with more fruit, fish and nuts cuts heart attack risk, say researchers. The Guardian, March 18 2015

Healthy diet means heart attack risk reduction says study. ITV News, March 19 2015

Links To Science

Reidlinger DP, Darzi J, Hall WL, et al. How effective are current dietary guidelines for cardiovascular disease prevention in healthy middle-aged and older men and women? A randomized controlled trial. The American Journal of Clinical Nutrition. Published online March 18 2015


New blood test could help prevent antibiotic misuse

Medical News - Thu, 03/19/2015 - 14:40

"A new blood test can help doctors tease out whether an infection is caused by a bacteria or a virus within two hours," BBC News reports. The test, which looks at protein pathways in the blood, could help to appropriately target the use of both antibiotics and antivirals.

In many cases, it is unclear whether a person's symptoms are caused by a viral or bacterial infection, and current testing can take several days to find out.

In cases of severe illness, antibiotics are usually prescribed while waiting for the results, and this can contribute to antibiotic resistance.

Israel-based researchers who developed the test evaluated it in 1,002 children and adults who had been admitted to hospital. The test was good at distinguishing between viral and bacterial infections, and at separating people with and without an infectious disease.

However, it needs to be tested in greater numbers of people to confirm its effectiveness, and it has not yet been used to influence treatment decisions. Further research, including randomised controlled trials, will be required before it could be used in a clinical setting.

Where did the story come from?

The study was carried out by researchers from several institutes and medical centres in Israel. It was funded by MeMed, a company based in Israel that designs and manufactures diagnostic tests. Most of the researchers were employed by MeMed and some reported owning stock options with the company.

The study was published in the peer-reviewed medical journal PLOS One. It is published on an open-access basis, so is free to read online.

The research was accurately reported by BBC News.

What kind of research was this?

This was a laboratory study, which used blood samples from a cohort of patients admitted to hospital. It aimed to develop a blood test that could distinguish between viral and bacterial infections.

Overuse or incorrect use of antibiotics leads to the inadvertent selection of bacteria that have resistance to them. Over time, the resistant bacteria can become more common, making the drugs less useful.

This is causing global concern, as infections that have been easy to treat with antibiotics may now emerge as serious, life-threatening conditions. One driver is the use of "broad spectrum antibiotics", which are often prescribed when an infection is suspected but before microbiological results can identify the exact type of infection. This means some people will be given the wrong antibiotic, too many antibiotics, or an antibiotic for an illness caused by a virus, against which it will be ineffective.

Current tests that can be speedily obtained when an infection is suspected include non-specific markers of infection and the number of different white blood cells. These cells are specialised to fight different types of infections, with neutrophils mainly fighting bacteria and lymphocytes mainly fighting viruses. However, the interpretation of these tests is not straightforward, as both can be increased in each type of infection.

The researchers wanted to develop a test that could show whether an infection is bacterial or viral, so that fewer people would be given antibiotics unnecessarily.

What did the research involve?

The researchers took blood samples from 30 people and measured a number of proteins that are produced by the immune system in response to bacterial or viral infections. They used this information to create a blood test that measured these proteins. They then tested how accurate it was in 1,002 children and adults admitted to hospital with or without a suspected infection.

They used a systematic literature review to identify 600 proteins that can increase during bacterial and viral infections. Using samples from 20 to 30 people, half of whom had a viral infection and half a bacterial infection, they whittled down the number of proteins that are distinctly raised in each type of infection to 86. They then looked at the level of these proteins in 100 people, half with each infection, and found that 17 of the proteins were the most useful. Using statistical programmes, they chose three proteins for their final test. These were:

  • CRP (C-Reactive Protein) – a protein that increases in response to tissue injury, infection and inflammation; this is routinely used in clinical practice
  • IP-10 (Interferon gamma-induced protein-10)
  • TRAIL (Tumour necrosis factor [TNF] related apoptosis-inducing ligand)
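The whittling-down process described above – ranking many candidate proteins by how well their levels separate bacterial from viral samples, then keeping the best few – can be illustrated with a simple sketch. The protein values, the two made-up proteins (protein_D, protein_E) and the scoring rule below are invented for illustration; the study's actual statistical selection was more sophisticated.

```python
import random
import statistics

random.seed(0)

# Hypothetical candidates: the first three separate the groups well in this
# simulation, the last two (invented names) do not.
proteins = ["CRP", "IP-10", "TRAIL", "protein_D", "protein_E"]

def simulate_level(protein, infection):
    # Made-up group means for the sketch only.
    separation = {"CRP": 3.0, "IP-10": 2.5, "TRAIL": 2.0,
                  "protein_D": 0.2, "protein_E": 0.1}[protein]
    centre = separation if infection == "bacterial" else -separation
    return random.gauss(centre, 1.0)

# 50 simulated bacterial and 50 simulated viral samples.
samples = [(inf, {p: simulate_level(p, inf) for p in proteins})
           for inf in ["bacterial"] * 50 + ["viral"] * 50]

def discrimination_score(protein):
    bact = [levels[protein] for inf, levels in samples if inf == "bacterial"]
    viral = [levels[protein] for inf, levels in samples if inf == "viral"]
    combined_sd = statistics.stdev(bact + viral)
    # Standardised difference in group means: bigger means better separation.
    return abs(statistics.mean(bact) - statistics.mean(viral)) / combined_sd

ranked = sorted(proteins, key=discrimination_score, reverse=True)
print(ranked[:3])  # the three most discriminating candidates
```

In this simulation the three well-separating proteins end up at the top of the ranking, mirroring how the researchers narrowed 600 candidates down to the final three.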

The researchers then used the test on blood samples from children and adults from two medical centres who were suspected of having an infection due to a fever of over 37.5°C developing within 12 days of the onset of symptoms. A control group consisted of people who were not suspected of having an infection – such as people with suspected trauma, stroke or heart attack – or healthy people.

People were excluded who had:

  • evidence of acute infection in the previous two weeks
  • immune deficiency from birth
  • immunosuppressant treatment
  • cancer
  • HIV
  • hepatitis B or C

After all usual test results were obtained, a panel of three clinicians individually reviewed the clinical notes and test results, and recorded whether each person had a bacterial infection, viral infection, no infection, or that it was unclear. The three doctors made their assessment independently and were not told what the other doctors had decided, and did not know the result of the test in development. They compared findings from this expert panel with the results of their blood test.

What were the basic results?

A total of 765 participants were diagnosed with either a viral infection, bacterial infection or no infection. Additionally, there were 98 people who did not have a clear diagnosis.

The test was good at distinguishing between viral and bacterial infections, and separating people with and without an infectious disease. The test remained robust regardless of where the infection was, such as in the lungs or the gut, or variables such as age.

Results were not clearly presented for the 98 people without a firm clinical diagnosis.

How did the researchers interpret the results?

The researchers concluded that "the accurate differential diagnosis provided by this novel combination of viral- and bacterial-induced proteins has the potential to improve management of patients with acute infections and reduce antibiotic misuse".

Conclusion

This new test shows promising results in distinguishing between viral and bacterial infections. This is important because of increasing antibiotic resistance, and the test could help doctors tailor treatment more quickly when someone is admitted with a suspected infection.

At present, distinguishing between different types of infections is complex and relies on symptoms, signs, a variety of clinical tests and clinical judgement. One of these tests is the CRP, which is used as an indicator of the severity of infection or inflammation, and is often used to monitor this over time. It is surprising that it has been used as one of the determinants in this new test, as it is considered to be a non-specific marker of inflammation or infection and increases in both viral and bacterial infections.

While the results of the study are positive, it's important to realise that the test is not ready to be used on the general population. It will need to be tested on larger groups of people to confirm its accuracy. In addition, studies will need to show that it delivers benefits to patients in the way it is hoped – for example, finding out whether using this test leads to more accurate prescribing of antibiotics, fewer antibiotics being prescribed, or a faster diagnosis of infection. Further research along these lines, including randomised controlled trials, will be required before it could be used in the clinical setting.

Although the test appeared to be good at distinguishing between viral and bacterial infections, it is unclear what results were obtained for people who did not end up with a clear diagnosis using the best existing methods. We do not know if the new test gave a result for these people or was inconclusive. This group doesn’t appear to benefit from the old or new testing methods, so will need to be explored in the next phases of research.

You can help slow down antibiotic resistance by always completing a course of prescribed antibiotics, even if you feel well before the end of the suggested course of treatment. Remember: antibiotics are not effective against colds, most sore throats and flu.

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

Rapid blood test to 'cut antibiotic use'. BBC News, March 19 2015

Links To Science

Oved K, Cohen A, Boico O, et al. A Novel Host-Proteome Signature for Distinguishing between Acute Bacterial and Viral Infections. PLOS One. Published online March 18 2015

Categories: Medical News

Damage to 'heart health' may start in childhood

Medical News - Wed, 03/18/2015 - 15:30

"Children are suffering damage to their hearts as early as 12 due to poor diets," the Mail Online reports.

A US study has found high levels of known risk factors for heart disease in children. The study has not shown the direct effect these risks have in this age group, but it has raised concerns that they may affect the heart from childhood.

The study looked at four related risk factors known to contribute towards heart disease in adults: diet, cholesterol level, blood pressure and body mass index (BMI).

Researchers surveyed 8,961 children and found that less than 1% of children aged two to 11 ate a healthy diet. Nearly a third of the children were overweight or obese.

More than 80% of the children met no more than one of the five elements considered to be part of an "ideal" diet:

  • four or five portions of fruit and vegetables a day
  • fish twice a week
  • low salt
  • low added sugar in drinks
  • regular wholegrains

While heart damage in later life was not directly assessed in this study, it does highlight the need for further health promotion strategies. Heart disease has now overtaken cancer as the leading cause of death in developed countries.

While it is not clear to what extent these US findings relate to the UK population, the UK is in the midst of its own obesity epidemic. The latest figures suggest that the UK is now the "fat man of (Western) Europe", with one in four British adults now obese.

 

Where did the story come from?

The study was carried out by researchers from Northwestern University in Chicago, the University of North Carolina at Chapel Hill and the University of Colorado School of Medicine. No external funding was reported.

The study was published in the peer-reviewed medical journal Circulation: Cardiovascular Quality and Outcomes.

On the whole, the Mail Online reported the story well, but there were some inaccuracies. The headline stating that damage to the heart starts before the age of 12 was not confirmed by the study: while increased cholesterol, BMI, poor diet and high blood pressure in this age group are likely to be bad for the heart, the study did not directly check for any damage to the heart. The Mail did point out some of the limitations of the study – specifically, that adult dietary recommendations were used for children without adjusting for the amount of exercise they take.

 

What kind of research was this?

This was a cross-sectional study, which measured how common risk factors for cardiovascular disease are in childhood.

Cardiovascular diseases (which affect the heart and blood vessels) are the leading cause of death globally. There are several known risk factors for cardiovascular disease, which are smoking, high blood pressure, obesity, high cholesterol, diabetes, low levels of physical activity and poor diet. Previous research has shown that managing these risk factors from adolescence is associated with reduced risk of cardiovascular disease.

This research aimed to provide a national reference point for these risk factors in children under the age of 12 in the US. This will help researchers to assess how successful future strategies are in tackling childhood obesity, by looking for changes in these measures over time.

 

What did the research involve?

The researchers used information from a large US study called The National Health and Nutrition Examination Survey (NHANES). These surveys collect data from adults and children across the US every two years, using a home interview and a health examination.

This study looked at four cardiovascular risk factors for 8,961 children who had participated between 2003 and 2010. These were:

  • diet
  • cholesterol level
  • blood pressure
  • BMI

Dietary intake was assessed by two interviews with the child’s carer (parent or guardian) and recorded dietary intake over the previous 24 hours. An "ideal diet" was considered to meet the following five criteria:

  • 4.5 or more cups of fruit and vegetables per day
  • two or more servings of fish per week
  • three or more servings of wholegrains per day
  • less than 1.5 grams of salt per day
  • less than 450 calories of added sugar in drinks per week

This broadly matches current UK recommendations for a healthy diet for children.

Children were then classified into three groups, according to how many of these criteria they fulfilled:

  • "ideal diet" – meeting four or five criteria
  • "intermediate diet" – meeting two or three criteria
  • "poor diet" – meeting none or one of the criteria
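The grouping above amounts to a simple counting rule. As an illustration, the sketch below counts how many of the five "ideal diet" criteria a child meets and assigns a category; the thresholds mirror those listed earlier, but the field names are invented for this example.

```python
# Classify a child's diet by how many of the study's five criteria it meets.
def diet_category(criteria_met):
    if criteria_met >= 4:
        return "ideal"
    if criteria_met >= 2:
        return "intermediate"
    return "poor"

# Field names below are hypothetical; the thresholds follow the study.
def count_criteria(diet):
    return sum([
        diet["fruit_veg_cups_per_day"] >= 4.5,
        diet["fish_servings_per_week"] >= 2,
        diet["wholegrain_servings_per_day"] >= 3,
        diet["salt_grams_per_day"] < 1.5,
        diet["sugary_drink_calories_per_week"] < 450,
    ])

example = {"fruit_veg_cups_per_day": 2, "fish_servings_per_week": 0,
           "wholegrain_servings_per_day": 1, "salt_grams_per_day": 1.0,
           "sugary_drink_calories_per_week": 600}
print(diet_category(count_criteria(example)))  # → poor (only 1 criterion met)
```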

Similarly, they also classed the children’s other measurements (such as BMI, blood pressure and cholesterol) as "ideal", "intermediate" or "poor", based on standard criteria.

 

What were the basic results?

The main results were:

  • 99.9% of children did not have an ideal healthy diet, with most (over 80%) having a poor diet
  • 38% did not have an ideal cholesterol level
  • about 8% did not have ideal blood pressure
  • about 30% of children did not have an ideal BMI (were overweight or obese)

When combining the results for children aged eight to 11:

  • no children had ideal levels for all four cardiovascular health measures (diet, cholesterol, BMI and blood pressure)
  • 39% of boys and 38% of girls had three ideal measures
  • all children had ideal levels for at least one measure

 

How did the researchers interpret the results?

The researchers concluded that "with the exception of diet rated as intermediate or poor for nearly all children, the majority of children observed from ages two to 11 years had ideal CVH [cardiovascular health] for BMI, total cholesterol and blood pressure, thereby starting life with generally favourable CVH metrics". However, they are concerned about the rise in obesity and the effect this has on cardiovascular health. They say that "promoting the recommended dietary habits, physical activity as part of daily life, and arresting the growing trend of obesity are keys to achieving more favourable CVH metrics and long-term freedom from cardiovascular disease".

 

Conclusion

This large US survey has found high rates of poor diet, as well as overweight and obesity in children, some of whom also had high blood pressure and cholesterol. The data was collected over a number of years, and should be nationally representative, but may not be representative of each year individually.

Other limitations acknowledged by the researchers include the following:

  • Potential inaccuracies in parental reporting of the children’s diet over the previous 24 hours. This could be due to poor recall or being unaware of food the child consumed outside of the home.
  • An average ideal dietary intake for adults was used, rather than individual estimates of the required dietary intake per child according to their level of energy expenditure, height, weight, growth rate and age.
  • Some children participated in each of the two-yearly surveys, so their results will be included at each age group. This may have affected the results.
  • The survey did not collect data on smoking or secondhand smoke exposure, physical activity level or type 2 diabetes.

While the study did not directly assess heart damage, as might be assumed from the news coverage, it does suggest that children in the US frequently have risk factors for developing cardiovascular disease. It is not clear whether the results are representative of what might be seen in the UK, but it is known that overweight and obesity are becoming more common.

Overall, the study highlights the need for measures to encourage a healthy diet and lifestyle from an early age. Instilling healthy habits at a young age may make it more likely that those habits will persist into adulthood.


Links To The Headlines

Damage to the heart can start before the age of 12 if young people have a poor diet. Mail Online, March 18 2015

Links To Science

Ning H, Labarthe DR, Shay CM, et al. Status of Cardiovascular Health in US Children Up to 11 Years of Age - The National Health and Nutrition Examination Surveys 2003–2010. Circulation: Cardiovascular Quality and Outcomes. Published online March 17 2015

Categories: Medical News

Breastfed babies 'grow up to be brainier and richer'

Medical News - Wed, 03/18/2015 - 15:00

"Breastfed babies grow up smarter and richer, study shows," The Daily Telegraph reports. A study from Brazil that tracked participants for 30 years found a significant association between breastfeeding and higher IQ and income in later life.

This study followed almost 3,500 infants from birth to adulthood in Brazil. It found babies who were breastfed longer had higher IQs at the age of 30, as well as higher incomes. The authors say this is the first study to directly assess the impact of breastfeeding on income.

Another novel feature of the study was that the majority of the mothers were from low-income backgrounds. Studies in developed countries, such as the UK, may be skewed by the fact there is a trend for breastfeeding mothers to come from medium- to higher-income backgrounds.

The study used a good design and had a relatively high follow-up of participants (almost 60%) given how long it was. Although factors other than breastfeeding may have been influencing the results, the researchers did try to reduce their impact by making adjustments. The results for income may also not be representative of more developed countries.

While it's difficult to conclusively state that breastfeeding itself definitely directly caused all of the differences seen, overall this research supports the view that breastfeeding can potentially benefit children long-term.

Current UK advice is that exclusive breastfeeding for around the first six months of life provides a range of health benefits for babies.

Where did the story come from?

The study was carried out by researchers from the Federal University of Pelotas and the Catholic University of Pelotas in Brazil.

It was funded by the Wellcome Trust, the International Development Research Center (Canada), CNPq, FAPERGS, and the Brazilian Ministry of Health.

The study was published in the peer-reviewed medical journal Lancet Global Health on an open-access basis, so it is free to read online or download as a PDF.

The majority of the UK media provided a very balanced report of this study, noting the results and their implications, as well as the study's limitations.

What kind of research was this?

This was a prospective cohort study looking at whether breastfeeding was associated with higher IQ and income in adulthood. The short-term benefits of breastfeeding on a baby's immunity are well known.

The researchers also report that a meta-analysis of observational studies and two randomised controlled trials (RCTs), which looked at the promotion of breastfeeding or compared breast milk with formula in preterm babies, found longer-term benefits on IQ in childhood and adolescence.

There have been fewer studies looking at the effect on IQ in adults, all from developed high-income countries, but none looking at income.

Although two out of these three studies found a link with higher IQ, there is concern that this may at least in part be related to the fact that mothers of higher socioeconomic status in these countries tend to breastfeed for longer.

In the UK, women from a middle or upper class background are more likely to breastfeed than women from a working class background, so the researchers wanted to look at the link in a lower-income country (Brazil) where this pattern does not exist.

This is likely to be the best study design for assessing this question, as a randomised controlled trial allocating babies to be breastfed or not would be unethical.

As with all observational studies, the main limitation is that factors other than the one of interest (breastfeeding in this case) could be having an impact on the results, such as socioeconomic status.

Researchers can reduce the impact of these factors (confounders) by using statistical methods to take them into account in their analyses.

In this study, they also chose to analyse a population where a major confounder was thought to have less impact. There may still be some residual effect of these or other unmeasured factors, however.

What did the research involve?

The researchers recruited 5,914 babies born in 1982 in Pelotas, Brazil, and their mothers, and recorded whether the babies were breastfed or not. They then followed them up and assessed their IQ, educational achievements and income as 30-year-olds in 2013.

The researchers invited all mothers of babies born in five maternity hospitals in Pelotas in 1982 and who lived in the city to take part in their study, and almost all agreed.

When the babies were infants (19 months or 3.5 years old) the researchers recorded how long they were breastfed and whether they were mainly breastfed (that is, without foods other than breast milk, teas or water).

Researchers who did not know about the participants' breastfeeding history assessed their IQ using a standard test when they reached about 30 years of age. They also recorded the highest level of education participants had reached and their income in the previous month.

The researchers then compared outcomes in those who were breastfed longer against those who were breastfed for a shorter period of time or not at all.

They took into account a large range of potential confounders assessed around the time of the baby's birth (such as maternal smoking in pregnancy, family income, and baby's gestational age at birth) and during infancy (household assets). 

What were the basic results?

The researchers were able to follow up and analyse data for 59% (3,493 individuals) of the participants they recruited.

About a fifth of babies (21%) were breastfed for less than a month, about half (49%) were breastfed for between one and six months, and the rest (about 30%) for longer than this. Most babies were mainly breastfed for up to four months, with only 12% mainly breastfed for four months or longer.

Longer duration of any breastfeeding or mainly being breastfed was associated with higher levels of education, adult IQ and income.

For example, compared with those who were breastfed for less than one month, those who had received any breastfeeding for a year or longer had:

  • IQ scores 3.76 points higher on average (95% confidence interval [CI] 2.20 to 5.33)
  • 0.91 more years of education on average (95% CI 0.42 to 1.40)
  • a higher than average monthly income (95% CI 93.8 to 588.3) – this was equivalent to around an extra 30% of the average income in Brazil

The researchers carried out a statistical analysis that suggested the difference seen in income with longer breastfeeding was largely a result of differences in IQ.

How did the researchers interpret the results?

The researchers concluded that, "Breastfeeding is associated with improved performance in intelligence tests 30 years later, and might have an important effect in real life by increasing educational attainment and income in adulthood."

Conclusion

This large long-term study found an association between being breastfed for longer and subsequent educational attainment, IQ and income at the age of 30 in participants from Brazil.

The authors say this is the first study to directly assess the impact of breastfeeding on income. The study used a good design and had a relatively high follow-up of participants (almost 60%) given its duration.

However, there are some points to note:

  • As with all observational studies, factors other than breastfeeding may have been influencing the results. The researchers did try to reduce their impact by making statistical adjustments, but some residual impact may remain.
  • There was less awareness of the benefits of breastfeeding in Brazil when the study started, so less association with socioeconomic status and education was expected. However, researchers did find that women who had the least, as well as the most, education and those with a higher family income tended to breastfeed more, although the differences tended to be small (less than 10% difference in frequency of breastfeeding at six months).
  • The results for IQ support those seen in higher-income countries, but there have not been any direct assessments of the effect of breastfeeding on income in these countries so far, and these may differ from lower-income countries.

While it's difficult to conclusively state that breastfeeding itself definitely directly caused all of the differences seen in this study, this research supports the belief that breastfeeding potentially has long-term benefits.

Breastfeeding is known to bring health benefits, and current UK advice is that this can be achieved through exclusive breastfeeding for around the first six months of life.

However, as experts noted on the BBC News website, breastfeeding is only one of many factors that can contribute to a child's outcomes, and not all mothers are able to breastfeed.

For more advice on breastfeeding, visit the NHS Choices Pregnancy and baby guide.


Links To The Headlines

Breast-fed babies grow up smarter and richer, study shows. The Daily Telegraph, March 18 2015

The longer babies breastfeed, the more they achieve in life – major study. The Guardian, March 18 2015

Breastfeeding 'linked to higher IQ'. BBC News, March 18 2015

Longer breastfeeding 'boosts IQ'. Daily Mail, March 18 2015

Breastfeeding 'linked to success and higher IQ'. The Independent, March 18 2015

Breastfed babies earn more as adults. The Times, March 18 2015

Links To Science

Victora CG, Horta BL, de Mola CL, et al. Association between breastfeeding and intelligence, educational attainment, and income at 30 years of age: a prospective birth cohort study from Brazil. The Lancet Global Health. Published online March 18 2015

Categories: Medical News

Obese people 'underestimate how much sugar they eat'

Medical News - Tue, 03/17/2015 - 14:30

"Obese people are 'in denial' about the amount of sugar they eat," the Mail Online reports. Researchers looking into the link between sugar consumption and obesity found a "huge gap" between overweight people's self-reported sugar consumption and the reality, according to the news story.

Researchers assessed the self-reported sugar consumption (based on food diaries) and sugar levels in urine samples in about 1,700 people in Norfolk. After three years, they had their body mass index (BMI) measured.

The researchers found those whose urine test suggested they actually consumed the most sugar were more likely to be overweight after three years compared with those who consumed the least. However, the opposite was true for self-reported sugar intake.

The specific role of sugar (rather than calorie intake as a whole) in obesity is unclear, and previous studies have had inconsistent results.

One limitation of this study is that the spot-check urinary sugar test may not be representative of sugar intake over the whole study period. Also, the results may be affected by factors not taken into account by the analyses.

Although the news story focuses on the suggestion that overweight people are "in denial" about what they eat, this study itself did not attempt to explain the discrepancy between diet diaries and urine sugar measurements.

Overall, the main conclusion of this study is that more objective measures, rather than subjective diet-based records, may help future studies to better disentangle the effects of sugar on outcomes such as being overweight. 

Where did the story come from?

The study was carried out by researchers from the universities of Reading and Cambridge in the UK and Arizona State University in the US.

It was funded by the World Cancer Research Fund, Cancer Research UK, and the Medical Research Council.

The study was published in the peer-reviewed medical journal Public Health Nutrition. It is published on an open-access basis, so is free to download.

The Mail focuses on the suggestion that overweight people are "in denial" about what they eat. But this study did not assess why the discrepancies between diet diaries and urine sugar measurements exist. It also does not question some potential problems with the urine tests, which could undermine the results.

What kind of research was this?

This was a prospective cohort study, part of the European Prospective Investigation into Cancer and Nutrition (EPIC), a long-running investigation. It aimed to see whether people who ate more sugar were more likely to be overweight using two different ways of measuring sugar intake.

Observational studies assessing whether total sugar intake is linked to obesity have had conflicting findings. Such studies usually ask people to report what they eat using food frequency questionnaires or a food diary, and then use this information to calculate sugar intake.

However, there is concern that people under-report their food intake. Therefore, the researchers in this study used both food diaries and an objective measure (the level of sugar in urine) to assess sugar intake. They wanted to see if there was any difference in results with the two approaches.

The main limitation of observational studies such as this is that it is difficult to prove that a single factor, such as a particular type of food, directly causes an outcome such as being overweight. This is because other differences between people may be affecting the results.

However, it would not be ethical to expose people to potentially unhealthy diets in a long-term randomised controlled trial, so this type of observational study is the best practical way of assessing the link between diet and weight.

What did the research involve?

Researchers recruited adults aged 39 to 79 in Norfolk in the UK. They took measurements including their body mass index (BMI), lifestyle information, and tested their urine for sugar levels. Participants were also asked to record their diet over seven days.

Three years later, the participants were invited back and measured again for BMI and waist circumference. Researchers looked for links between people's sugar levels as shown in urine samples, the amount of sugar they reported eating based on their diet records, and whether they were overweight at this three-year assessment.

The entire EPIC study included more than 70,000 people, but researchers took a single urine sample from around 6,000 people as a "spot check" biomarker on sugar levels.

These single spot check samples measured recent sugar intake, and may be a less reliable measure of overall sugar intake than the more expensive and difficult test of collecting urine over a 24-hour period for analysis.

Almost 2,500 people did not come back for the second health check, and 1,367 people's urine tests were either not possible to analyse or the results were outside the standard range and so discarded.

This means only 1,734 of the original sample could be included in the final analysis. Because the people finally included were not randomly selected, it's possible that their results are not representative of all the people in the study.

The researchers ranked both the urine sugar results and sugar based on the dietary record results into five groups, from lowest to highest sugar intake. The specific sugar they were assessing was sucrose, found in normal table sugar.

For the analyses of people's self-reported sugar intake based on dietary record, the researchers took into account how many calories each person ate so this did not affect the analysis.

They then looked at how well the two types of sugar consumption measurement compared, and how likely people at the five different levels of sugar consumption were to be overweight or obese after three years, based on their BMI and waist circumference.

What were the basic results?

Results showed a striking difference between the urine sugar measurements and the sugar intake based on the diet diaries.

People who had the highest levels of sugar in their urine were more likely to be overweight after three years than those with the lowest levels.

The reverse was true when researchers looked at the people whose diet diaries suggested they ate the most sugar relative to their overall calorie intake compared with the least.

Using the urine sugar measurement, 71% of people with the highest concentration were overweight three years later, compared to 58% of people with the lowest concentration.

This meant that having the highest urinary levels of sugar was associated with a 54% increase in the odds of being overweight or obese after three years (odds ratio [OR] 1.54, 95% confidence interval [CI] 1.12 to 2.12).

Using people's seven-day diet diaries, 61% of people who said they ate the most sugar relative to their overall calorie intake were overweight, compared to 73% of people who said they ate the least sugar.

This meant those who reported the highest sugar intake relative to their overall calorie intake were 44% less likely to be overweight or obese after three years (OR 0.56, 95% CI 0.40 to 0.77).
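The arithmetic behind an odds ratio can be sketched directly from the percentages quoted above. Note these are crude odds ratios computed from the group proportions alone; the study's published figures (1.54 and 0.56) were adjusted for other factors, so the crude values differ slightly.

```python
# Crude odds ratio from the proportion overweight in two groups.
def odds(p):
    return p / (1 - p)

def odds_ratio(p_exposed, p_unexposed):
    return odds(p_exposed) / odds(p_unexposed)

# Urine biomarker: 71% overweight in the highest-sugar group vs 58% in the lowest.
print(round(odds_ratio(0.71, 0.58), 2))  # crude OR ≈ 1.77

# Diet diaries: 61% overweight among highest self-reported sugar vs 73% among lowest.
print(round(odds_ratio(0.61, 0.73), 2))  # crude OR ≈ 0.58
```

The direction of each crude ratio matches the reported results: above 1 for the urine measure (higher sugar, higher odds of being overweight) and below 1 for self-reported intake.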

How did the researchers interpret the results?

The researchers conclude that, "Sucrose measured by objective biomarker, but not self-reported sucrose intake, is positively associated with BMI."

They say there are "several possible reasons" for the discrepancies between the methods used to assess sugar intake. They admit the spot check urinary sugar marker may have disadvantages, but conclude that under-reporting of foods with high sugar content, particularly among those who are overweight or obese, may be a contributing factor.

As a result, they say future researchers looking at sugar as part of diet should consider using an "objective biomarker" such as urinary sugar, rather than relying on people's own estimates of what they have consumed.

Conclusion

This study has found conflicting associations between an objective measure of sugar intake and a subjective measure of sugar intake based on food diaries, and the risk of a person becoming overweight.

While more sugar in urine samples was associated with a greater risk of becoming overweight, consuming more sugar (based on food diary records) was actually associated with a reduced risk.

If the urine biomarker is a more accurate reflection of sugar consumed than diet diaries, then this research may explain why some previous diet studies have failed to show a link between sugar and being overweight.

However, there are some limitations to consider with the urine biomarker. Because the test used was a one-off snapshot of sugar intake, it can only show us how much sugar was in the person's urine at the time they were tested. Similar to a short-term food diary, we don't know whether that is representative of their sugar consumption over time.

The urine test is also not able to measure very high or very low sugar levels. The analyses of urine sugar levels did not adjust for overall calorie intake, while those for self-reported sugar intake did. It would have been interesting to see whether the association between urinary sugar levels and being overweight remained once calorie intake was taken into account.

The current study did not assess why the dietary records and urinary measures of sugar differed. It also did not assess whether the discrepancies were larger among people who were overweight or obese at the start of the study – only how these measures were related to the outcomes at the end.

So it is not possible to say from this study alone that people who were overweight or obese had greater discrepancies between what they reported eating and their urinary sugar measurements.

However, the authors report that other studies have shown overweight people, especially women, are prone to under-reporting diet, particularly between-meal snacks.

As with all observational studies, it is difficult to rule out that factors other than those being assessed might be having an effect on the results. The researchers adjusted their analyses for age and gender, and say that results "did not change materially" after they adjusted the figures to take account of people's physical activity levels.

The results do not appear to have been adjusted to take account of other factors, such as people's level of education, income or other components of their diet, which may have an effect on weight.

The effect of sugar on health, independent of calorie intake, is still being debated by health organisations. If the findings of the current study are correct, using objective measures of sugar intake could help assess its effect on obesity and more widely on health. 

 

Analysis by Bazian. Edited by NHS Choices. Follow Behind the Headlines on Twitter. Join the Healthy Evidence forum.

Links To The Headlines

Obese people are 'in denial over how much sugar they eat': Huge gap exists between how much fat people think they consume and the reality, landmark study warns. Mail Online, March 16 2015

Links To Science

Kuhnlea GC, Tasevska N, Lentjes MA, et al. Association between sucrose intake and risk of overweight and obesity in a prospective sub-cohort of the European Prospective Investigation into Cancer in Norfolk. Public Health Nutrition. Published online February 2015

Categories: Medical News

Could epilepsy drug help treat Alzheimer's disease?

Medical News - Tue, 03/17/2015 - 13:37

A drug commonly used to treat epilepsy could help "slow down" the progress of Alzheimer's disease, reports The Daily Express. According to the news story, the drug levetiracetam was shown to "help restore brain function and memory". 

The story is based on a study analysing the short-term effect of the drug in 54 people with mild cognitive impairment (MCI). This is where people have problems with their memory and are at an increased risk of developing dementia, including Alzheimer's disease.

Dementia is a common condition that affects about 800,000 people in the UK. Most types of dementia cannot be cured.

Researchers found that people with MCI showed overactivity in one part of the brain during a memory test involving image recognition.

This overactivity was reduced, and performance on the test was better, when participants had been taking 125mg of levetiracetam twice a day for two weeks, compared with when they had taken inactive "dummy" capsules.

This study was small, short-term and showed improvement on a single memory test. It is not possible to say from this study whether continuing to take the drug would reduce a person's chances of developing dementia.

Larger and longer-term trials would be needed to assess this. For now, levetiracetam remains a prescription-only medication that is only licensed for the treatment of epilepsy.

Where did the story come from?

The study was carried out by researchers from the Johns Hopkins University, and was funded by the US National Institutes of Health. It was published in the peer-reviewed medical journal NeuroImage: Clinical.

The Daily Express' headline, "Epilepsy drug found to slow down slide into Alzheimer's", overstates the findings of this study. It did not assess whether the drug affected a person's risk of Alzheimer's disease.

The study actually focused on how the drug affected short-term performance on one memory test in people with a specific type of MCI.

The news story also refers to "younger victims", but it is not clear what this means – the participants in this study were, on average, aged in their 70s.

What kind of research was this?

The main part of this study was a crossover randomised controlled trial looking at the effect of the anti-epileptic drug levetiracetam on brain function in people with amnestic mild cognitive impairment (aMCI). This type of study design is suitable if testing a drug or intervention that does not have lasting effects. 

The researchers report that previous studies have suggested people with aMCI have more activity in one part of one area of the brain (the dentate gyrus/CA3 region of the hippocampus) during certain memory tasks relating to recognising patterns.

Levetiracetam had been shown to reduce activity in these areas in animal research, so the researchers wanted to test whether low doses could reduce this excess activity and improve performance in memory tests in people with aMCI.

MCI is a decline in cognitive abilities (such as memory and thinking) that is greater than normal, but not severe enough to be classed as dementia. aMCI mainly affects a person's memory. A person with MCI is at an increased risk of developing dementia, including Alzheimer's disease.

What did the research involve?

The researchers recruited 69 people with aMCI and 24 controls (people of similar ages who did not have the condition). They gave levetiracetam to the people with aMCI and then tested their cognitive ability and monitored their brain activity with a brain scan (MRI).

They then repeated these tests with identical-looking dummy pills (placebo) and compared the results. They also compared the results with those of the controls taking the dummy pills.

All participants completed standard cognitive tests, such as the mini-mental status exam and other verbal and memory tests, as well as brain scans, at the start of the study.

Those with aMCI had to meet specific criteria – such as impaired memory, but without problems carrying out their daily activities – but not meet criteria for dementia. The control participants were tested to make sure they did not have MCI or dementia.

People with aMCI were randomly allocated to have either the levetiracetam test first and then the placebo test four weeks later, or the other way around. This aims to make sure that the order in which the tests were carried out does not affect the outcomes of the study.

In each test, participants took the capsules twice a day for two weeks before doing the cognitive test while having a brain scan. The researchers used three different doses of levetiracetam in their study (62.5mg, 125mg or 250mg, twice a day).

The cognitive test, called the "three-judgement memory task", involved being shown pictures of common objects, such as a frying pan, beach ball, or a piece of luggage, one after the other.

Some of the pictures in the sequence were identical, some were similar but not identical (for example, different coloured beach balls), and most were unique pictures with no similar pictures shown.

The participants were asked whether each picture was new, identical to the one they had seen before, or similar to the one they had seen before. During the test, their brains were scanned using MRI to see which parts of the brain were active.

The researchers were able to analyse data from 54 people with aMCI and 17 controls, as some people dropped out of the study or did not have useable data – for example, if they moved too much while the brain scans were being taken.

What were the basic results?

After taking a placebo, people with aMCI tended to incorrectly identify more items as identical to ones they had seen before than control participants on the three-judgement memory task.

They identified fewer items as being similar to ones shown before compared with the control participants. This suggested people with aMCI were not as good at discriminating between items that were just similar to ones they had seen before and those that were identical.

When people with aMCI had been taking 62.5mg or 125mg of levetiracetam twice a day, they performed better on the three-judgement memory task than when they took placebo.

They correctly identified more items as being similar, made fewer incorrect identifications, and performed similarly to the controls. The highest dose of levetiracetam (250mg twice a day) did not improve test performance in people with aMCI.

Brain scans showed that when people with aMCI who had been taking placebo recognised identical items, they showed more activity in one area within a part of the brain called the hippocampus than controls recognising a match.

Taking 125mg of levetiracetam twice a day reduced this activity compared with placebo, but the lower and higher doses of levetiracetam did not.

The researchers say levetiracetam did not affect the performance of people with aMCI on standard neuropsychological tests. Results on these tests were not reported in detail.

How did the researchers interpret the results?

The researchers concluded that people with aMCI have overactivity of the dentate gyrus/CA3 region of the hippocampus during an image recognition memory task. Low doses of the epilepsy drug levetiracetam reduced this activity and improved performance on the tasks.

Conclusion

This small-scale study found that low doses of the epilepsy drug levetiracetam improved performance on an image recognition task for people with aMCI. This condition causes memory problems, and people who have it are at an increased risk of developing dementia.

While the news reporting has focused on the potential for levetiracetam to slow the onset of dementia, this is not something the research has assessed or focused on.

It instead focused on the short-term impact of the drug on a single test of memory, plus brain activity. There was reported to be no impact on other neuropsychological tests, which appeared to include other memory tests.

It's also important to note that the effect of taking the drug for two weeks was not lasting. It is not possible to say from this study whether continuing to take the drug would reduce a person's chances of developing dementia. Larger and longer-term trials would be needed to assess this. 

The researchers noted that they only looked at very specific brain areas, and this will not capture wider changes in brain networks.

Testing an existing drug that already has approval for treating another condition means we already know it is safe enough for use in humans. This means human trials can get started more quickly than if a completely new drug were being tested.

However, the benefits and risks still need to be weighed up for each new condition a drug is used for.

For now, levetiracetam remains a prescription-only medication that is only licensed for the treatment of epilepsy.

Analysis by Bazian. Edited by NHS Choices.

Links To The Headlines

Epilepsy drug found to slow down slide into Alzheimer's, study finds. Daily Express, March 14 2015

Links To Science

Bakker A, et al. Response of the medial temporal lobe network in amnestic mild cognitive impairment to therapeutic intervention assessed by fMRI and memory task performance. NeuroImage: Clinical. Published February 21 2015

Categories: Medical News