Most of the UK media is covering the announcement made in Parliament by Jeremy Hunt, Secretary of State for Health, about proposed changes to social care.
The two confirmed points to have garnered the most media attention in the run-up to the announcement are:
- a ‘cost cap’ of £75,000 of care costs – after this point, the state would step in to meet them
- raising the current means-testing threshold for people to be eligible for state-funded social care from £23,250 to £123,000
The government expects these changes will lead to fewer people having to sell their homes in order to pay for their long-term care needs.
Speaking in Parliament, Mr Hunt said that the current system was ‘desperately unfair’ as many older people face ‘limitless, often ruinous’ costs. The minister stated that he wants the country to be ‘one of the best places in the world to grow old’.
What is social care?
The term social care covers a range of services provided to help vulnerable people improve their quality of life and assist them with their day-to-day living.
People often requiring social care include:
- people with chronic (long-term) diseases
- people with disability
- the elderly – particularly those with age-related conditions, such as dementia
Social care services can include:
- help in your home or in a care home
- community support and activities
- day centres
How does the current adult social care system work?
Currently, state funding for social care is based on two criteria:
- means – people with assets of more than £23,250 do not qualify for funding
- needs – most local authorities will only fund care for people assessed to have substantial or critical needs
The majority of people currently requiring social care pay for it privately. These are known as ‘self-funders’.
What prompted these reforms to adult social care?
Put simply, on average, the UK population is getting older.
When the welfare state was created in the early 20th century, it was not expected that people would someday routinely live into their 70s, 80s, and even 90s.
The increase in life expectancy is a good thing; however, it brings a new set of challenges.
While people are living longer, they are also spending more of their lives in ill health. Older people are more likely to have potentially complex care needs that can be expensive to manage.
Many people are currently ineligible for state-funded social care under the existing laws. To meet the costs of these care needs, these ‘self-funders’ have, in many cases, had to sell or remortgage their home, or sell other assets to pay for the costs of their care.
Without reforms, experts agree that the cost of social care for both the state (through taxes) and to ‘self-funders’ is likely to become increasingly problematic.
To try and find the best way to resolve some of the difficulties of fairly funding adult social care, the Department of Health set up a commission. This independent commission reported its findings to ministers in July 2011. The government considered these findings in its white paper on care and support published in July 2012, and in the drafting of the proposed new legislation.
What happens next?
The government has introduced a Social Care Bill, which will need to be passed by both Houses of Parliament.
If the bill is passed, the changes are expected to come into force by 2017.
Links To The Headlines
Social care: Jeremy Hunt hails 'fully-funded solution'. BBC News, February 11 2013
Social care reforms: Almost 2 million pensioners will be denied state help. The Daily Telegraph, February 11 2013
Social care reform: how your family may be affected. The Daily Telegraph, February 11 2013
Dilnot 'regrets' decision to set social care cap at £75,000. The Guardian, February 11 2013
Hunt statement on adult social care cap: Politics live blog. The Guardian, February 11 2013
"Patients with aggressive skin cancer have been treated successfully using a drug based on the herpes virus," The Guardian reports. A new study suggests a novel form of immunotherapy could be effective for treating some cases of advanced skin cancer.
This was a large trial examining the use of a new immune treatment called talimogene laherparepvec (T-VEC) for advanced melanoma (the most serious type of skin cancer) that could not be removed surgically.
T-VEC is a modified derivative of the herpes virus that causes cold sores. It is injected directly into the tumour and causes the production of a chemical called granulocyte-macrophage colony-stimulating factor (GM-CSF), which stimulates an immune response to fight the cancer.
T-VEC injections were compared with injections of GM-CSF only, which is sometimes used to treat people who have poor immunity caused by cancer treatment.
The trial found, overall, significantly more people responded to treatment for more than six months with T-VEC (16.3%) than with GM-CSF injections (2.1%).
It also improved overall survival, but this only just reached statistical significance, meaning we can have less confidence in this effect. Average survival was 23.3 months with T-VEC, compared with 18.9 months with GM-CSF.
While these results are encouraging, media claims of a cure for advanced melanoma are misguided. Further research is needed to see how T-VEC compares with existing treatments. It is also not known whether the treatment would work for other types of cancer.
Where did the story come from?
The study was carried out by a large collaboration of researchers from institutions in North America, including the University of Utah and the Cancer Institute of New Jersey.
It was funded by Amgen, the developers of the technology. The individual researchers report many affiliations with pharmaceutical companies, including Amgen.
The study was published in the peer-reviewed Journal of Clinical Oncology.
The quality of the reporting of this study is somewhat patchy. For example, The Guardian's statement that, "Patients with aggressive skin cancer have been treated successfully using a drug based on the herpes virus" needs to be set in the correct context.
The study showed only about one in five people given the treatment responded positively to it, so it will not work for everyone.
The Daily Express's claims of a cure are also not supported by the results of this study.
What kind of research was this?
This was a randomised controlled trial (RCT) investigating treating melanoma with an injectable form of immune therapy.
The immune therapy under investigation is called T-VEC. It is a genetically engineered derivative of the herpes simplex virus type 1 (HSV-1), which causes cold sores.
The derivative is designed to selectively replicate within tumours and produce granulocyte-macrophage colony-stimulating factor (GM-CSF). GM-CSF is an important chemical produced during the natural immune response.
It recruits other white blood cells to fight infection or abnormal cells. Injecting a treatment that produces GM-CSF within a tumour should, in theory, boost the immune response to fight the tumour.
This study looked at whether injecting T-VEC directly into melanoma resulted in a better response compared with an injection of GM-CSF. GM-CSF injections are given under the skin, rather than directly into a tumour.
In normal medical practice, GM-CSF injections are used in the treatment of low white blood cell count (for example, in people receiving chemotherapy) to combat reduced immune system function.
What did the research involve?
This was an international multicentre trial conducted in 64 different locations across North America, the UK and South Africa.
It included 436 adults (average age 63-64) with advanced melanoma that was not suitable for treatment by surgical removal, but could be directly injected with a treatment. People were randomised to receive either T-VEC injections into the tumour or GM-CSF injections under the skin.
T-VEC was given as a first dose, another three weeks later, then once every two weeks. GM-CSF was given once daily for 14 days in 28-day cycles.
Treatment was continued for at least 24 weeks regardless of disease progression; after 24 weeks, it was continued until disease progression, lack of response, remission or intolerable side effects. At one year, people with stable or responsive disease could continue for a further six months.
The main outcome was disease response rate, defined as complete or partial response that started within the first 12 months and lasted continuously for at least six months. Response was measured through clinical assessment of the visible tumour and body scans.
Other outcomes included overall survival from the time of randomisation, best overall response, and duration of response.
Participants knew which treatment they were receiving, but assessors who examined the outcomes did not know. Analyses were by intention to treat (by the randomised treatment regardless of completion).
What were the basic results?
Average duration of treatment was 23 weeks for T-VEC and 10 weeks for GM-CSF, and average follow-up time from randomisation to final analysis was just under two years.
Disease response rate was significantly better in people given T-VEC (16.3%) compared with those given GM-CSF (2.1%). This was an almost nine-fold increased odds of response (odds ratio [OR] 8.9, 95% confidence interval [CI] 2.7 to 29.2).
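As a rough check on these figures, the crude odds ratio implied by the two response percentages can be computed directly. This is a back-of-envelope sketch, not the study's adjusted analysis, but it lands close to the adjusted odds ratio of 8.9:

```python
def odds_ratio(p_treatment, p_control):
    """Crude odds ratio from two response proportions."""
    odds_treatment = p_treatment / (1 - p_treatment)
    odds_control = p_control / (1 - p_control)
    return odds_treatment / odds_control

# Durable response rates reported in the trial
crude_or = odds_ratio(0.163, 0.021)  # roughly 9.1, near the adjusted OR of 8.9
```

The small gap between the crude and reported values reflects the adjustments the researchers made in their analysis.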
Among those who responded, average time to response was 4.1 months in the T-VEC group and 3.7 months in the GM-CSF group. Average time to treatment failure was significantly longer in the T-VEC group (8.2 months) than in the GM-CSF group (2.9 months).
Average survival was 23.3 months with T-VEC, compared with 18.9 months with GM-CSF. Overall, this was a borderline significant reduction in risk of death, which included the possibility there was no difference (HR 0.79, 95% CI 0.62 to 1.00).
The most common side effect using T-VEC was fever, which affected around half of those treated. This compared with less than 10% of those treated with GM-CSF.
Fatigue affected half of those treated with T-VEC, compared with just over a third of the GM-CSF group. Cellulitis was the only more serious side effect occurring in a larger proportion of the T-VEC group.
How did the researchers interpret the results?
The researchers concluded that T-VEC is the first oncolytic immune therapy to demonstrate benefit against melanoma in a phase 3 clinical trial.
They say it gave a significantly higher disease response rate and a borderline significant improvement in overall survival, making this a "novel potential therapy for patients with metastatic melanoma".
Conclusion
This randomised controlled trial has demonstrated the effectiveness of a novel injectable immune treatment for advanced melanoma that cannot be surgically removed.
The trial has various strengths, including its large sample size, analysis by intention to treat, and blinding of assessors to treatment assignment, which should have reduced the risk of bias.
It demonstrated that, overall, significantly more people responded to treatment with T-VEC than GM-CSF injections. It also improved survival by an average of 4.4 months, but this only just reached statistical significance, meaning we can have less confidence in this effect.
There are several points to bear in mind, however:
- T-VEC boosts GM-CSF production within the tumour to enhance the immune response, and was therefore compared with GM-CSF injections. However, GM-CSF is not used as a treatment for advanced melanoma. Ideally, the treatment would need to be compared with treatments for advanced melanoma that are currently available – for example, chemotherapy, radiotherapy, and particularly other immune therapies, such as the antibody treatment ipilimumab.
- The treatment has not been shown to "cure" melanoma. Most of the people in this study passed away during the two years of follow-up, but the people receiving T-VEC generally lived slightly longer.
- The treatment is a genetically engineered derivative of the herpes simplex type 1 virus, which is not the same as being infected with herpes simplex. People should not interpret the headlines as meaning that getting cold sores offers any protection against melanoma or other types of cancer.
- It is not known whether this treatment only has potential for treating advanced melanoma, or whether it could also be used for other types of cancer.
Overall, the results of this trial into a potential new immune treatment for advanced melanoma are promising, but more research will be needed.
As with most conditions, prevention is more effective than cure when it comes to melanoma. Avoid overexposure to the sun and artificial sources of ultraviolet light, such as sunbeds, to reduce your risk of skin cancer.
Read more about protecting your skin from the sun
Links To The Headlines
Virotherapy: skin cancer successfully treated with herpes-based drug. The Guardian, May 27 2015
Cold sore virus 'treats skin cancer'. BBC News, May 26 2015
Genetically engineered virus 'cures' patients of skin cancer. The Daily Telegraph, May 27 2015
Herpes Virus Offers Hope Over Skin Cancer. Sky News, May 27 2015
World first as scientists use cold sore virus to attack cancer cells. The Independent, May 27 2015
Why HERPES virus could hold key to finding a skin cancer cure. Daily Express, May 27 2015
Links To Science
Andtbacka RHI, Kaufman HL, Collichio F, et al. Talimogene Laherparepvec Improves Durable Response Rate in Patients With Advanced Melanoma. Journal of Clinical Oncology. Published online May 26 2015
"Women who take the latest generations of contraceptive pills are at a greater risk of potentially lethal blood clots," The Times reports. While the increase in risk is statistically significant, it is very small in terms of individual risk.
The combined oral contraceptive pill, commonly referred to as "the pill", is already well known to be linked to increased risk of blood clots in the veins, such as deep vein thrombosis (DVT), as we discussed back in 2014.
A new study, using two large GP databases, set out to refine the assessment of the risk. It identified women who had had a venous blood clot, matched them by age to unaffected women, and examined use of the pill in the previous year.
Overall, it found that use of any contraceptive pill almost tripled the risk of blood clots, though the baseline risk is small. Risk was generally higher with the newer third generation pills than with older pills. Encouragingly, risk was lowest for pills containing levonorgestrel, which is by far the most commonly prescribed. This pill carried a risk of around six extra cases of blood clots for every 10,000 women prescribed it per year.
Risk was more than double this for pills containing desogestrel, gestodene, drospirenone and cyproterone, though these are not normally first-choice pills in practice, and are usually used when there are other symptoms to treat, such as acne.
The combined oral contraceptive pill remains a safe and effective form of contraception for most women, but it is not suitable for all – such as women with a history of heart disease or high blood pressure. Read more about who can, and who shouldn't, use the combined oral contraceptive pill.
Where did the story come from?
The study was carried out by researchers from the Division of Primary Care at the University of Nottingham. It received no external sources of funding. The study was published in the peer-reviewed British Medical Journal as an open-access article. This means it can be read online or downloaded by anyone for free.
The reporting of the study by the UK media was accurate and refreshingly took steps to put the small increase in risk in context.
What kind of research was this?
This was a case-control study of women identified through two general practice databases in the UK. The researchers were aiming to look at the link between use of the combined oral contraceptive pill ("the pill") and risk of blood clots in the veins (e.g. deep vein thrombosis, or DVT), specifically taking into account the type of progestogen in the pill.
Use of the pill is already well known to be associated with increased risk of blood clots in the veins (venous thromboembolism). Different types of pill combine different types of the hormone progestogen with another hormone called oestrogen. It is recognised that the different progestogens have differing influence on the risk of blood clots, though previous studies have not been able to quantify the risks of the different pills, particularly newer ones.
This case-control study investigated this by looking at women diagnosed with a blood clot, matching them to unaffected women and then looking at type of pill used.
What did the research involve?
The study used two large GP databases, QResearch and Clinical Practice Research Datalink (CPRD), both of which have previously been used to look into links between different drugs and blood clot risk. QResearch covers 618 general practices in the UK, and CPRD covers 722.
Researchers identified women aged 15-49 years registered between 2001 and 2013 who had a first instance of a venous blood clot. They matched these "cases" with up to five unaffected age-matched "controls" from the same database. They excluded women who were pregnant around the time, who had had a hysterectomy or sterilisation, or who had a history of using blood-thinning medicines – suggesting a history of, or susceptibility to, blood clots.
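The case-control matching described above can be sketched as a toy example. This is an illustration of the general idea only, with invented names and ages, and a much simpler procedure than the study's actual matching:

```python
import random

def match_controls(cases, pool, k=5):
    """For each case, draw up to k unaffected controls of the same
    age from the pool, without reusing any control (toy sketch)."""
    matched = {}
    available = list(pool)
    for case_id, age in cases.items():
        same_age = [c for c in available if c[1] == age]
        picked = random.sample(same_age, min(k, len(same_age)))
        for control in picked:
            available.remove(control)
        matched[case_id] = picked
    return matched

# Invented data: cases map id -> age; the pool is (id, age) pairs
cases = {"case1": 32, "case2": 32}
pool = [(f"ctrl{i}", 32) for i in range(8)] + [("ctrl9", 45)]
m = match_controls(cases, pool)  # case1 gets 5 controls, case2 the remaining 3
```

In a real nested case-control study the matching is done within the same database and calendar period, but the principle is the same: each case is compared only with controls of the same age.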
Use of the combined oral contraceptive pill was examined in the year prior to the blood clot record. They included all the most commonly used preparations in the UK, containing the different types of progestogen. They also included the combination of oestrogen with cyproterone acetate (brand name Dianette), which acts as a contraceptive pill, but its main indication is for the treatment of acne. They looked at when the pill had been used in relation to the time of the clot (e.g. current or past use) and how long it had been used for.
They took into account potential confounding factors that may influence clot risk, including:
- chronic medical conditions (e.g. cancer, heart or lung disease, arthritis or inflammatory conditions)
- recent immobilisation such as through trauma, surgery or hospital admission
- smoking and alcohol
- polycystic ovary syndrome (associated with pill use and increased risk of clots)
- social deprivation
What were the basic results?
After exclusions they had a sample of 5,500 cases and 22,396 controls in the QResearch database, and 5,062 cases and 19,638 matched controls in the CPRD database. The incidence of venous blood clots in the two databases was around six per 10,000 women per year. Just over half (58%) the blood clots in the two databases were DVTs.
In the two databases 28-30% of cases had used the pill in the past year, compared with 16-18% of controls. Overall, any pill use in the past year was associated with an almost tripled risk of venous blood clot compared to no use (adjusted odds ratio (OR) 2.97, 95% confidence interval (CI) 2.78 to 3.17).
The most common pill was one containing the progestogen levonorgestrel, which accounted for roughly half of prescriptions in cases and controls.
By type of progestogen the researchers found the following to be associated with lower risk:
- levonorgestrel (OR 2.38, 95% CI 2.18 to 2.59)
- norethisterone (OR 2.56, 95% CI 2.15 to 3.06)
- norgestimate (OR 2.53, 95% CI 2.17 to 2.96)
The following were associated with higher risks:
- desogestrel (OR 4.28, 95% CI 3.66 to 5.01)
- gestodene (OR 3.64, 95% CI 3.00 to 4.43)
- drospirenone (OR 4.12, 95% CI 3.43 to 4.96)
- cyproterone acetate (OR 4.27, 95% CI 3.57 to 5.11)
Pills are sometimes grouped into "generations" according to when they were developed. The second list contains the newer "third generation" pills, while the first mostly contains earlier generations; the exception is norgestimate in the first list, which is also third generation.
The number of extra cases of venous blood clot per year was lowest for levonorgestrel and norgestimate (both six extra per 10,000 women) and highest for desogestrel and cyproterone (both 14 extra per 10,000 women).
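Because venous blood clots are rare, the odds ratios above approximate relative risks, so the "extra cases" figures follow from a background rate in non-users. The sketch below assumes a background rate of about 4.3 cases per 10,000 women per year – a figure inferred by working backwards from the numbers reported, not one given in the study:

```python
def extra_cases_per_10000(baseline_per_10000, odds_ratio):
    """Extra annual cases per 10,000 women, treating the odds ratio
    as a relative risk (a reasonable shortcut for a rare outcome)."""
    return baseline_per_10000 * (odds_ratio - 1)

BASELINE = 4.3  # assumed non-user rate per 10,000 women per year (inferred)

extra_levonorgestrel = extra_cases_per_10000(BASELINE, 2.38)  # about 6
extra_desogestrel = extra_cases_per_10000(BASELINE, 4.28)     # about 14
```

This is why a roughly doubled odds ratio translates into roughly double the number of extra cases, even though the absolute numbers stay small.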
How did the researchers interpret the results?
The researchers conclude: "In these population-based, case-control studies using two large primary care databases, risks of venous thromboembolism associated with combined oral contraceptives were, with the exception of norgestimate, higher for newer drug preparations than for second generation drugs."
It is already well known that the combined oral contraceptive pill ("the pill") is associated with increased risk of venous blood clots. It is also already known that risk can differ according to the type of progestogen in the pill. This study adds further evidence helping to quantify these risks.
The study has numerous strengths. It has used two large GP databases covering large samples of the UK population, and containing reliable information on medical diagnoses and prescriptions made. The analyses were also adjusted for various confounders known to be associated with risk of blood clots.
It demonstrates pill use in the previous year almost tripled risk of venous blood clot, with risk generally higher with the newer pills than the older ones – though there were some exceptions.
Encouragingly, preparations containing levonorgestrel – which is by far the most common pill prescribed – had the lowest associated risk; around six extra cases of blood clot for every 10,000 women prescribed.
The preparations associated with the highest risks in this study – desogestrel, gestodene, drospirenone and cyproterone – were already recognised to be linked to higher risk, though this study has helped to better quantify these risks. These are not usually pill preparations of first choice in practice and are normally used when there are specific indications (e.g. women who have acne, particularly those taking cyproterone), or who have had side effects with other preparations.
The Medicines and Healthcare products Regulatory Agency (MHRA), the organisation in charge of regulating medicines in the UK, quantified the risks of the combined pill last year and came up with very similar results.
That review said the benefits of the pill far outweigh the risks, but added: "Prescribers and women should be aware of the major risk factors for thromboembolism, and of the key signs and symptoms."
This study again highlights the need for careful prescribing of the combined oral contraceptive pill, taking into account the individual woman's risk factors, such as lifestyle and medical history. Women should also be aware of the signs and symptoms of venous blood clots, such as DVT. If a woman taking the pill experiences unexplained swelling or pain in the leg, or sudden breathlessness and/or chest pain, she should seek medical help immediately.
The combined pill may not be suitable for you if you have a history of certain chronic diseases, such as heart disease, diabetes or high blood pressure. Alternative methods of contraception, such as the contraceptive implant, may be a more suitable option.
Your GP should be able to advise you on the safest method for your individual circumstances.
Links To The Headlines
New generation of Pill raises blood clot risk. The Times, May 27 2015
Newer contraceptive pills raise risk of blood clot four fold. The Daily Telegraph, May 26 2015
Newer contraceptive pills linked to 'higher blood clot risk'. ITV News, May 26 2015
Women on newer types of contraceptive pill 'double their risk of getting blood clot'. Daily Mirror, May 27 2015
Links To Science
Vinogradova Y, Coupland C, Hippisley-Cox J. Use of combined oral contraceptives and risk of venous thromboembolism: nested case-control studies using the QResearch and CPRD databases. BMJ. Published online May 26 2015
"Teenage boys who become very obese may double their risk of getting bowel cancer by the time they are in their 50s," The Guardian reports. A Swedish study found a strong association between teenage obesity and bowel cancer risk in later adulthood.
The study involved over 230,000 Swedish males, who were conscripted into the military aged 16 to 20 years old. Those who were in the upper ranges of overweight and those who were obese at that time were about twice as likely to develop bowel cancer over the next 35 years as those who were a normal weight.
This study has a number of strengths, including its size, the fact that body mass index (BMI) was objectively measured by a nurse and that the national cancer registry in Sweden captures virtually all cancer diagnoses. However, it was not able to take into account the boys' diets or smoking habits – both of which affect bowel cancer risk.
Obesity in adulthood is already known to be a risk factor for bowel cancer, so it seems plausible that being obese from an early age also increases risk. Maintaining a healthy weight at all ages will have a range of health benefits, such as reducing your risk of developing conditions including heart disease and type 2 diabetes, as well as a number of cancers.
Where did the story come from?
The study was carried out by researchers from Harvard School of Public Health and other research centres in the US, Sweden and the UK.
The study and researchers were funded by the National Cancer Institute, Harvard School of Public Health, Örebro University and the UK Economic and Social Research Council (ESRC).
The study was published in the peer-reviewed medical journal Gut.
The UK media covered this study reasonably well, but did not discuss any of its limitations.
What kind of research was this?
This was a cohort study looking at whether there was a link between body mass index (BMI) and inflammation in adolescence, and risk of colorectal (bowel) cancer later in life.
Being obese and having long-lasting (chronic) signs of inflammation in the body as an adult have both been linked to increased bowel cancer risk. However, few studies have assessed the effect of obesity in adolescence specifically, and none is reported to have looked at the impact of inflammation in adolescence.
This type of study is the best way to look at the link between a possible risk factor and an outcome, as people cannot be randomly assigned to have, for example, higher or lower body mass index (BMI) or inflammation.
However, as people are not randomly allocated, it does mean that a group of people with an exposure are likely to differ in other ways from those without that exposure.
It is difficult to disentangle the effects of each of these differences, but researchers can try to single out the effect of the factors they are interested in if they have enough information about the differences between the groups.
What did the research involve?
The researchers used data on BMI and inflammation collected from a very large group of Swedish adolescents and young men taking part in compulsory military service.
They used a national cancer registry to identify any of these men who later developed bowel cancer. They then analysed whether those who had higher BMIs or inflammation as youths were at greater risk.
The researchers analysed data from 239,658 men aged between 16 and 20 years old. These men had medical examinations when they were enlisted in compulsory military service between 1969 and 1976.
The marker (or sign) of inflammation the researchers had information on was the erythrocyte (red blood cell) sedimentation rate, or ESR. This measurement increases when there is inflammation.
Sweden has a national registry recording cancer cases diagnosed in the country, and researchers used this to identify men in the study who developed cancer from their enlistment up to January 2010. This gave an average of 35 years of follow-up for the men.
The researchers analysed whether BMI or signs of inflammation in late adolescence were linked to later risk of bowel cancer. They took into account confounding factors measured at the time of conscription that might affect results, including:
- household crowding
- health status
- blood pressure
- muscle strength
- physical working capacity
- cognitive function
The researchers identified 885 cases of bowel cancer.
Compared with those with a healthy weight BMI (from 18.5 to less than 25), those who were:
- underweight (BMI less than 18.5) or at the lower end of the overweight category (BMI 25 to less than 27.5) did not differ in their risk of bowel cancer
- at the upper end of the overweight category (BMI 27.5 to less than 30) had about twice the risk of developing bowel cancer during follow-up (hazard ratio [HR] 2.08, 95% confidence interval [CI] 1.40 to 3.07)
- obese (BMI 30 or more) were also more than twice as likely to develop bowel cancer during follow-up (HR 2.38, 95% CI 1.51 to 3.76)
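The BMI bands used in the comparisons above can be written as a small helper. BMI is weight in kilograms divided by the square of height in metres; the category labels here paraphrase the study's cut-offs:

```python
def bmi(weight_kg, height_m):
    """Body mass index: weight (kg) divided by height (m) squared."""
    return weight_kg / height_m ** 2

def bmi_category(value):
    """Classify a BMI value using the cut-offs quoted in the study."""
    if value < 18.5:
        return "underweight"
    if value < 25:
        return "healthy"
    if value < 27.5:
        return "lower overweight"
    if value < 30:
        return "upper overweight"
    return "obese"

category = bmi_category(bmi(90, 1.78))  # BMI about 28.4: "upper overweight"
```

Note that the split of "overweight" into lower and upper bands at 27.5 is specific to this analysis; standard adult classifications treat 25 to 30 as a single overweight band.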
Adolescents with "high" levels of inflammation were more likely to develop bowel cancer than those with "low" levels (HR 1.63, 95% CI 1.08 to 2.45).
However, when those who developed bowel cancer or inflammatory bowel disease (Crohn's disease or ulcerative colitis) in the first 10 years of follow-up were excluded, this link was no longer statistically significant.
This suggested that the link with inflammation might at least in part be due to some men with high levels of inflammation already being in the early stages of inflammatory bowel disease, which is itself linked to higher bowel cancer risk.
How did the researchers interpret the results?
The researchers concluded that, "late-adolescent BMI and inflammation, as measured by ESR, may be independently associated with future CRC [colorectal cancer] risk".
Conclusion
This large cohort study found obesity in adolescence is linked to later colorectal cancer risk in men.
The very large size of this study is its main strength, along with the fact that BMI was objectively measured by a nurse, and that the national cancer registry in Sweden is estimated to record virtually all cancer cases.
As with all studies, there are limitations. For example, the study:
- only had information on BMI at one time point, and could not tell whether the men maintained their BMIs or not
- did not have information on diet or smoking, and these are known to impact bowel cancer risk
- only analysed one marker for inflammation – results may differ for other markers
- findings may not apply to women
Obesity in adulthood is already known to be a risk factor for bowel cancer, so it seems plausible that being obese from an early age also increases risk.
Research suggests that you can help lower your risk of bowel cancer by:
- cutting down on your consumption of red meat (no more than 70g a day) and processed meat
- eating lots of fibre-rich foods such as fruits and vegetables
- quitting smoking if you smoke
- sticking within recommended alcohol consumption levels
- taking regular exercise
Also, adults can take part in the NHS Bowel Screening Programme offered at specific ages (age 55 for one form of screening, and ages 60 to 74 for another).
Links To The Headlines
Obese teenage boys could have higher risk of bowel cancer, study says. The Guardian, May 25 2015
Obesity in adolescence linked to bowel cancer risk, says study. BBC News, May 26 2015
'Bowel cancer risk' for obese teens. Mail Online, May 26 2015
Links To Science
Kantor ED, Udumyan R, Signorello LB, et al. Adolescent body mass index and erythrocyte sedimentation rate in relation to colorectal cancer risk. Gut. Published online May 18 2015
"Living near a main road causes people to gain weight with the risk of obesity," is the slightly dubious claim in The Daily Telegraph. While a Swedish study did find an association between noise pollution and obesity, cause and effect has not been proved.
The study involved more than 5,000 adults. It looked at the traffic noise exposure where participants lived and whether they were obese according to measurements such as their body mass index (BMI) or waist circumference. The researchers also looked at exposure to road, rail and aircraft noise.
Researchers found people with more exposure to traffic noise from any of the sources had greater waist circumferences. The more sources of traffic noise a person was exposed to, the more likely they were to be obese around the waist. However, there was no link between traffic noise exposure and being obese based on BMI measurement.
Because this study measured noise exposure and obesity at around the same time, it's not possible to say whether noise could contribute towards causing obesity. While the researchers did try to take into account factors (confounders) such as people's lifestyles and socioeconomic status, these factors could still be influencing the results.
The study was carried out by researchers from the Karolinska Institute in Sweden and other research centres in Sweden and Norway.
It was funded by the Swedish Research Council for Health, Working Life and Welfare, the Swedish Heart and Lung Foundation, Stockholm County Council, the Swedish Research Council, the Swedish Diabetes Association, Novo Nordisk Scandinavia and GlaxoSmithKline.
The Daily Telegraph, along with the Daily Mirror and the Daily Express, overstates what can be concluded based on the findings of this study. For example, the first sentence in the Telegraph's story states that traffic noise "causes people to gain weight".
We cannot say for certain whether this is the case, or whether people were already obese before they were exposed to road noise. We also can't say that moving to less urban environments would help people lose weight, as the paper suggests.
It also says at one point that: "Living under a flight path doubled the rate of obesity."
To its credit, however, a balanced comment from an expert was included at the end of the article noting that: "It's definitely too soon to be able to blame your increasing waistline on traffic noise!".
Other UK newspapers, such as The Guardian and The Independent, were more reserved, explaining that a causal relationship has not been proven.
What kind of research was this?
This cross-sectional study looked at whether exposure to traffic noise was linked to obesity. Some studies have suggested that this is the case. The suggestion is that this may relate to noise exposure increasing stress hormones such as cortisol, or disrupting sleep.
Other studies have also suggested that traffic noise may be linked to cardiovascular disease, and a link with obesity might be one way this might occur.
But the evidence so far is limited, and studies have not looked at whether the different types of traffic noise (road, rail or aeroplane) show differing associations with obesity.
What did the research involve?
The researchers studied 5,075 adults in suburban and semi-rural areas of Stockholm County. They assessed the participants' exposure to noise from road traffic, railways and aircraft at their homes, and took various measurements of the participants' fatness, such as their weight and waist circumference. They then analysed whether there was a relationship between these factors.
The participants were taking part in the Stockholm Diabetes Prevention Program, which looked at risk factors for type 2 diabetes. About half were selected to participate because of a family history of type 2 diabetes, but none had the condition at the start of the study.
The assessments for the current study took place when participants were followed up between 2002 and 2006, when they were aged between 43 and 66 years. Participants filled out questionnaires on their lifestyles and health, and had a medical examination by trained nurses.
The researchers obtained information on where the participants lived since 1991 from various national sources. They combined this information with maps of road traffic noise exposure from the local regions to assess exposure, and also calculated exposure to railway noise and aircraft noise based on distance from rail lines or Stockholm's Arlanda airport flight paths. The average exposure between 1997 and 2002 for each participant was estimated, taking into account if they moved house.
The researchers analysed whether there were links between the different forms of traffic noise (road, rail or aeroplane) and measures such as BMI, waist circumference and waist to hip ratio. Individuals were considered to have "central obesity" if they had a:
- waist circumference of 88cm or above for women and 102cm or above for men
- waist to hip ratio of 0.85 or above for women and 0.90 or above for men
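The study's two "central obesity" definitions can be sketched as simple threshold checks. This is a minimal illustration only: the function names are invented, and the thresholds are research definitions, not clinical guidance.

```python
# Illustrative sketch of the study's "central obesity" definitions.
# Function names are invented for illustration; not clinical guidance.

def central_obesity_by_waist(waist_cm, sex):
    """Waist circumference threshold: 88cm (women), 102cm (men)."""
    return waist_cm >= (88 if sex == "female" else 102)

def central_obesity_by_whr(waist_cm, hip_cm, sex):
    """Waist to hip ratio threshold: 0.85 (women), 0.90 (men)."""
    return waist_cm / hip_cm >= (0.85 if sex == "female" else 0.90)

print(central_obesity_by_waist(90, "female"))   # True  (90cm >= 88cm)
print(central_obesity_by_whr(90, 105, "male"))  # False (0.857 < 0.90)
```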
In their analyses, the researchers took into account confounders such as the participants':
- physical activity
- dietary habits
- self-reported noise sensitivity
- self-reported annoyance with road traffic noise
- road traffic air pollution
- socioeconomic status (based on household income)
The researchers found that:
- 62% of participants were exposed to road traffic noise of 45 decibels (dB) or higher – 45dB is just a bit louder than a bird call
- 22% of participants were exposed to airplane traffic noise of 45dB or higher
- 5% of participants were exposed to rail traffic noise of 45dB or higher
- 30% of participants were classified as having no exposure to traffic noise of 45dB or higher
Fewer people were obese based on BMI measurement (19% of men and 17% of women) than based on waist circumference (23% of men and 36% of women) or waist to hip ratio (63% of men and 50% of women).
All forms of traffic noise were linked to waist circumference – every 5dB increase in exposure was associated with a:
- 0.21cm increase in waist circumference for road traffic noise
- 0.46cm increase in waist circumference for rail traffic noise
- 0.99cm increase in waist circumference for aircraft traffic noise
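As a rough illustration, the per-5dB figures above can be scaled linearly to other exposure differences. This is a sketch only: the linear scaling is an assumption, and these are cross-sectional associations, not causal predictions.

```python
# Illustrative scaling of the reported per-5dB waist circumference
# associations. Linear scaling is an assumption; these are associations
# from a cross-sectional study, not causal predictions.

cm_per_5db = {"road": 0.21, "rail": 0.46, "aircraft": 0.99}

def associated_waist_difference(source, extra_db):
    """Waist difference (cm) associated with extra_db more decibels."""
    return cm_per_5db[source] * (extra_db / 5)

print(f"{associated_waist_difference('aircraft', 10):.2f} cm")  # 1.98 cm
print(f"{associated_waist_difference('road', 10):.2f} cm")      # 0.42 cm
```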
Road and aircraft traffic noise were linked to waist to hip ratio, but rail traffic noise was not. None of the traffic noise sources was linked with BMI.
The odds of having central obesity based on waist circumference and waist to hip ratio were significantly higher in those exposed to any traffic noise source of 45dB or higher, and the odds increased with the number of traffic noise sources participants were exposed to.
For example, exposure to all three traffic noise sources was associated with almost twice the odds of central obesity based on waist circumference (odds ratio [OR] 1.95, 95% confidence interval [CI] 1.24 to 3.05).
Obesity based on BMI measurement was not significantly associated with any traffic noise source of 45dB or higher.
How did the researchers interpret the results?
The researchers concluded that their results "suggest that traffic noise exposure can increase the risk of central obesity" and that "combined exposure to different sources of traffic noise may convey a particularly high risk".
Conclusion
This cross-sectional study found a link between traffic noise exposure from cars, railways or aircraft and obesity around the waist (central obesity – having a bigger belly), but not obesity defined by a high BMI (30 or over).
The main limitation of this research is that, as it is cross-sectional, it cannot determine whether the exposure to the traffic noise came before the central obesity. Therefore, we cannot say the traffic noise definitely causes the obesity.
Factors other than traffic noise (confounders) may be contributing to the link seen. The researchers did try to take a number of these factors into account, but their impact may not be removed completely.
For example, where a person lives is likely to be strongly linked to their socioeconomic status, and this is in turn likely to be linked to a range of lifestyle behaviours. Likewise, areas with high levels of noise pollution tend to be located in the poorer parts of towns and cities, and poverty is known to be associated with a higher risk of obesity. Disentangling these factors to identify the exact impact of each is very difficult.
The estimation of traffic noise exposure was based on the person's place of residence, but did not take into account whether they had noise-reducing measures such as double or triple glazing. It also did not assess noise exposure from other sources – for example, at work.
The way some of the results were expressed (as odds ratios) can make the differences sound greater than they are when you look at the actual groups: "twice the odds" of being obese does not necessarily translate into twice as many people being obese when you look at the actual numbers. Adjusting for other factors does help to remove their effects, but can add to this impression.
So, while 33% of women who were exposed to less than 45dB of road traffic noise had central obesity based on their waist circumference, 36% of those experiencing 45-55dB fell into this category and 39% of those experiencing more than 55dB. These are increases, but are not as drastic as the "doubling" figure might suggest.
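The distinction can be seen by running the quoted proportions for women and road traffic noise (33% at under 45dB, 39% at over 55dB) through both calculations. This is an unadjusted sketch using only the figures above, so it will not reproduce the study's adjusted odds ratios.

```python
# Sketch: why "twice the odds" does not mean twice as many people.
# Uses the article's illustrative proportions (unadjusted), not the
# study's adjusted odds ratios.

def odds(p):
    """Convert a proportion to odds, e.g. 0.33 -> 0.33/0.67."""
    return p / (1 - p)

p_low, p_high = 0.33, 0.39   # central obesity at <45dB vs >55dB

odds_ratio = odds(p_high) / odds(p_low)   # how ORs are reported
risk_ratio = p_high / p_low               # ratio of actual proportions

print(f"Odds ratio: {odds_ratio:.2f}")    # ~1.30
print(f"Risk ratio: {risk_ratio:.2f}")    # ~1.18
```

Because odds grow faster than proportions as a condition becomes common, odds ratios always look at least as dramatic as the underlying proportions.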
While the study suggests a link likely to warrant further investigation, we cannot as yet say for certain that noise pollution causes obesity.
You can take other steps to reduce your waist circumference if it is moving into the danger zone (94cm or more for men, 80cm or more for women). The NHS Choices Weight loss plan uses a combination of healthy diet choices and exercise to get your belly back to a healthier size.
Links To The Headlines
Why living on a main road could be making you fatter. The Daily Telegraph, May 25 2015
Living near a busy road or airport can give you a pot belly claim scientists in Sweden. Daily Mirror, May 25 2015
Living near noisy traffic 'could make you fat'. ITV News, May 26 2015
Traffic noise could increase risk of fat around midriff, says Swedish study. The Guardian, May 25 2015
Stress caused by traffic noise 'can increase risk of obesity', study claims. The Independent, May 25 2015
LIVING with transport noise can make you fat, scientists claim. Daily Express, May 26 2015
Links To Science
Pyko A, Eriksson C, Oftedal B, et al. Exposure to traffic noise and markers of obesity. Occupational and Environmental Medicine. Published online May 25 2015
"The amount of alcohol people in England drink has been underestimated by the equivalent of 12 million bottles of wine a week," BBC News reports.
It has long been known there is a big gap between the amount people say they drink in national surveys, like the Health Survey for England, and the amount of alcohol known to be sold in England.
In this new survey, researchers started from the assumption that while people may accurately report their standard week-to-week drinking patterns, they may forget the drinking they do on special occasions, such as bank holidays, parties, weddings, wakes or big sporting events (which, for many England fans, is akin to a wake).
The study used a large phone interview to estimate the amount of extra drinking going on during these types of occasions. They found this accounted for an extra 12 million bottles of wine a week in England – just under a staggering eight and a half million litres, which is more than enough to fill three Olympic-size swimming pools.
The results seem plausible. As the scientists point out: "The impact of atypical and special occasion drinking is reflected in evening presentations to emergency units, which peak on weekends but also sports events, bank holidays, and even commemorative occasions such as Halloween."
Where did the story come from?
The study was carried out by UK researchers from Cardiff University, Bangor University, Liverpool John Moores University, and the London School of Hygiene and Tropical Medicine. It was funded by Alcohol Research UK.
The study was published in the peer-reviewed medical journal BMC Medicine. This is an open-access journal, so the study is free to read online or download as a PDF.
The UK media reported the story accurately.
What kind of research was this?
This was a cross-sectional survey aiming to provide a more accurate picture of how much alcohol people in England drink.
The researchers say there is a big gap between the amount people report drinking in national surveys and the amount of alcohol being sold in England. So are we a nation of liars in denial about our drinking habits?
Rather than fibbing, the researchers suspected people might simply be asked the wrong type of questions in alcohol surveys. You are usually asked about your average alcohol consumption, say, over a week. People might not think to include special events, such as drinking at a wedding or a birthday party, in this estimate because they are not typical.
The scientists designed a large telephone interview study to see whether special occasion drinking might make up the shortfall between estimates of typical drinking and alcohol sales.
What did the research involve?
The team conducted a large-scale telephone survey between May 2013 and April 2014 of people aged 16 years or over living in England.
Respondents (n = 6,085) provided information on typical drinking (amounts per day, drinking frequency) and changes in consumption associated with routine atypical days (e.g. Friday nights) and special drinking periods (e.g. holidays) and events (e.g. weddings).
The team acknowledged it did not collect a representative sample of alcohol consumers and abstainers on a national basis, but instead used national population estimates and stratified drinking survey data to weight responses to match the English population.
The analysis looked to identify additional alcohol consumption associated with atypical or special occasion drinking by age, sex and typical drinking level.
What were the basic results?
Accounting for atypical and special occasion drinking added more than 120 million units of alcohol per week (equivalent to 12 million bottles of wine) to population alcohol consumption in England.
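As a rough check, the units-to-bottles conversion works out if we assume a standard 750ml bottle at about 13.5% ABV. These are assumed values not stated in the article; one UK unit is 10ml of pure alcohol.

```python
# Rough check that "more than 120 million units" matches "12 million
# bottles of wine a week". Bottle size (750ml) and strength (13.5% ABV)
# are assumptions; one UK unit = 10ml of pure alcohol.

bottle_ml = 750
abv = 0.135
units_per_bottle = bottle_ml * abv / 10   # ~10.1 units per bottle

bottles = 120e6 / units_per_bottle
print(f"Units per bottle: {units_per_bottle:.1f}")
print(f"Bottles per week: {bottles / 1e6:.1f} million")  # ~11.9 million
```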
The greatest impact was seen among 25- to 34-year-olds with the highest typical consumption, where atypical or special occasions added approximately 18 units a week (144g) for both sexes.
Those reporting the lowest typical consumption (≤1 unit/week) showed large relative increases in consumption (209.3%) with most drinking associated with special occasions.
In some demographics, adjusting for special occasions resulted in overall reductions in annual consumption – for example, women aged 65 to 74 years in the highest typical drinking category.
The Health Survey for England, a nationally representative survey, estimates alcohol consumption only accounted for 63.2% of sales. The new survey, including the special occasion drinking, accounted for 78.5%.
How did the researchers interpret the results?
The research team concluded: "Typical drinking alone can be a poor proxy for actual alcohol consumption. Accounting for atypical/special occasion drinking fills 41.6% of the gap between surveyed consumption and national sales in England."
From a public health perspective they said: "These additional units are inevitably linked to increases in lifetime risk of alcohol-related disease and injury, particularly as special occasions often constitute heavy drinking episodes.
"Better population measures of celebratory, festival and holiday drinking are required in national surveys in order to adequately measure both alcohol consumption and the health harms associated with special occasion drinking."
Conclusion
This large telephone survey sought to generate a more accurate estimate of England's alcohol consumption by taking account of atypical drinking days like Friday nights, holidays and events such as weddings.
It found atypical and special occasion drinking added more than 120 million units of alcohol a week (about 12 million bottles of wine) to population alcohol consumption in England.
This accounted for some of the discrepancy between self-reported alcohol consumption and alcohol sales, but not all. The Health Survey for England, a nationally representative survey, estimates alcohol consumption only accounts for 63.2% of sales. The new survey improved this to 78.5%.
This raises the question: where is the other 21.5% going? There are many potential explanations. One is that people are pretty bad at estimating how much they drink, and generally underestimate it, for whatever reason, when asked.
An alternative, rather worrying, explanation is that a significant portion could be consumed by under-16s, who were excluded from the survey. And there could be people who just can't help downplaying the amount they drink, whether consciously or unconsciously, even to strangers on the telephone.
The research team highlighted a number of limitations of its own research. First, the survey did not attempt to generate a representative sample of alcohol consumers and abstainers on a national basis.
The scientists say their survey acts as a proof of concept, and a larger nationally representative survey is needed to test the usefulness of this methodology as a national alcohol monitoring tool. For example, participation rates were quite low (just 23.3% of those contacted) and the sample had more women, older people and people of white ethnicity than is true for England as a whole.
The estimates also might be imprecise. For example, the team didn't know if special drinking events were instead of or as well as the normal drinking days. In their analysis, they opted for a conservative measure by removing an average drinking day's consumption for each special event day reported.
The results make sense. As the scientists point out: "The impact of atypical and special occasion drinking is reflected in evening presentations to emergency units, which peak on weekends but also sports events, bank holidays, and even commemorative occasions such as Halloween."
Links To The Headlines
English drink 12 million bottles of wine a week more than estimated. BBC News, May 22 2015
Forgotten holidays and lost birthdays leave English drinking underestimated. The Guardian, May 22 2015
Drinkers in England consuming 12 million more bottles of wine a week than previously thought. The Independent, May 22 2015
English alcohol consumption 'hugely' underestimated, research suggests. The Daily Telegraph, May 22 2015
Links To Science
Bellis MA, Hughes K, Jones L, et al. Holidays, celebrations, and commiserations: measuring drinking during feasting and fasting to improve national and individual estimates of alcohol consumption. BMC Medicine. Published online May 22 2015
A sobering BBC News headline greets sun worshippers on the eve of the spring bank holiday: "More than a quarter of a middle-aged person's skin may have already made the first steps towards cancer."
Sunlight contains ultraviolet (UV) radiation. Low levels of exposure to UV light are actually beneficial to health – sunlight helps our bodies produce vitamin D.
As part of a study into skin cancer, researchers analysed skin removed from the eyelids of four people aged 55 to 73 known to have a varying history of sun exposure (but not a history of cancer) to see what DNA mutations had built up.
To their surprise they found hundreds of normal cells showing DNA mutations linked to cancer, called "mutant clones", in every 1sq cm (0.1 sq in) of skin, and there were thousands of DNA mutations per cell.
The results were based on skin cells from the eyelids of just four people, so we don't yet know if the same would be found in other skin areas, or in other people, or what proportion of the mutated cells would eventually progress to skin cancer.
Where did the story come from?
The study was carried out by researchers from The Wellcome Trust Sanger Institute in the UK, and was funded by The Wellcome Trust and the Medical Research Council.
It was published in the peer-reviewed journal, Science.
The BBC and the Daily Mail reported the story accurately and reiterated the best ways to lower your risk of getting skin cancer.
What kind of research was this?
This was a genetics study looking at changes in the DNA of normal skin cells to see what proportions were linked to cancer.
Skin cancer is one of the most common forms of cancer. There are two main types of skin cancer:
- non-melanoma skin cancer – where cancer slowly develops in the upper layers of the skin; there are more than 100,000 new cases of non-melanoma skin cancer every year in the UK
- melanoma skin cancer – a more serious type of skin cancer; there are around 13,000 new cases of melanoma diagnosed each year in the UK and 2,000 deaths
UV radiation from too much sun exposure damages the DNA of skin cells. When certain combinations of mutations accumulate, a cell can become cancerous, multiplying and growing uncontrollably.
Scientists know about lots of skin cancer mutations, but these tend to have been studied using samples of cancerous skin cells. Researchers don't know what combination of mutations is needed to transform healthy skin cells into cancer, or in what order.
Approaching the problem from a different direction, this team looked at healthy skin cells to see what mutations might be accumulating in a pre-cancerous stage.
What did the research involve?
The scientists analysed the DNA of healthy eyelid skin cells removed from four people during plastic surgery (blepharoplasty), looking for DNA mutations known to be linked to cancer. The removed eyelid skin was reported to be normal and free of any obvious damage.
The team used eyelid skin because of its relatively high levels of sun exposure and because it is one of the few body sites to have normal skin removed.
They say this procedure is performed for age-related loss of elasticity of the underlying skin, which can cause eyelid drooping sometimes severe enough to disrupt vision, although the epidermis remains otherwise normal.
The skin sample donors were three women and one man, aged 55 to 73. Two had low sun exposure, one moderate and one high. Three were of western European origin and one was of south Asian origin. It was not clear how sun exposure was assessed.
What were the basic results?
The researchers found a lot more cancer-related mutations in the normal cells than they were expecting. In all, their analysis pinpointed 3,760 mutations. The pattern of DNA mutations "closely matched" those expected for UV light exposure and that seen in skin cancers.
DNA is made up of a code of letters known as base pairs. The team estimated people have around two to six mutations per million base pairs per skin cell. This, they said, was lower than the number of mutations usually found in skin cancer, but higher than found in other solid tumours.
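This per-base-pair rate is consistent with the "thousands of DNA mutations per cell" mentioned earlier, as some rough arithmetic shows. The genome length of about 3.2 billion base pairs is an approximation not stated in the article.

```python
# Rough arithmetic behind "thousands of DNA mutations per cell".
# Assumes a human genome length of roughly 3.2 billion base pairs
# (an approximation not stated in the article).

genome_bp = 3.2e9
rate_low, rate_high = 2, 6  # mutations per million base pairs per cell

mut_low = rate_low * genome_bp / 1e6
mut_high = rate_high * genome_bp / 1e6
print(f"Estimated mutations per cell: {mut_low:,.0f} to {mut_high:,.0f}")
# roughly 6,400 to 19,200
```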
Overall, they estimated around 25% of all skin cells carried a certain type of cancer-linked mutation called NOTCH mutations. While not enough to cause cancer on their own, if other mutations accumulate on top of the NOTCH mutations, they may cause cancer in the future.
How did the researchers interpret the results?
Dr Peter Campbell, head of cancer genetics at Sanger, told the BBC News website: "The most surprising thing is just the scale; that a quarter to a third of cells had these cancerous mutations is way higher than we'd expect, but these cells are functioning normally."
He added: "It certainly changes my sun worshipping, but I don't think we should be terrified … It drives home the message that these mutations accumulate throughout life, and the best prevention is a lifetime of attention to the damage from sun exposure."
Conclusion
This study estimated around 25% of normal skin cells have DNA mutations that could prime them to develop into skin cancer in the future. This was a lot higher than the scientists expected.
The genetic analysis in the study was robust, but used skin samples from just four people. This severely limits the generalisability of the findings to the wider population. For example, the results might differ for people of different ages, sun exposures and skin colours, so we don't know whether this holds for most people.
Similarly, the researchers only used eyelid cells. There may be something unique about eyelid tissue that is linked to this higher than expected mutation rate. This may or may not be true for skin from other areas. At the moment, we don't know if the one in four estimate applies to other skin areas.
The good news is there are simple and effective ways of reducing your risk of skin cancer. The best way to prevent all types of skin cancer is to avoid overexposure to the sun and to keep an eye out for new or changing moles.
A few minutes in the sun can help maintain healthy levels of vitamin D, which is essential for healthy bones, but it's important to avoid getting sunburn. Wearing protective clothing such as sun hats, seeking shade, and wearing sun cream of at least SPF 30 are all advised.
Read more about how to enjoy the benefits of the sun without exposing your skin to damage
Links To The Headlines
Quarter of skin cells 'on road to cancer'. BBC News, May 22 2015
Skin cancer alert for the over-50s: Millions failing to heed advice over sun damage. Daily Mail, May 22 2015
Links To Science
Martincorena I, Roshan A, Gerstung M, et al. High burden and pervasive positive selection of somatic mutations in normal human skin. Science. Published online May 22 2015
"Thousands discover Calpol has been free on NHS 'for years' as mum's Facebook post goes viral," the Daily Mirror reports.
This and other similar headlines were prompted by a post made on the social networking site Facebook. In the post, it was claimed that all medicines for children were available for free on the NHS as part of the minor ailment scheme.
"I was in Boots yesterday buying Calpol and happened to complain to the cashier how expensive it is. She told me, to my amazement, that if you register your details with them under the minor ailments scheme that all medicines for children are free – a scheme that has been going for eight years."
The post went viral, being "shared" and "liked" more than 100,000 times in the space of a few days.
But there are a number of inaccuracies both in the Facebook post and in the media's reporting of the story.
What is the minor ailment scheme?
The minor ailment scheme is designed to enable people with minor health conditions to access medicines and advice they would otherwise visit their doctor for.
It allows patients to see a qualified health professional at a convenient and accessible location within their community, meaning they do not need to wait for a GP appointment or take up a valuable A&E slot with a non-urgent condition.
A range of common childhood ailments may be treated under the scheme.
If the patient being treated is exempt from paying prescription charges – because they're under 16 or over 60, for example, or they have a prescription prepayment certificate (PPC) – the medicine is free of charge.
Important points about the minor ailment scheme
There are a number of important points that have not been made clear by the media:
- The minor ailment scheme is not a national scheme. It is not possible to say exactly which medical conditions are covered because this will vary depending on the location and the particular service.
- The scheme is designed to offer medication to meet an acute need. It is not an opportunity for parents to stock up on free children's medications – if a pharmacist thinks someone is trying to abuse the system, they can refuse any request for treatment at their discretion.
- The pharmacist has no obligation to provide branded medication such as Calpol. If there is a cheaper generic version available that is known to be equally effective, it is likely that will be provided instead.
- Claims that the scheme is secretive are incorrect. Information about the minor ailment scheme has been freely available on the NHS Choices website since 2008.
Read more about the services offered by pharmacies and how they can often save you a trip to the GP.
Links To The Headlines
What's the truth about free NHS Calpol? BBC News, May 21 2015
Thousands discover Calpol has been free on NHS 'for years' as mum's Facebook post goes viral. Daily Mirror, May 20 2015
Calpol for FREE: Hundreds find they're entitled to free medication after mum's social post. Daily Express, May 20 2015
Did you know you can get Calpol for free? Metro, May 20 2015
"Paracetamol use in pregnancy may harm male foetus," The Guardian reports. Researchers found evidence that taking paracetamol for seven days may lower the amount of testosterone testicular tissue can produce – using human foetal testicular tissue grafted into mice.
Low testosterone levels in pregnancies with a male foetus have been linked to a range of conditions, from the relatively benign, such as undescended testicles, to more serious problems, such as infertility and testicular cancer.
Reassuringly, taking paracetamol for just one day did not affect testosterone levels. It seems any effect would come from continuous daily use rather than the occasional use that is probably how most people take paracetamol.
An obvious caveat is that as the series of experiments was performed in mice, it is not known what the effect would be in humans. It is also not known whether the effect of regular daily use would be reversible and over what timescale. And we also don't know whether exposure in pregnancy would actually have any detrimental effects in a male child.
Paracetamol is generally believed to be safe in pregnancy, but – as with all medicines – pregnant women should only take it if absolutely necessary, in the lowest effective dose and for the shortest period of time.
Where did the story come from?
The study was carried out by researchers from the University of Edinburgh, the Edinburgh Royal Hospital for Sick Children, and the University Department of Growth and Reproduction in Copenhagen.
It was funded by The Wellcome Trust, the British Society of Paediatric Endocrinology and Diabetes, and the UK Medical Research Council.
The study was published in the peer-reviewed journal, Science Translational Medicine.
In general, the media reported the story accurately, though many of the headline writers decided to discuss the "harms" of paracetamol – such as The Guardian's headline, "Paracetamol use in pregnancy may harm male foetus" – which is an unhelpful term.
Apart from the fact this study involved mice, not men, there is no evidence a temporary drop in testosterone levels would cause permanent harm to a male foetus. The effect could well be temporary and reversible.
The Daily Mail went particularly over the top with its claim that "the popular painkiller is believed to have lifelong effects on baby boys, raising their risk of everything from infertility to cancer". The Mail may think this is the case, but most qualified experts would question this due to lack of evidence.
What kind of research was this?
This was a laboratory study using a mouse model to look at the effect of paracetamol on testicular development. Testes produce the sex hormone testosterone.
Previous research found an association between low sex hormone exposure in the womb and reproductive disorders, such as undescended testes at birth, or low sperm count and testicular cancer in young adulthood.
The researchers wanted to investigate whether exposure to paracetamol reduces the level of testosterone production. As it would be unethical to study this in pregnant women, the researchers used a mouse model.
Paracetamol is one of the drugs believed to be safe to use in pregnancy. This safety is based on observational studies of use during human pregnancy, as randomised controlled trials – the gold standard in research – are not performed in pregnancy for ethical reasons.
What did the research involve?
The researchers grafted samples of human foetal testicular tissue into mice. In a series of laboratory experiments, they gave the mice different doses of oral paracetamol over the course of a week. The researchers measured what effect the paracetamol had on the level of testosterone produced by the testicular tissue.
Human foetal testes were obtained from pregnancies terminated in the second trimester. Small 1mm³ tissue samples of these testes were grafted under the skin of mice so the researchers could see what effect paracetamol had on their growth in an animal that has some similarities to humans.
The mice had their testicles removed so their production of testosterone would not influence the study. Their immune system was also dampened down to reduce the chances of rejecting the testicular tissue.
After a week – enough time for the testicular tissue to establish a blood supply – the mice were given injections of a hormone called human chorionic gonadotrophin (hCG), which stimulates testosterone production and is usually present in the womb. The mice were then randomly assigned to be given different strengths and regimes of oral paracetamol or placebo.
The researchers measured the level of testosterone at different time points during the study. They also measured the weight of the seminal vesicle, a gland that holds the liquid that mixes with sperm to form semen. Previous research has shown the growth of seminal vesicles is sensitive to sex hormones.
Experiments were also performed on the mice to measure the effect of paracetamol on their production of testosterone.

What were the basic results?
Testosterone levels were reduced by exposure to paracetamol for seven days. A dose of 20mg/kg three times a day for seven days, the equivalent of a normal dose in adults, resulted in:
- 45% reduction in testosterone production by the testicular tissue grafts
- 18% reduction in seminal vesicle weight
Exposure to 20mg/kg three times in one day was tested under the premise that most pregnant women would only use paracetamol for a short period of time. This did not reduce testosterone levels or cause any change in seminal vesicle weight.
High-dose paracetamol of 350mg/kg once a day for seven days did not change the level of testosterone, but it did result in a 27% reduction in seminal vesicle weight in the host mice.
Graft survival was 65% over the two-week experiment period. There was no difference in graft weight between mice exposed to any dose of paracetamol and placebo. The mice appeared healthy and had no change in body weight.

How did the researchers interpret the results?
The authors concluded that: "One week of exposure to a human-equivalent therapeutic regimen of acetaminophen [paracetamol] results in reduced testosterone production by xenografted human foetal testis tissue, whereas short-term (one day) use does not result in any long-lasting suppression of testosterone production."
They say that because the study was performed in mice, it cannot directly inform new recommendations for the use of paracetamol in pregnancy, but suggest pregnant women should consider limiting their use of the drug.

Conclusion
This was a well-designed laboratory study looking at the effect of paracetamol on testicular development. As it would be unethical to study this in pregnant women, the researchers used a mouse model. This involved grafting samples of human foetal testicular tissue under the skin of mice.
The main finding from the study was that oral paracetamol reduced testosterone production when given at a human-equivalent dose three times a day for one week. A single dose of paracetamol did not reduce testosterone production.
As the researchers say, they tested the effect of single-dose exposure as it is assumed that pregnant women are more likely to use paracetamol occasionally rather than continuously.
The study's strengths include the randomisation procedure, which meant different doses and regimes of paracetamol could be directly compared with the control condition.
However, this study has some limitations because of the nature of a mouse model. These include:
- graft testicular tissue may not respond in exactly the same way as normal testicular development in the womb
- the grafts were fragments of testis tissue – an intact testicle may act differently
- the mice were immunocompromised, which may have influenced the results
The results of this study suggest that regular paracetamol for seven days may reduce the production of testosterone by the developing testicle. However, further studies would be required to determine if this would be the case in humans.
It is also not clear whether the effect would be reversible, or over what timescale. It is completely unknown whether exposure during pregnancy would actually have any detrimental effects on the male child – for example, in terms of the development of sexual characteristics at puberty, or future fertility.
At present, the product safety information for paracetamol does not preclude its use in pregnancy. Paracetamol is the painkiller of choice during pregnancy, as alternatives such as ibuprofen, and particularly aspirin, are thought to be associated with a higher risk of complications.
Paracetamol is also excreted in breast milk, but this is not believed to be in an amount that would harm the baby. Infant versions of paracetamol such as Calpol, however, are not licensed in babies under the age of two months.
As with all medicines, pregnant women should only take them if absolutely necessary, in the lowest effective dose and for the shortest period of time. If you have a painful condition that persists for more than one to two days, ask your midwife or the doctor in charge of your care for advice.
Links To The Headlines
Paracetamol use in pregnancy may harm male foetus, study shows. The Guardian, May 20 2015
Paracetamol during pregnancy may affect male babies, study shows. The Independent, May 21 2015
Pregnant women warned paracetamol may harm unborn sons' fertility. ITV News, May 20 2015
Limit paracetamol in pregnancy, say scientists. BBC News, May 20 2015
Links To Science
van den Driesche S, Macdonald J, Anderson RA, et al. Prolonged exposure to acetaminophen reduces testosterone production by the human fetal testis in a xenograft model. Science Translational Medicine. Published online May 20 2015
"Mildly cold, drizzly days far deadlier than extreme temperatures," The Independent reports. An international study looking at weather-related deaths estimated that moderate cold killed far more people than extremely hot or cold temperatures.
Researchers gathered data on 74,225,200 deaths from 384 locations, including 10 in the UK. The results showed that, in most countries, the days with the fewest temperature-linked deaths were warmer than average.
Therefore, the researchers calculate, the majority of "excess deaths" occur on days that are colder than average. Because extreme temperatures occur on only a few days a year, they have an impact on fewer deaths than the majority of moderately cold days.
Overall, the researchers say, 7.71% of all deaths can be attributed to temperature based on their statistical modelling.
One hypothesis offered by the researchers is that exposure to mild cold may increase cardiovascular stress while also suppressing the immune system, making people more vulnerable to potentially fatal conditions.
The researchers suggest that their findings show public health officials should spend less time planning for heatwaves, and more time thinking about how to combat the effect of year-round lower than optimum temperatures.

Where did the story come from?
The study was carried out by researchers from 15 universities and institutes in 12 countries led by a team from the London School of Hygiene and Tropical Medicine.
It was funded by the UK Medical Research Council. The study was published in the peer-reviewed medical journal The Lancet and has been made available on an open-access basis, so it is free to read online or download as a PDF.
The media reports focused on the finding that moderately cold weather – such as that experienced in the UK for much of the year – caused more deaths than hot weather or extremely cold weather. The Daily Telegraph gave a good overall summary of the research.
The Independent's claim that "mildly cold, drizzly days" are "far deadlier than extreme temperatures" is an extrapolation, as the study didn't look at drizzle or rain as a risk factor, just temperature.
The Guardian includes a number of reactions from independent experts, such as Sir David Spiegelhalter's presumably tongue-in-cheek suggestion that "perhaps they are really saying that the UK climate is killing people".

What kind of research was this?
This study was a meta-analysis of data on temperatures and deaths around the world to find out what effect temperature has on the risk of death, and whether people are more likely to die during cold weather or hot weather.
The researchers used statistical modelling to estimate the proportions of deaths in the regions studied that could be attributed to heat, cold, and extreme heat and cold. This type of study can tell us about links between variables such as temperature and death rates, but not whether one causes the other.

What did the research involve?
Researchers collected data on temperature and mortality (74,225,200 deaths) from 384 locations in 13 different countries, during time periods from 1985 to 2012. They used statistical analysis to calculate the relative risk of death at different temperatures for each location.
The countries included were Australia, Brazil, Canada, China, Italy, Japan, South Korea, Spain, Sweden, Taiwan, Thailand, the UK and the US. About one-third of the locations sampled were in the US.
The researchers were not able to adjust figures to take account of the potential effects of other factors, such as income levels in the different countries, although they used air pollution data when it was available.
The researchers divided the temperature data from each location into evenly spaced percentiles, from cold to hot days, so that the coldest days fell in the lowest percentiles (1 or 2) and the hottest days in the highest (98 or 99).
They defined extreme cold for a location as below the 2.5th percentile, and extreme heat as above the 97.5th percentile. They also looked for the "optimum" temperature for each location – the temperature at which the fewest deaths attributable to temperature were recorded.
They calculated the deaths linked to temperatures above or below the optimum, and sub-divided that again to show deaths linked to extreme cold or heat.
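As a rough illustration of this percentile approach (this is not the researchers' actual code, and the temperatures are entirely made up), the extreme-temperature thresholds for a single location could be computed like this in Python:

```python
import statistics

def extreme_thresholds(daily_temps):
    """Return (cold_threshold, heat_threshold) for a location, using the
    study's definitions: extreme cold falls below the 2.5th percentile,
    extreme heat above the 97.5th."""
    # n=40 splits the data at 2.5% steps, so the first and last cut
    # points are the 2.5th and 97.5th percentiles respectively.
    cuts = statistics.quantiles(daily_temps, n=40, method="inclusive")
    return cuts[0], cuts[-1]

# A hypothetical year of daily mean temperatures in degrees C,
# purely for illustration: -5.0 up to 31.4 in 0.1-degree steps.
temps = [t / 10 for t in range(-50, 315)]
cold, hot = extreme_thresholds(temps)
# Days below `cold` or above `hot` would count as "extreme" here;
# everything in between is moderate, which is most days.
```

By construction, only around 5% of days fall outside these two thresholds, which is why extreme temperatures can carry a high relative risk yet account for few deaths overall.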
The statistical analysis used a complex new model developed by the researchers, which allowed them to take account of the different time lag of different temperatures.
The effects of very high temperatures on death rates are usually quite short-lived, while very cold temperatures may have an effect on deaths for up to four weeks.

What were the basic results?
Across all countries, colder weather was linked to more excess deaths than warmer weather – roughly 17 times as many (7.29% of deaths were linked to colder weather, compared with 0.42% to warmer weather).
For all countries, the optimum temperature – when there were fewest deaths linked to weather – was warmer than the average temperature for that location.
In the UK, for example, the average temperature recorded was 10.4C, while optimum temperature ranged from 15.9C in the north east to 19.5C in London. The optimum temperature for the UK was in the 90th centile, meaning that 9 out of 10 days in the UK are likely to be colder than the optimum.
The proportion of all deaths linked to extremely hot or cold days was much lower than that linked to less extreme hot or cold. The researchers say extreme heat or cold was responsible for 0.86% of deaths according to their statistical modelling (95% confidence interval 0.84 to 0.87).
However, the relative risk of dying at extremes of temperatures was increased, with a sharp increase in deaths at the hottest temperatures for most countries.

How did the researchers interpret the results?
The researchers say their results have "important implications" for public health planning, because planning tends to focus on how to deal with heatwaves, whereas their study shows that below-optimal temperatures have a bigger effect on the number of people who die.
They say deaths from cold weather may be attributed to stress on the cardiovascular system, leading to more heart attacks and strokes. Cold may also affect the immune response, increasing the chances of respiratory disease.
They say their results show that public health planning should be "extended and refocused" to take account of the effect of the whole range of temperature fluctuation, not just extreme heat.

Conclusion
Many of the headlines focus on the finding that moderate cold may be responsible for more deaths than extreme hot or cold weather.
Perhaps more interesting is the finding that the optimum temperature for humans seems to be well above the temperatures we usually experience, especially in colder countries like the UK. If this is true, then the finding that most deaths occur on days colder than the optimum is unsurprising, as most days are colder than the optimum temperature.
The relative unimportance of very hot or very cold days in terms of mortality is interesting, because most research and public health planning has focused on extreme weather. However, this depends partly on the definition of extreme temperature.
The researchers used the upper and lower 2.5 percentiles to decide what counted as extreme for a particular location, so by definition these temperatures are experienced on very few days. Even though the relative risk of death is increased on those days, the absolute number of deaths is nowhere near as high as on the majority of days.
That doesn't mean it's not worth planning for the increased risk of deaths during extreme temperatures. In London, for example, the relative risk of death is more than doubled on days with temperatures below 0C, compared with days at the optimum temperature of 19.5C.
There are some limitations to the study we should be aware of. First, although it sampled data from 13 countries from very different climates, it didn't include any countries in Africa or the Middle East. This means we can't be sure the findings would apply worldwide.
Second, the study did not take into account some confounders that could affect how many deaths occur in warmer or colder periods – for example, levels of air pollution, whether people have access to shelter and heating, the age make-up of a population, and whether people have access to nutritious food all year round.
This also makes it difficult to know how governments or public health bodies can make plans using this new data, as we don't know whether the effects of moderate cold on mortality could be affected by public health measures.
In the UK, the NHS already plans for more hospital admissions during the winter months, taking account of factors such as the amount of flu-like illness circulating in the population, as well as the temperature.
Read more advice about winter health.
Links To The Headlines
Mildly cold, drizzly days far deadlier than extreme temperatures for Brits, says study. The Independent, May 21 2015
Moderately cold weather 'more deadly than heatwaves or extreme cold'. The Guardian, May 21 2015
Cold weather twenty times more deadly than hot weather, study suggests. The Daily Telegraph, May 21 2015
Links To Science
Gasparrini A, Guo Y, Hashizume M, et al. Mortality risk attributable to high and low ambient temperature: a multicountry observational study. The Lancet. Published online May 20 2015
"Men with hay fever are more likely to have prostate cancer – but those with asthma are more likely to survive it," the Daily Mirror reports. Those were the puzzling and largely inconclusive findings of a new study looking at these three conditions.
Researchers looked at data involving around 50,000 middle-aged men and followed them up for 25 years, looking at whether asthma or hay fever at study start were associated with diagnoses of prostate cancer or fatal prostate cancer during follow-up.
The findings weren't as conclusive as the headline suggests. The researchers did find hay fever was associated with a small (7%) increased risk of prostate cancer development. There was some suggestion asthma may be associated with a decreased risk of getting prostate cancer or fatal prostate cancer. However, these links were only of borderline statistical significance, meaning there was a high risk they could have been the result of chance.
And the links between hay fever and fatal prostate cancer weren't significant at all, meaning there was no evidence that men with hay fever were more likely to die from the disease (so no need to worry if you are affected).
The possibility that inflammation, or the immune system more generally, could be associated with risk of prostate cancer is plausible, but this study tells us little about how different immune profiles could influence cancer risk.
Where did the story come from?
The study was carried out by researchers from Johns Hopkins Bloomberg School of Public Health and other institutions in the US. It was funded by grants from The National Cancer Institute and The National Heart, Lung and Blood Institute. The study was published in the peer-reviewed International Journal of Cancer.
The Daily Mirror has taken an uncritical view of the research findings and fails to make clear to its readers that the findings were mainly based on borderline statistically significant or non-significant results. These don't provide firm proof of links between asthma or hay fever and prostate cancer or lethal prostate cancer.
What kind of research was this?
This was a prospective cohort study looking into how the immune system might be involved in the development of prostate cancer.
The study authors say emerging research suggests inflammation, and the immune response in general, may be involved in the development of prostate cancer. As they say, one way to explore this is by looking at the links between prostate cancer and conditions that have a particular immune profile. Two such immune-mediated conditions are asthma and allergies, such as hay fever.
Previous studies looking at links between the conditions gave inconsistent results. This study looked at the link in a prospective cohort of almost 50,000 cancer-free men, looking to see whether they developed prostate cancer and the factors associated. Cohort studies such as these can demonstrate associations, but they cannot prove cause and effect as many other unmeasured factors may be involved.
What did the research involve?
The cohort was called the Health Professionals Follow-Up Study. In 1986 it enrolled 47,880 cancer-free men, then aged 40-75 years (91% white ethnicity), who were followed up for 25 years.
Every two years, men completed questionnaires on medical history and lifestyle, and filled in food questionnaires every four years.
At study enrolment they were asked whether they had ever been diagnosed with asthma, hay fever or another allergy and, if so, the year it started. In subsequent questionnaires they were asked about new asthma diagnoses and asthma medications, but hay fever was only questioned at study start.
Men reporting a diagnosis of prostate cancer on follow-up questionnaires had this confirmed through medical records. The researchers also used the National Death Index to identify cancer deaths.
The researchers looked at the associations between prostate cancer and reported asthma or hay fever, particularly looking at the link with "lethal" prostate cancer. This was defined as being prostate cancer either diagnosed at a later stage when the cancer had already spread around the body (so expected to be terminal), or being the cause of death.
They adjusted their analyses for potential confounders of:
- body mass index (BMI)
- smoking status
- physical activity
- family history of prostate cancer
What were the basic results?
Five percent of the cohort had a history of asthma at study start and 25% had hay fever. During the 25-year follow-up there were 6,294 cases of prostate cancer. Of these, 798 were expected to be lethal, including 625 recorded deaths.
After adjusting for confounders, there was a suggestion that having asthma at study start was associated with a lower risk of developing prostate cancer. We say a suggestion because the 95% confidence interval (CI) of the result included 1.00, making it of borderline statistical significance (relative risk [RR] 0.89, 95% CI 0.78 to 1.00) – meaning the finding may have been down to chance alone.
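To illustrate why an interval that touches 1.00 matters, here is a minimal sketch of how a relative risk and its 95% confidence interval can be calculated (using the standard Wald method on the log scale). The counts below are entirely hypothetical and are not the study's data:

```python
import math

def relative_risk_ci(events_exposed, n_exposed, events_unexposed, n_unexposed):
    """Relative risk with an approximate 95% Wald confidence interval,
    computed on the log scale."""
    rr = (events_exposed / n_exposed) / (events_unexposed / n_unexposed)
    se = math.sqrt(1 / events_exposed - 1 / n_exposed
                   + 1 / events_unexposed - 1 / n_unexposed)
    lower = math.exp(math.log(rr) - 1.96 * se)
    upper = math.exp(math.log(rr) + 1.96 * se)
    return rr, lower, upper

# Entirely hypothetical counts for illustration: 90 cancers among
# 1,000 men with asthma versus 100 among 1,000 men without.
rr, lower, upper = relative_risk_ci(90, 1000, 100, 1000)
# If the interval spans 1.0, the result is not statistically
# significant - the data are compatible with no effect at all.
not_significant = lower <= 1.0 <= upper
```

Here the point estimate suggests a 10% lower risk, but because the interval stretches above 1.0, chance alone could explain the result – exactly the situation with several of this study's borderline findings.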
Hay fever, by contrast, was associated with an increased risk of developing prostate cancer, which did just reach statistical significance (RR 1.07, 95% CI 1.01 to 1.13).
Looking at lethal prostate cancer, there was again a suggestion that asthma was associated with decreased risk, but this again appeared of borderline statistical significance (RR 0.67, 95% CI 0.45 to 1.00). Hay fever was not this time significantly associated with risk of lethal prostate cancer.
The researchers then looked at ever having a diagnosis of asthma, this time not only looking at the 5% already diagnosed at study start, but also the 4% who developed the condition during follow-up. Again they found that ever having a diagnosis of asthma was associated with a decreased risk of lethal prostate cancer, but this was only of borderline statistical significance (RR 0.71, 95% CI 0.51 to 1.00).
The researchers also considered the time of diagnosis. They report that onset of hay fever in the distant past (more than 30 years ago) "was possibly weakly positively associated with risk of lethal" prostate cancer. However, this link is not statistically significant (RR 1.10, 95% CI 0.92 to 1.33).
How did the researchers interpret the results?
The researchers conclude: "Men who were ever diagnosed with asthma were less likely to develop lethal and fatal prostate cancer." They add: "Our findings may lead to testable hypotheses about specific immune profiles in the [development] of lethal prostate cancer."
Conclusion

The researchers' suggestion that this research is "hypothesis generating" is the most apt. It shows a possible link between immune profiles and prostate cancer, but doesn't prove it or explain the underlying reasons for any such link.
This single study does not provide solid evidence that asthma or hay fever would have any influence on a man's risk of developing prostate cancer or dying from it, particularly when you consider the uncertain statistical significance of several of the findings.
Links suggesting asthma may be associated with a lower risk of total or lethal prostate cancer were all only of borderline statistical significance, meaning we can have less confidence that these are true links.
Links with hay fever were similarly far from convincing. Though the researchers found a 7% increased risk of developing prostate cancer with hay fever, this only just reached statistical significance (95% CI 1.01 to 1.13). The links between hay fever and risk of lethal prostate cancer that hit the headlines weren't significant at all, so they provide no evidence for a link.
Even if there is a link between asthma and allergy and prostate cancer risk, it's still possible this could be influenced by unmeasured health and lifestyle factors that have not been adjusted for.
Other limitations to this prospective cohort include its predominantly white sample, particularly given that prostate cancer is known to be more common in black African or black Caribbean men.
The results may not be applicable to these higher-risk populations. Also, though prostate cancer diagnoses were confirmed through medical records and death certificates, there is the possibility of inaccurate classification of asthma or allergic conditions, as these were self-reported.
The possibility that inflammation, or the immune system more generally, could be associated with risk of prostate cancer is definitely plausible. For example, history of inflammation of the prostate gland is recognised to be possibly associated with increased prostate cancer risk. Therefore, study into how different immune profiles could have differing cancer risk is a worthy angle of research into prostate cancer.
However, the findings of this single cohort should not be of undue concern to men with hay fever or, conversely, suggest that men with asthma have protection from the disease.
Links To The Headlines
Men with hay-fever more likely to have prostate cancer – but those with asthma more likely to survive. Daily Mirror, May 19 2015
Links To Science
Platz EA, Drake CG, Wilson KM, et al. Asthma and risk of lethal prostate cancer in the Health Professionals Follow-Up Study. International Journal of Cancer. Published online February 27 2015
"Children of the 90s are three times as likely to be obese as their parents and grandparents," the Mail Online reports. A UK survey looking at data from 1946 to 2001 found a clear trend of being overweight or obese becoming more widespread in younger generations. A related trend was that the threshold from normal weight to overweight was crossed at a younger age in each successive generation.
The study examined 273,843 records of weight and height for 56,632 people in the UK from five studies undertaken at different points since 1946. The results found children born in 1991 or 2001 were much more likely to be overweight or obese by the age of 10 than those born before the 1980s, although the average child was still of normal weight.
The study also found successive generations were more likely to be overweight at increasingly younger ages, and the heaviest people in each group got increasingly more obese over time. These results won't come as much of a surprise given the current obesity epidemic.
The findings are a potential public health emergency in the making. Obesity-related complications such as type 2 diabetes, heart disease and stroke can be both debilitating and expensive to treat. The researchers called for urgent effective interventions to buck this trend.
Where did the story come from?
The study was carried out by researchers from University College London and was funded by the Economic and Social Research Council.
The Mail Online focused on the risk to children, saying children were more likely to be obese.
But the figures in the study were for obesity and being overweight combined. We don't know how the chances of obesity alone changed over time because there were too few obese children in the earliest cohorts to carry out the calculations.
BBC News gave a more accurate overview of the study and the statistics.

What kind of research was this?
This was an analysis of data from five large long-running cohort studies of people born in the UK between 1946 and 2001. It aimed to see how people's weight changed over time through childhood and adulthood, and how this compared across generations.
Studies like this are useful for looking at patterns and telling us what has changed and how, but can't tell us why these changes arose.

What did the research involve?
Researchers used data from cohort studies that recorded the weight and height of people born in 1946, 1958, 1970, 1991 and 2001.
They used the data to examine how the proportion of people who were normal weight, overweight or obese changed over time for the five birth cohort groups. They also calculated the chances of becoming overweight or obese at different ages, across childhood and adulthood, for the five groups.
The researchers used data from 56,632 people, with 273,843 records of body mass index (BMI) recorded at ages ranging from 2 to 64. BMI is calculated for adults as weight in kilograms divided by height in metres squared.
For children, BMI is assessed differently to account for the way children grow, using a reference population to decide whether children are underweight, normal weight, overweight or obese at specific ages.
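As a quick illustration of the adult calculation, the sketch below applies the standard adult cut-offs. Note that these fixed thresholds deliberately do not apply to children, who are classified against age- and sex-specific centiles of a reference population:

```python
def adult_bmi(weight_kg, height_m):
    """Adult BMI: weight in kilograms divided by height in metres squared."""
    return weight_kg / height_m ** 2

def adult_category(bmi):
    """Standard adult BMI categories. These fixed cut-offs do NOT apply
    to children, whose weight status is judged against reference centiles."""
    if bmi < 18.5:
        return "underweight"
    if bmi < 25:
        return "normal weight"
    if bmi < 30:
        return "overweight"
    return "obese"

# Example: an 85kg adult who is 1.75m tall has a BMI of about 27.8,
# which falls in the overweight range.
bmi = adult_bmi(85, 1.75)
category = adult_category(bmi)
```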
To keep the populations as similar as possible across the cohort studies, the researchers only included data on white people, as there were few non-white people in the earliest study. Immigration of non-white people to the UK did not begin in any significant numbers until the 1950s.
For each of the five cohort studies, men were analysed separately from women, and children were analysed separately from adults. Each cohort was divided into 100 equal centiles, or sub-groups, according to BMI – for example, the 50th centile is the group where half the people in the study have a higher BMI and half have a lower BMI.
Tracking the 50th centile over time can show whether the average person in the group is normal weight or overweight at certain ages. Higher centiles, such as the 98th centile, show the BMI of the heaviest people in the group, where only 2% of people in the group had a higher BMI and 97% had a lower BMI.

What were the basic results?
The study found that:
- People born in the more recent birth cohorts were more likely to be overweight at younger ages. The age at which the average (50th centile) subgroup became overweight was 41 for men born in 1946, 33 for men born in 1958, and 30 for men born in 1970. For women, the age fell from 48 to 44, then to 41 across the three birth cohorts.
- The chances of becoming overweight in childhood increased dramatically for children born in 1991 or 2001. For children born in 1946, the chance of being overweight or obese at the age of 10 was 7% for boys and 11% for girls. For children born in 2001, the chance was 23% for boys and 29% for girls. However, the average children (50th centile) remained in the normal weight range in all five birth cohorts.
- The biggest changes in weight were seen at the top end of the spectrum. The heaviest people from the group born in 1970 (98th centile) reached a higher BMI earlier in life than the people born in earlier birth cohorts.
The researchers say their results show children born after the 1980s are more at risk of being overweight or obese than those born before the 1980s.
They say this is because of their exposure to an "obesogenic environment", with easy access to high-calorie food. They say the changes in obesity over time among older cohorts also support the theory that changes to the food environment in the 1980s are behind the rise in obesity.
They go on to warn that if trends persist, modern-day and future generations of children will be more overweight or obese for more of their lives than previous generations, and this could have "severe public health consequences", as they will be more likely to get illnesses such as coronary heart disease and type 2 diabetes.

Conclusion
The study shows how, while the whole population of England has become heavier over the past 70 years, different generations have been affected in different ways. People born in 1946 were, on average, normal weight until their 40s, but this group has since seen their weight rise and they are now, on average, overweight.
By the time they reached 60, 75% of men and 66% of women from this group were overweight or obese. People born in 1946 from the heaviest cohorts, who were already overweight in early adulthood, are now likely to be obese or very obese.
For people born since 1946, the chance of being overweight as young adults, adolescents or children has been increasing. The chances of being overweight or obese by the age of 40 was 65% for men born in 1958 (45% for women) and 67% for men born in 1970 (49% for women). The chances of children born in 2001 being overweight or obese by the age of 10 are almost three times that of the children born in 1946.
We can deduce from the figures that something may have happened during the 1980s – the decade when the earliest birth cohort's average group moved from normal weight to overweight – to increase the chances of people of all ages becoming overweight or obese.
What these figures can't tell us is what that was, despite the researchers' assertion this was a change to an obesogenic environment. Still, it seems plausible that a combination of high-calorie, low-cost food and an increasingly sedentary lifestyle – both in terms of working life and recreation – contributed to this trend.
This study has some limitations. Of the five studies, four were national studies across the UK, while one (the 1991 study) was limited to one area of England, so may not be representative of the UK as a whole.
More importantly, the five studies used differing methods to record height and weight at different time points. Some records were self-reported, which means they rely on people accurately recording and reporting their own height and weight.
We know that being overweight and obese is bad for our health. These conditions increase the chances of a range of illnesses, including heart disease, diabetes and some cancers. We also know children who are overweight tend to grow up to be overweight or obese adults, so increasing their chances of illness.
This study gives us more information about who is at risk of becoming overweight and at which ages, which may help health services develop better strategies for turning the tide of obesity.
Links To The Headlines
Children of the 90s are THREE times as likely to be obese as their parents and grandparents. Mail Online, May 19 2015
UK children becoming obese at younger ages. BBC News, May 20 2015
Links To Science
Johnson W, Li L, Kuh D, Hardy R. How Has the Age-Related Process of Overweight or Obesity Development Changed over Time? Co-ordinated Analyses of Individual Participant Data from Five United Kingdom Birth Cohorts. PLOS Medicine. Published online May 19 2015
"Scientists believe they may have discovered how to mend broken hearts," reports the Daily Mirror.
While it may sound like the subject of a decidedly odd country and western song, the headline actually refers to damage to the heart muscle.
A heart attack occurs when the muscle of the heart becomes starved of oxygen, causing it to be damaged. If there is significant damage, the heart can become weakened and unable to pump blood effectively around the body. This is known as heart failure, and can cause symptoms such as shortness of breath and fatigue.
The heart contains "dormant" stem cells, and researchers want to learn more about them to work out ways to get them to help repair damaged heart tissue.
In this new laboratory and animal study, researchers identified a characteristic genetic "signature" of adult mouse heart stem cells. This allows the cells to be identified more easily than was previously possible, making them easier to "harvest" for study.
Injections of these cells into damaged mouse hearts were shown to improve heart function, even though very few of the donor cells remained in the heart.
These findings will help researchers to study these cells better, for example investigating whether they could be chemically triggered to repair the heart without removing them first. While the hope is that this research could lead to treatments for human heart damage, as yet the results are just in mice.
The researchers also note that they need to find out whether human hearts have the equivalent cells.
Where did the story come from?
The study was carried out by researchers from Imperial College London and other UK and US universities. It was funded by the British Heart Foundation, European Commission, European Research Council and the Medical Research Council, with some of the researchers additionally supported by the UK National Heart and Lung Institute Foundation and Banyu Life Science Foundation International.
The Mirror’s main report covers the story reasonably, but one of its subheadings – that scientists have identified a protein that if injected can stimulate heart cell regeneration – is not quite right. The researchers have not yet been able to utilise a protein to stimulate heart regeneration. They have just used a specific protein on the surface of the stem cells to identify the cells. So it was the cells, and not the protein, that were used in regeneration.
The Daily Telegraph’s coverage of the study is good and includes some useful quotes from the lead researcher Professor Michael Schneider. The article also makes it clear that this study only involved mice.
What kind of research was this?
This was laboratory and animal research studying the adult stem cells in mice that can develop into heart cells.
A number of diseases cause (or are caused by) damage to the heart. For example, heart attacks occur when some heart muscle cells do not get enough oxygen and die – usually due to a blockage in the coronary arteries that supply the heart muscle with oxygen-rich blood. There are "dormant" stem cells in the adult heart that can generate new heart muscle cells, but are not active enough to completely repair damage.
Researchers are starting to test ways to encourage the stem cells to repair heart damage fully. In this study, the researchers were studying these cells very closely, to understand whether all heart stem cells are the same, or whether there are different types and what they do. This information could help them to identify the right type of cells and conditions they need to fix heart damage.
This type of research is a common early step in understanding how the biology of different organs works, with the aim of eventually being able to develop new treatments for human diseases. Much of human and animal biology is very similar, but there can be differences. Once researchers have developed a good idea of how the biology works in animals, they will then carry out experiments to check to what extent this applies to humans.
What did the research involve?
The researchers obtained stem cells from adult mouse hearts and studied their gene activity patterns. They then went on to study which of these cell types could develop into heart muscle cells in the lab, and which could successfully produce heart muscle cells that could integrate into the heart muscle of living mice.
The researchers started by identifying a population of adult mouse heart cells that is known to contain stem cells. They separated these cells into different groups, some of which are known to contain stem cells, then separated each group into single cells and studied exactly which genes were active in each one. They looked at whether the cells showed very similar gene activity patterns (suggesting they were all the same type of cell, doing the same things), or whether there were groups of cells with different gene activity patterns. They also compared these activity patterns to young heart muscle cells from newborn mice.
Once they identified a group of cells that looked like the cells that could develop into heart muscle cells, they tested whether they would be able to grow and maintain these in the lab. They then injected the cells into the damaged hearts of mice to see if they formed new heart muscle cells, and carried out various other experiments to further characterise the cells that form new heart muscle cells.
What were the basic results?
The researchers found distinct groups of cells with different gene activity patterns. One particular group of these cells was identified as the cells that have started to develop into heart muscle cells. These cells were referred to as Sca1+ SP cells, and one of the genes they expressed produces a protein called PDGFRα, which is found on the surface of these cells. These cells grew and divided well in the lab, and the offspring cells maintained the characteristics of the original Sca1+ SP cells.
When the researchers injected samples of the offspring cells into damaged mouse hearts, they found that between 1% and 8% of the cells remained in the heart muscle tissue the day after the injection. Over time, most of these cells were lost from the heart muscle, but some remained (about 0.1% to 0.5% at two weeks).
By two weeks, some (10%) of the remaining cells were showing signs of developing into immature muscle cells. At 12 weeks, more of the remaining cells (50%) were showing signs of being muscle cells. These cells were also showing signs of being more developed and forming muscle tissue. However, there were only a few of these donor cells in each heart (5 to 10 cells). Some of the donor cells also appeared to have developed into the two sorts of cells found in blood vessels.
Mice whose hearts had been injected with the donor cells showed better heart function at 12 weeks than those who had a "dummy" injection with no cells. The size of the damaged area was smaller in those with donor cell injections, and the heart was able to pump more blood.
Further experiments showed the researchers that they could identify and separate out the cells that specifically develop into heart muscle cells by looking for the PDGFRα protein on their surface. The cells identified in this way grew well in the lab, and when injected into the heart they could integrate into the heart muscle and showed signs of developing into muscle cells after two weeks.
How did the researchers interpret the results?
The researchers concluded that they had developed a way to identify and separate out a specific subset of adult mouse heart stem cells that can generate new heart muscle cells. They say that, at the very least, this will help them to study these cells in mice more easily. If a human equivalent of these cells exists, they may also be able to use this knowledge to obtain stem cells from adult heart tissue.
Conclusion
This laboratory and animal study has identified a characteristic genetic "signature" of adult mouse heart stem cells. This has allowed them to be more easily identified than they have been previously. Injections of these cells have also been shown to be able to improve heart function after heart muscle damage in mice.
These findings will help researchers to study these cells more closely in the lab and investigate how they can prompt them to repair damaged heart muscle, possibly without removing them from the heart first. While the hope is that this research could lead to treatments for human heart damage, for example after a heart attack, as yet the results are just in mice. The researchers themselves note that they now need to find out whether human hearts have the equivalent cells.
Many researchers are working on the potential uses of stem cells to repair damaged human tissue, and studies such as this are an important part of that process.
Links To The Headlines
Stem cells could be used to repair previously irreversible damage from heart attacks, say scientists. Daily Mirror, May 18 2015
'Heartbreak' stem cells could repair damage of heart attack. The Daily Telegraph, May 19 2015
Links To Science
Noseda M, Harada M, McSweeney S, et al. PDGFRα demarcates the cardiogenic clonogenic Sca1+ stem/progenitor cell in adult murine myocardium. Nature Communications. Published online May 18 2015
The Daily Mirror carries the alarming headline that, "Heroin made in home-brew beer kits could create epidemic of hard drug abuse". It says scientists are "calling for urgent action to prevent criminal gangs gaining access to [this] new technology" following the results of a study involving genetically modified yeast.
This study did not actually produce heroin, but an important intermediate chemical in a pathway that produces benzylisoquinoline alkaloids (BIAs). BIAs are a group of plant-derived chemicals that include opioids, such as morphine.
BIAs have previously been made from similar intermediate chemicals in genetically engineered yeast. Researchers hope that by joining these two parts of the pathway, they will get yeast that can produce BIAs from scratch. This could be cheaper and easier than current production methods, which often still involve extraction from plants.
But because morphine can be refined into heroin using standard chemical techniques and yeast can be grown at home, this has led to concerns about the potential misuse of this discovery.
So, will this lead to a rash of "Breaking Bad"-style heroin labs in criminals' garages and spare rooms? We doubt it – at least in the near future. A strain that can produce morphine has not yet been made and would need to be specially genetically engineered to do this, not just using unmodified home-brewing yeast available off the shelf.
Still, it may be worth raising awareness about the potential need for regulation of opioid-producing strains.
Where did the story come from?
The study was carried out by researchers from the University of California and Concordia University in Canada.
It was funded by the US Department of Energy, the US National Science Foundation, the US Department of Defense, Genome Canada, Genome Quebec, and a Canada Research Chair.
The Daily Mirror's reporting takes a sensationalist angle – the picture caption, for example, reads: "Home-brewed heroin is on the rise, scientists warn". No heroin was made in this study, and complete opioid-producing strains of yeast have not been made yet – home-brewing heroin from yeast is not yet possible, much less on the rise.
The possibility of home-brewing comes from a commentary on the article in Nature, which discusses the findings of this and related studies. This commentary also discusses the potential legal implications and ways the risks could be reduced – for example, scientists could choose to produce only yeast strains that make weaker opioids. But the commentators acknowledge that the risk of criminals making opiate-producing yeast strains themselves is low.
The Guardian and BBC News take a slightly more restrained approach, suggesting that home-brew heroin may be a problem in the future but it certainly is not an issue now. The BBC also points out that producing medicines in microbes is not a new thing.
What kind of research was this?
This laboratory research studied whether a group of chemicals called benzylisoquinoline alkaloids (BIAs) could be produced in yeast. BIAs include a range of chemicals used as drug treatments in humans. This includes opioids used for pain relief, as well as antibiotics and muscle relaxants.
Opioids are among the oldest known drugs, first identified as being produced naturally by opium poppies. Morphine is an opioid derived from poppies, and it and other derivatives or man-made versions of opioids are used to treat pain.
Opioids also produce euphoria and can be addictive. The illegal drug heroin is an opiate that can be produced by refining morphine to make it more powerful.
The researchers say many of these compounds are still made from plants such as the opium poppy, as they are chemically very complex and therefore difficult and expensive to make from scratch in the lab.
However, now we know much more about how the chemicals are made in plants, it may be possible to genetically engineer microbes in the lab to produce these chemicals in industrial quantities.
The researchers say the yeast S. cerevisiae – sometimes known as baker's or brewer's yeast – has been used to produce BIAs in the lab from intermediate chemicals in the BIA production pathway. The earlier steps in the pathway have not yet been managed in yeast, although they have in bacteria.
In this study, the researchers wanted to see if they could produce the intermediate chemical (S)-reticuline in yeast. This has been tried before, but was not successful.
What did the research involve?
The researchers knew they needed one particular type of protein called a tyrosine hydroxylase, which would work in yeast to perform the first step in the process of making (S)-reticuline.
They developed a system to allow them to quickly screen a large group of known tyrosine hydroxylases to identify one that would work in yeast. The tyrosine hydroxylase is needed to produce the intermediate chemical dopamine.
The researchers then needed other proteins that convert dopamine and another chemical already present in yeast into another intermediate chemical, and then carry out the other chemical steps needed to form (S)-reticuline. They identified proteins they needed for these stages from the opium poppy and the Californian poppy.
Finally, they genetically engineered yeast cells to produce tyrosine hydroxylase and all of the other proteins needed, and tested whether the yeasts were able to produce (S)-reticuline.
What were the basic results?
The researchers were able to identify tyrosine hydroxylase from the sugar beet that worked in yeast, allowing them to produce the intermediate chemical dopamine. They used genetic engineering to make a version of this protein in yeast that worked even better than the original one.
They were also able to produce the other proteins they needed in yeast. A yeast strain producing all of these proteins was able to produce (S)-reticuline, the chemical intermediate needed in the production of opioids.
How did the researchers interpret the results?
The researchers concluded that coupling their work with the work already done, and improving on the yield of the process, "will enable low-cost production of many high-value BIAs".
They say that, "Because of the potential for illicit use of these products, including morphine and its derivatives [such as heroin], it is critical that appropriate policies for controlling such strains be established so that we garner the considerable benefits while minimising the potential for abuse."
Conclusion
This laboratory study has successfully managed to produce an important intermediate chemical in the pathway that produces benzylisoquinoline alkaloids (BIAs), a group of plant-derived chemicals that include opioids.
BIAs such as morphine have previously been made from similar intermediate chemicals in genetically engineered yeast, but this is the first time the earlier stages have been completed successfully in yeast. The researchers hope that by joining these two parts of the pathways, they will get yeast that can produce BIAs from scratch.
This study has not completed this final step, however. Researchers will need to test this before they know that it will be successful. They acknowledge that further optimisation of their method to produce more of the intermediate chemical is needed before it could be used to produce BIAs.
This study has generated media coverage speculating about the possibility of "home-brew heroin" creating an "epidemic of hard drug use". But the researchers did not produce heroin or any other opioid, only an intermediate chemical. These yeasts have been specially genetically engineered, and the experiments are not the sort of thing most people are going to be able to easily replicate in their garage.
While the likelihood of such strains being successfully made for criminal use seems very small, at least in the short to medium term, criminals can be resourceful. Considering the potential implications of this research and whether policies are needed, both nationally and internationally, may be prudent.
Links To The Headlines
Home-brewed heroin? Scientists create yeast that can make sugar into opiates. The Guardian, May 18 2015
'Home-brewed morphine' made possible. BBC News, May 19 2015
Heroin made in home-brew beer kits could create epidemic of hard drug abuse. Daily Mirror, May 18 2015
Links To Science
DeLoache WC, Russ ZN, Narcross L, et al. An enzyme-coupled biosensor enables (S)-reticuline production in yeast from glucose. Nature Chemical Biology. Published online May 18 2015
"Drinking orange juice every day could improve brain power in the elderly, research shows," the Mail Online reports. Despite the encouraging words from the media, the small study this headline is based on does not provide strong evidence that an older person would see any noticeable difference in their brain power if they drink orange juice for two months.
The study involved 37 healthy older adults, who were given orange juice or orange squash daily for eight weeks before switching to the other drink for the same amount of time. The 100% orange juice contains more flavonoids, a type of plant compound that has been suggested to have various health benefits.
The researchers gave participants a whole battery of cognitive tests before and after each eight-week period. Both drinks caused very little change on any of the test results and were not significantly different from each other on any of the tests individually.
The researchers also carried out analyses where they combined the test results and looked at the statistical relationships between the drink given and when the test was given. On this occasion, they did find a significant result – overall cognitive function (the pooled result of all the tests combined) was better after the juice than after the squash.
But the overall pattern of the results doesn't seem very convincing. This study does not provide conclusive evidence that drinking orange juice has an effect on brain function.
Where did the story come from?
The study was carried out by researchers from the University of Reading, and was funded by Biotechnology and Biological Sciences Research Council grants and the Florida Department of Citrus, also known as Florida Citrus.
Florida Citrus is a government-funded body "charged with the marketing, research and regulation of the Florida citrus industry", a major industry in the state. Florida Citrus was reported to have helped design the study.
The study was published in the peer-reviewed American Journal of Clinical Nutrition.
The Mail Online took the study at face value without subjecting it to any critical analysis. Looking into the research reveals rather unconvincing evidence that drinking orange juice would have any effect on a person's brain function.
What kind of research was this?
This was a randomised crossover trial that aimed to compare the effects of 100% orange juice, which has high flavanone content, and orange-flavoured cordial, which has low flavanone content, on cognitive function in healthy older adults.
Flavonoids are pigments found in various plant foods. It has been suggested they have various health benefits – for example, some studies have suggested that high consumption of flavonoids can have beneficial effects on cognitive function. Flavanones are the specific type of flavonoids found in citrus fruits. This trial investigated the effect of flavanones in orange juice.
This was a crossover trial, meaning the participants acted as their own control, taking both the high and low flavanone content in random order a few weeks apart. The crossover design effectively increases the sample size tested, and is appropriate if the interventions are not expected to have a lasting impact on whatever outcome is being tested.
What did the research involve?
The study recruited 37 older adults (average age 67) who were given daily orange juice or orange squash for eight weeks in a random order, with a four-week "washout" period in between. They were tested to see whether the drinks differed in their effect on cognitive function.
All participants were healthy, without significant medical problems, did not have dementia and had no cognitive problems. In random order, they were given:
- 500ml 100% orange juice containing 305mg natural flavanones daily for eight weeks
- 500ml orange squash containing 37mg natural flavanones daily for eight weeks
The drinks contained roughly the same calories. The participants were not told which drink they were drinking, and the researchers assessing the participants also did not know.
Before and after each of the eight-week periods, participants visited the test centre and had data collected on height, weight, blood pressure, health status and medication. They also completed a large battery of cognitive tests assessing executive function (thinking, planning and problem solving) and memory.
The researchers analysed change in cognitive performance from baseline to eight weeks for each drink, and compared the effects of the two drinks.
What were the basic results?
On the whole, the two drinks caused very minimal change from baseline on any of the individual tests. There was no statistically significant difference between the two drinks when comparing score change from baseline on any of the tests individually.
There was only a single significant observation when looking at the individual tests at the end of treatment (rather than change from baseline). A test of immediate episodic memory was higher eight weeks after drinking 100% orange juice compared with squash (score 9.6 versus 9.1). However, when this was compared to the change from baseline, it didn't translate into any significant difference between the groups.
The researchers also carried out a statistical analysis looking at the interactions between the drink given and the testing occasion. In this analysis, they did find an interaction between the drink and the testing occasion for global cognitive function (all test results combined). This showed that, overall, global cognitive function was significantly better at the eight-week visit after drinking the orange juice.
How did the researchers interpret the results?
The researchers concluded that, "Chronic daily consumption of flavanone-rich 100% orange juice over eight weeks is beneficial for cognitive function in healthy older adults."
They further say that, "The potential for flavanone-rich foods and drinks to attenuate cognitive decline in ageing and the mechanisms that underlie these effects should be investigated."
Conclusion
Overall, this small crossover study does not provide conclusive evidence that drinking orange juice has an effect on brain function.
A wide variety of cognitive tests were performed in this study before and after the two drinks (orange juice and squash). The individual test results do not indicate any large effects. Notably, both drinks caused very little change from baseline on any of the test results, and were not significantly different.
The only significant results were found for overall cognitive function when combining test results and looking at statistical interactions. The fact a consistent effect wasn't seen across individual measures and the different analyses means the results are not very convincing.
The trial is also quite small, including only 37 people. These participants were also a specific sample of healthy older adults who volunteered to take part in this trial, and none of them had any cognitive impairment, so the results may not be applicable to other groups.
While the participants were not told what they were drinking and the drinks were given in unlabelled containers, they did have to dilute them differently. This and the taste of the drinks may have meant the participants could tell the drinks apart. The researchers did ask participants what they thought they were drinking, and although about half said they did not know, most of those who gave an opinion (16 out of 20) got it right.
The study also only compared high- and low-flavanone orange drinks. There was no comparison with a flavanone-free drink, or with foods or drinks that contain other types of flavonoid.
The possible health benefits of flavonoids, or flavanones specifically, will continue to be studied and speculated about. However, this study can't conclusively tell us that they have an effect on brain power.
A good rule of thumb is what's good for the heart is also good for the brain – taking regular exercise, eating a healthy diet, avoiding smoking, maintaining a healthy weight, and drinking alcohol in moderation.
Links To The Headlines
Orange juice 'improves brain function'. The Daily Telegraph, May 15 2015
Links To Science
Kean RJ, Lamport DJ, Dodd GF, et al. Chronic consumption of flavanone-rich orange juice is associated with cognitive benefits: an 8-wk, randomized, double-blind, placebo-controlled trial in healthy older adults. The American Journal of Clinical Nutrition. Published online January 14 2015
"A 'groundbreaking' new therapy for cystic fibrosis could hugely improve patients' quality of life," The Daily Telegraph reports after a combination of two drugs – lumacaftor and ivacaftor – was found to improve lung function.
The headline is prompted by a trial looking at a new treatment protocol for cystic fibrosis, a genetic condition caused by a mutation in a gene that normally creates a protein that controls salt balance in a cell. This leads to thick mucus build-up in the lungs and other organs, causing a persistent cough, poor weight gain and regular lung infections.
The prognosis for cystic fibrosis has improved dramatically over the past few decades, but the condition is still life-limiting. The two drugs in this new combination work together to make the faulty cell protein function better.
More than 1,000 people with cystic fibrosis were given the new protocol or a placebo for 24 weeks. The treatment led to meaningful improvements in lung function compared with the placebo. It also reduced the number of lung infections, improved quality of life, and helped people gain weight.
Further study of the drugs' effects in the longer term will be needed, in addition to collecting more information on side effects.
But this treatment won't work for all people with cystic fibrosis. There are various gene mutations, and this treatment only targeted the most common one, which affects half of people with the condition.
Where did the story come from?
The study was carried out by researchers from various international institutions, including the University of Queensland School of Medicine in Australia, and Queen's University Belfast.
There were various sources of funding, including Vertex Pharmaceuticals, which makes the new treatment.
The study was published in the peer-reviewed New England Journal of Medicine.
The UK media provided balanced reporting of the study, including cautions that the treatment should work in around half of people with cystic fibrosis. Researchers were quoted as saying that although they hope this could improve survival for people with cystic fibrosis, they don't know this for sure.
However, some of the reporting focusing on the quality of life improvements does not take note of the researchers' caution that, overall, these improvements fell short of what was considered meaningful.
The media also debated the cost of the treatment protocol. The Guardian reports that one year's course of lumacaftor alone costs around £159,000. The new treatment protocol is being assessed by the National Institute for Health and Care Excellence (NICE) to see if it is a cost-effective use of resources.
What kind of research was this?
This was a randomised controlled trial (RCT) aiming to investigate the effects of a new treatment for cystic fibrosis.
Cystic fibrosis is a hereditary disease caused by mutations in a gene called cystic fibrosis transmembrane conductance regulator (CFTR). The protein made by the CFTR gene affects the balance of chloride and sodium inside the cells.
In people with cystic fibrosis, the CFTR protein does not work. This causes mucus secretions in the lungs and other parts of the body to be too thick, leading to symptoms such as a persistent cough and frequent chest infections.
There is no cure for cystic fibrosis, and current management focuses on breaking down mucus and controlling the symptoms with treatments such as physiotherapy and inhaled medicines.
We have two copies of all of our genes – one inherited from each parent. To develop cystic fibrosis, you need to inherit two abnormal copies of the CFTR gene. One in 25 people carries a copy of the abnormal CFTR gene. If two carriers have a child, each child has a 1 in 4 chance of receiving the abnormal gene from both parents and developing cystic fibrosis.
This trial looked at the effects of a treatment that helps the abnormal CFTR protein work better, called lumacaftor. It was tested in combination with another treatment called ivacaftor, which also boosts the activity of CFTR proteins.
There are various types of CFTR gene mutations. One, called Phe508del, is the most common and affects 45% of people with the condition. Lumacaftor specifically corrects the abnormality caused by the Phe508del mutation, so this trial only included people with this mutation. An RCT is the best way of examining the safety and effectiveness of a new treatment.
What did the research involve?
This study reports the pooled results of two identical RCTs that have investigated the effects of two different doses of lumacaftor, in combination with ivacaftor, for people with cystic fibrosis who have two copies of the Phe508del CFTR mutation.
The study recruited 1,122 people aged 12 or older; 559 in one of the trials and 563 in the other. Participants in both trials were randomly assigned to one of three study groups:
- 600mg of lumacaftor every 24 hours in combination with 250mg of ivacaftor every 12 hours
- 400mg of lumacaftor every 12 hours in combination with 250mg of ivacaftor every 12 hours
- placebo pills every 12 hours
The placebo pills looked just like the lumacaftor and ivacaftor pills and were taken in the same way, so researchers and participants could not tell whether they were taking placebo or not. All treatments were taken for 24 weeks.
The main outcome examined was how well the participants' lungs worked, measured as a change in percentage of predicted forced expiratory volume (FEV1). This is the amount of air that can be forcibly exhaled in the first second after a full in-breath, which provides a well-validated method of assessing lung health and function.
The percentage of predicted FEV1 shows how much you exhale as a percentage of what you would be expected to, based on your age, sex and height.
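The calculation itself is straightforward: the measured value divided by the predicted value, expressed as a percentage. As a rough illustration (the figures below are hypothetical, not taken from the trial):

```python
def percent_predicted_fev1(measured_litres, predicted_litres):
    """Express a measured FEV1 as a percentage of the value predicted
    for a person of the same age, sex and height."""
    return 100 * measured_litres / predicted_litres

# Hypothetical example: a measured FEV1 of 2.0 litres against a
# predicted value of 3.3 litres comes out at roughly 61% of predicted
print(round(percent_predicted_fev1(2.0, 3.3)))
```

The predicted value comes from reference equations based on age, sex and height; those equations are not shown here.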
The researchers also looked at the change in body mass index (BMI) and in people's quality of life in terms of their lung function, as reported in the patient-reported Cystic Fibrosis Questionnaire – Revised (CFQ-R).
The study analysis included all patients who received at least one dose of the study drug, which was 99% of all participants.
What were the basic results?
At the start of the study, the average FEV1 of participants was 61% of what was predicted (what it ought to be). There were no differences between the randomised groups in terms of age, sex, lung function, BMI or current cystic fibrosis treatments used.
Lumacaftor-ivacaftor significantly improved how well the participants' lungs worked compared with placebo in both trials, and at both doses. The improvement in percentage of predicted FEV1 ranged from 2.6 to 4.0 percentage points compared with placebo over the 24 weeks.
There were also significant improvements compared with placebo in BMI (range of improvement 0.13 to 0.41 units), and respiratory quality of life (1.5 to 3.9 points on the CFQ-R). There was also a reduced rate of lung infections in the treatment groups.
There was similar reporting of side effects across the two treatment groups and placebo groups (around a quarter of participants experienced side effects). The most common adverse effect participants experienced was lung infections.
However, the proportion of participants who stopped taking part in the study as a result of side effects was slightly higher among the drug treatment groups (4.2%) compared with placebo groups (1.6%).
The specific reasons for discontinuation varied between individuals – for example, a couple stopped because of shortness of breath or wheezing; some stopped because of blood in their sputum; some because of a rash; and so on.
How did the researchers interpret the results?
The researchers concluded that, "These data show that lumacaftor in combination with ivacaftor provided a benefit for patients with cystic fibrosis [who carried two copies of] the Phe508del CFTR mutation."
Conclusion
This trial has demonstrated that this new treatment combination could be effective in improving lung function for people with cystic fibrosis who have two copies of the common Phe508del CFTR mutation.
The trial has many strengths, including its large sample size and the fact it captured outcomes at six months for almost all participants. The improvements in lung function were seen while the participants continued to use their standard cystic fibrosis treatments. As the researchers suggest, this indicates the treatment could be a beneficial add-on to normal care to further improve symptoms.
The results seem very promising, but there are limitations to bear in mind. Though the improvements in lung function were said to be clinically meaningful, the improvements in quality of life relating to lung function fell short of what is considered clinically meaningful (four points or more on the CFQ-R scale).
The trial only included people with well-controlled cystic fibrosis, and effects of the treatment might not be as good for people with poorer disease control. The treatment combination would also only be suitable for people with the Phe508del CFTR mutation.
This trial only included people with two copies of this mutation, which is only the case in around 45% of people with the condition. Whether the treatment would benefit people who carry one copy of the Phe508del mutation and a different second CFTR mutation is not yet clear, and people with two non-Phe508del mutations would not be expected to benefit from this treatment.
The effects of this treatment combination will need to be studied in the longer term, beyond six months – for example, to see whether it could prolong life. Further information will need to be collected on side effects and how commonly they cause people to stop treatment.
Though this treatment targets the abnormal protein that causes symptoms, as one of the study authors notes in The Guardian, it is not a cure. The lead researcher, Professor Stuart Elborn, was quoted as saying: "It is not a cure, but it is as remarkable and effective a drug as I have seen in my lifetime."
Overall, the results of this trial show promise for this new treatment for people with cystic fibrosis who carry two copies of this specific gene mutation.
Links To The Headlines
'Groundbreaking' new treatment for cystic fibrosis. The Daily Telegraph, May 17 2015
Cystic fibrosis treatment found to improve lives of sufferers in trials. The Guardian, May 17 2015
Cystic fibrosis drug offers hope to patients. BBC News, May 17 2015
Links To Science
Wainwright CE, Elborn JS, Ramsey BW, et al. Lumacaftor–Ivacaftor in Patients with Cystic Fibrosis Homozygous for Phe508del CFTR. The New England Journal of Medicine. Published online May 17 2015
"Hate injections? Holding your breath can make the pain of jabs more bearable," the Mail Online reports. A team of Spanish researchers mechanically squeezed the fingernails of 38 willing volunteers to cause them pain.
For one round of experiments, the group were told to hold their breath before and during the pain squeeze. In the second round, they had to breathe in slowly while the pain was applied. Those holding their breath reported slightly lower pain ratings overall than those breathing in slowly.
The hypothesis underpinning this technique is that holding your breath increases blood pressure, which in turn reduces nervous system sensitivity, meaning you have a reduced perception of any pain signals.
But before you try this out, it's worth saying the pain perception differences were very small – a maximum 0.5 point difference on a scale from 0 to 10.
Also, the pain scores of the experimental breathing styles weren't compared with those of people breathing normally, so we don't actually know whether either was beneficial overall at reducing pain perception, only relative to one another.
We wouldn't advise changing your breathing habits in an attempt to avoid pain based on the results of this study.
Where did the story come from?
The study was carried out by researchers from University of Jaén in Spain, and was funded by the Spanish Ministry of Science and Innovation.
It was published in the peer-reviewed journal, Pain Medicine.
Generally, the Mail Online reported the story accurately. In their article, the lead study author explained that holding your breath won't work for an unexpected injury, such as standing on a pin or stubbing a toe. But it might work if you start holding your breath before the pain kicks in – for example, anticipating the sting of an injection.
The Mail added balance by indicating other scientists were critical of the findings. They said the pain reduction was very small, and pointed out that holding your breath might make your muscles more tense, which could worsen pain in some circumstances, such as childbirth.
What kind of research was this?
This human experimental study looked at whether holding your breath affects pain perception.
The researchers explain that holding your breath immediately after a deep inhalation slows your heart rate and increases your blood pressure. This stimulates pressure-sensing receptors called baroreceptors to send signals to the brain to reduce blood pressure.
This happens through reduced activity of the sympathetic nervous system, which is involved in the "fight or flight" response to danger. When working as it should, this feedback loop ensures blood pressure doesn't get too high.
The researchers say the dampening down of this part of the nervous system might also reduce sensitivity to pain. In this study, the researchers wanted to test their theory that increasing your blood pressure through holding your breath would reduce your perception of pain.
What did the research involve?
Researchers used a machine to squeeze the fingernails of 38 healthy adult volunteers at different pressures to stimulate pain. Before the squeeze, the group were told to inhale slowly or to hold their breath after a deep breath in.
The researchers analysed ratings of pain in the two breathing styles to see if there was a difference. Volunteers were pre-tested to find a nail squeeze pressure they found painful and three personalised pain intensity thresholds.
Two breathing styles were tested and compared in each person. One involved breathing in slowly for at least seven seconds while the pain was applied. The other involved inhaling deeply, holding your breath while the pain was applied, before exhaling for seven seconds without actively forcing the breath out.
Both groups practised the breathing styles before the experiment began until they were confident they could do it properly. Once they had established their breathing, each volunteer had one fingernail mechanically squeezed for five seconds. After the squeeze, participants could breathe normally.
They were asked to rate the pain on a Likert scale ranging from 0 (not at all painful) to 10 (extremely painful). The experiment was repeated on the same person using three pain intensity thresholds for each breathing condition.
Volunteers knew the experiment was about pain and breathing, but they were not told which breathing condition the study team expected to work.
What were the basic results?
Ratings of pain intensity were consistently higher in the slow breathing group compared with the breath-holders. This held true for each of the three pain intensities tested.
Both breathing styles slowed heart rates, but this happened a little quicker, and the difference was larger, in the breath-holding condition.
How did the researchers interpret the results?
The researchers concluded that, "During breath-holding, pain perception was lower relative to the slow inhalation condition; this effect was independent of pain pressure stimulation."
On the implications of their findings, they said: "This simple and easy-to-perform respiratory manoeuvre may be useful as a simple method to reduce pain in cases where an acute, short-duration pain is present or expected (e.g. medical interventions involving needling, bone manipulations, examination of injuries, etc.)."
Conclusion
This small human experimental study used a fingernail-squeezing machine to cause pain to 38 willing volunteers. It found those instructed to hold their breath before the pain stimulus consistently rated their pain lower than those told to breathe slowly.
The difference between the two breathing groups was very small, although statistically significant. The biggest pain difference seen looked to be less than 0.5 points on a 10-point scale. How important this is to doctors or patients is debatable.
Similarly, the study compared two artificial breathing conditions against one another. They did not compare these against pain scores in people breathing normally throughout. This would have been useful, as it would give us an idea of whether one or both of the breathing types were any better than breathing normally.
On this point, the Mail Online reported that, "On a scale of 1 to 10, the pain experienced by volunteers fell by half a point from 5.5 to 5 when they held their breath". It wasn't completely clear whether they were talking about the difference between the two groups, or the absolute pain reduction relative to normal breathing.
This figure wasn't clear in the published research, so may have come from an interview. If true, it again highlights the rather small reduction in pain found.
The volunteers knew they were taking part in a pain study related to breathing. Participants' general expectations about the likely effects of the two breathing conditions therefore might have biased the results. Larger studies involving study blinding and randomisation would reduce the chance of this bias and others.
Overall, this study shows that changing your breathing pattern might affect your pain perception – but at such a small level that it might not be useful in any practical way.
There may be other dangers in holding your breath in an attempt to control pain. For example, you might feel lightheaded and pass out, or tense your muscles, which can hamper the ease of injections.
If you are worried about having an injection, you should tell the health professional before they give you an injection. They can take steps to make the experience less distressing.
Links To The Headlines
Hate injections? Holding your breath can make the pain of jabs more bearable. Mail Online, May 14 2015
Links To Science
Reyes del Paso G, de Guevara ML, Montoro CI. Breath-Holding During Exhalation as a Simple Manipulation to Reduce Pain Perception. Pain Medicine. Published online April 30 2015
The Daily Telegraph today tells us that: "Single mothers in England [are] more likely to suffer ill health because their families 'do not support them'."
This is a half-truth. The large international study – involving 25,000 people from England, the US and 13 other European countries – behind the headline found a link between single motherhood between the ages of 16 and 49 and worse health in later life. But it did not find this was because families do not support them.
It would appear that this claim was prompted by a trend the researchers spotted in the study: the health risks were more pronounced in northern European countries and the US, while in southern European countries the risk was less pronounced.
The researchers speculated that southern European countries have more of a tradition of informal support networks, where grandparents, aunts, uncles and cousins all pitch in with childcare duties. Or, as the proverb puts it, "it takes a village to raise a child".
While this hypothesis is plausible, it is also unproven, and was not backed up by any new robust data on social support as part of the study.
The study was very large and diverse, so the link between single motherhood and later ill health appears real. However, the reasons and causes behind it are still to be worked out.
Where did the story come from?
The study was carried out by researchers from US, Chinese, UK and German universities and was funded by the US National Institute on Aging.
The study was published in the peer-reviewed Journal of Epidemiology & Community Health.
The media reporting was only partially accurate, as most outlets took the claim about social support at face value. The link between single motherhood and later ill health was supported by the body of this study, but the study did not collect any information on social support, so this explanation, although plausible, was not based on direct evidence.
What kind of research was this?
The study investigated whether single motherhood before the age of 50 was linked to poorer health later in life, and whether the link was stronger in countries with weaker "social [support] safety nets". To do this, the researchers analysed data collected from past cohort and longitudinal studies across 15 countries.
The researchers say single motherhood is known to be linked to poorer health, but it was not known whether this link varied between countries.
Analysing previously collected data is a practical and legitimate study method. A limitation is that the original information was collected for specific reasons that usually differ from the research aims when coming to use it later. This can mean some information that would ideally be analysed is not there. In this study, the researchers couldn’t get information on social support networks, which they thought might explain some of their results.
What did the research involve?
The research team analysed health and lifestyle information on single mothers under 50 collected from existing large health surveys. The single mothers' health was documented into older age and compared across 15 countries.
Data was available from 25,125 women aged over 50 who participated in the US Health and Retirement Study; the English Longitudinal Study of Ageing; or the Survey of Health, Ageing and Retirement in Europe (SHARE). Thirteen of the 21 countries represented by SHARE (Denmark, Sweden, Austria, France, Germany, Switzerland, Belgium, The Netherlands, Italy, Spain, Greece, Poland, Czech Republic) had collected relevant data. With the US and England on board, this gave 15 countries for final analysis.
The researchers used data on number of children, marital status and any limitations on women's capacity for routine activities of daily living (ADL), such as personal hygiene and getting dressed, and instrumental activities of daily living (IADL), such as driving and shopping. Women also rated their own health.
Single motherhood was classified as having a child under the age of 18 while not being married; living with an unmarried partner still counted as being single.
What were the basic results?
Single motherhood between the ages of 16 and 49 was linked to poorer health and disability in later life in several different countries. The risks were highest for single mothers in England, the US, Denmark and Sweden.
Overall, 22% of English mothers had experienced single motherhood before age 50, compared with 33% in the US, 38% in Scandinavia, 22% in western Europe and 10% in southern Europe.
While single mothers had a higher risk of poorer health and disability in later life than married mothers, associations varied between countries.
For example, risk ratios for ADL limitations were significant in England, Scandinavia and the US but not in western Europe, southern Europe and eastern Europe.
Women who became single mothers before the age of 20, who were single mothers for more than eight years, or whose single motherhood resulted from divorce or non-marital childbearing, had a higher risk.
How did the researchers interpret the results?
The researchers concluded that: "Single motherhood during early adulthood or mid-adulthood is associated with poorer health in later life. Risks were greatest in England, the US and Scandinavia."
Although they didn't have good data to back it up, they suggested that social support networks may partially explain the findings. For example, southern Europe, where the researchers say there is a strong cultural emphasis on family bonds, was not associated with higher health risks.
They add: "Our results identify several vulnerable populations. Women with prolonged spells of single motherhood; those whose single motherhood resulted from divorce; women who became single mothers at young ages; and single mothers with two or more children, were at particular risk."
Conclusion
This large retrospective study of over 25,000 women linked single motherhood between the ages of 16 and 49 with worse health in later life. This is not a new finding. What was new was that the link varied across different countries. Risks were estimated as greatest in England, the US and Scandinavia, for example, but were less consistent in other areas of Europe.
The research team thought this might be caused by differences in how social networks supported single mothers in different countries, such as being able to rely on extended families. But they had no data to directly support this. They did not have information on, for example, socioeconomic status, social support or networks during single motherhood, so could not analyse whether these were important causes. They also did not know whether any of the women they classed as single were actually in non-marital or same-sex partnerships, which may have affected results.
Health status in later life is likely to be linked to a complex set of interrelated factors. Being a single mum may be one; social networks might be another. But based on this study we don't yet know for sure, or the mechanisms by which this could lead to worse health.
Studies that collect information on levels of social support alongside health outcomes for single women would be able to tell us whether this is the likely cause, but getting this data may not be easy.
Links To The Headlines
Single mothers more likely to suffer ill health, study finds. The Independent, May 14 2015
'Health risk' for single mothers. Mail Online, May 15 2015
Single mothers in England more likely to suffer ill health because their families 'do not support them'. The Daily Telegraph, May 14 2015
Links To Science
Berkman LF, Zheng Y, Glymour MM, et al. Mothering alone: cross-national comparisons of later-life disability and health among women who were single mothers. Journal of Epidemiology and Community Health. Published online May 14 2015
"Cannabis pills 'do not help dementia sufferers'," reports The Daily Telegraph. Previous research suggested one of the active ingredients in cannabis – tetrahydrocannabinol (THC) – can have effects on the nervous system and brain, such as promoting feelings of relaxation.
In this study, researchers wanted to see if THC could help relieve some of the behavioural symptoms of dementia, such as mood swings and aggression.
They set up a small trial involving 50 dementia patients with behavioural symptoms. They found taking a pill containing a low dose of THC for three weeks did not reduce symptoms any more than a dummy pill. Other studies suggested the substance might have benefits, but these studies were not as well designed as this trial.
The study was small, which reduces its ability to detect differences between groups. But the trend was for a greater reduction of symptoms in the placebo group than the THC group, suggesting that THC would not be expected to be better, even with a larger group.
People taking the THC pills did not show more of the typical side effects expected, such as sleepiness or dizziness. This led researchers to suggest the dose of THC may need to be higher to be effective. Further studies would be needed to determine whether a higher dose would be effective, safe and tolerable.
Where did the story come from?
The study was carried out by researchers from Radboud University Medical Center and other research centres in the Netherlands and the US.
It was funded by the European Regional Development Fund and the Province of Gelderland in the Netherlands. The study drug was provided by Echo Pharmaceuticals, but they did not provide any other funding or have any role in performing the study.
The study was published in the peer-reviewed medical journal, Neurology.
The Daily Telegraph covered this story well.
What kind of research was this?
This was a randomised controlled trial (RCT) looking at the effects of tetrahydrocannabinol (THC), one of the active ingredients in cannabis, on neuropsychiatric symptoms in people with dementia.
This was a phase II trial, meaning it is a small-scale test in people with the condition. It aims to check safety and get an early indication of whether the drug has an effect.
The researchers say they had also carried out a similar trial with a lower dose of THC (3mg daily), which did not have an effect, so they increased the dose in this trial to 4.5mg daily.
People with dementia often have neuropsychiatric symptoms, such as being agitated or aggressive, delusions, anxiety, or wandering.
The researchers report that existing drug treatments for dementia have a delicate balance of benefits and harms, and non-drug treatments are therefore preferred, but they have limited evidence of effectiveness and may be difficult to put into practice.
An RCT is the best way to assess the effects of a treatment. Randomisation is used to create well-balanced groups, so the treatment is the only difference between them. This means any differences in outcome can be attributed to the treatment itself and not to other confounding factors.
What did the research involve?
The researchers enrolled 50 people with dementia and neuropsychiatric symptoms. They randomly assigned them to take either a THC pill or an identical-looking inactive placebo pill for three weeks. They assessed symptoms over that time and looked at whether these differed in the two groups.
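Random assignment of this kind can be sketched in a few lines of code. This is a simplified illustration only (real trials typically use blocked or stratified randomisation to keep group sizes balanced), and the group labels are just placeholders:

```python
import random

def assign_groups(participant_ids, groups=("THC", "placebo"), seed=None):
    """Randomly assign each participant to one of the study groups.

    Simple randomisation: each participant is assigned independently,
    so group sizes are only balanced on average, not exactly.
    """
    rng = random.Random(seed)  # seeded for reproducibility
    return {pid: rng.choice(groups) for pid in participant_ids}

# Assign 50 hypothetical participants to two groups
assignments = assign_groups(range(50), seed=42)
print(sum(1 for g in assignments.values() if g == "THC"), "assigned to THC")
```

With a fixed seed the assignment is reproducible; in practice the allocation sequence would also be concealed from the researchers assessing outcomes, as it was in this trial.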
The trial initially intended to assess people who also had pain, but the researchers could not find enough people with both symptoms to participate, so they focused on neuropsychiatric symptoms. It also intended to recruit 130 people, but did not reach this number because of delays in getting approval for the trial in some centres.
About two-thirds (68%) of the participants had Alzheimer's disease and the rest had vascular dementia or mixed dementia. They all had experienced neuropsychiatric symptoms for at least a month. Both groups were taking similar neuropsychiatric drugs, including benzodiazepines, and continued to take these drugs during the study period.
People with a major psychiatric disorder or severe aggressive behaviour were excluded. Just over half (52%) lived in a special dementia unit or nursing home. The participants were aged about 78 years on average.
The pills contained 1.5mg of THC (or none in the case of placebo) and were taken three times a day for three weeks. Neither the participants nor the researchers assessing them knew which pills they were taking, which stops them influencing the results.
The researchers assessed the participants' symptoms at the start of the study, two weeks later, and then at the end of the study. They used a standard questionnaire, which asked the caregiver about symptoms in 12 areas, including agitation or aggression and unusual movement behaviour, such as pacing, fidgeting or repeating actions such as opening and closing drawers.
The researchers used a second method to measure agitated behaviour and aggression, and also measured quality of life and the participants' ability to carry out daily activities. They also assessed whether the participants experienced any side effects from treatment. The researchers then compared the results of the two groups.
What were the basic results?
Three participants did not complete the study: one in each group stopped treatment because they experienced side effects, and one in the placebo group withdrew their consent for taking part.
Both the placebo and the THC pill groups had a reduction in neuropsychiatric symptoms during the trial. There was no difference in the reduction between the groups. The groups also did not differ in a separate measure of agitation and anxiety, quality of life, or ability to carry out daily activities.
Two-thirds of people (66.7%) taking THC experienced at least one side effect, as did over half of those taking placebo (53.8%). The sorts of side effects that have been previously reported with THC, such as sleepiness, dizziness and falls, were actually more common with placebo.
How did the researchers interpret the results?
The researchers concluded they found no benefit of 4.5mg oral THC for neuropsychiatric symptoms in people with dementia after three weeks of treatment.
They suggested the dose of THC used may be too low because the participants did not experience the expected side effects of THC, such as sleepiness.
Conclusion
This small phase II trial has found no benefit of taking THC pills (4.5mg a day) for neuropsychiatric symptoms in people with dementia in the short term.
The authors say this contrasts with previous studies, which have found some benefit. However, they note the previous studies were also limited in that they were even smaller, did not have control groups, or did not collect data prospectively.
The study was small, which reduces its ability to detect differences between groups. However, the non-significant trend was for a greater reduction of symptoms in the placebo group than the THC group.
The researchers note the improvement in the placebo group was "striking" and may be the result of factors such as the attention and support received from the study team, expectations of the participants on the effects of THC leading to perceived improvement, and training of the nursing personnel in the study.
While the authors suggest the dose of THC may need to be higher, further studies are needed to determine whether a higher dose would be effective and safe.
Links To The Headlines
Cannabis pills 'do not help dementia sufferers'. The Daily Telegraph, May 13 2015
Links To Science
van den Elsen G, Ahmed AIA, Verkes R, et al. Tetrahydrocannabinol for neuropsychiatric symptoms in dementia. Neurology. Published online May 13 2015
"Poor grip can signal chances of major illness or premature death," the Mail Online reports. An international study has provided evidence that assessing grip strength could help identify people who were at higher risk of cardiovascular incidents such as a heart attack.
The study authors wanted to see whether muscle strength, measured by grip, can predict the chances of getting a range of illnesses, and of dying, in high-, medium- and low-income countries. To find out, they tested 142,861 people across 17 countries and tracked what happened to them over the course of four years. The study found that the chances of dying during this period were higher for people with weaker grips, as were the chances of having a heart attack or stroke.
The grip test predicted death from any cause better than systolic blood pressure did, but the blood pressure test was better at predicting whether someone would have a heart attack or stroke.
Grip tests may be a quick way of assessing someone's chances of having cardiovascular disease, or dying from it, but the study doesn't tell us whether muscle weakness causes illness, or the other way around.
It is unlikely that a "grip test" would replace standard protocols for diagnosing cardiovascular diseases, which rely on a combination of risk assessment methods and tests, such as an electrocardiogram (ECG) and coronary angiography. However, such a test could be useful in areas of the world where access to medical resources is limited.
Where did the story come from?
The study was carried out by researchers from 23 different universities or hospitals, in 17 different countries. It was led by researchers at McMaster University in Ontario, Canada, and funded by grants from many different national research institutes and pharmaceutical companies. The study was published in the peer-reviewed medical journal The Lancet.
The media reported the study reasonably accurately, although the Mail and The Daily Telegraph seemed to confuse the maximum grip strength measured by a dynamometer with the strength of a person's handshake, which isn't the same thing. You would hope that somebody shaking your hand wouldn't try to grip it as powerfully as possible.
What kind of research was this?
This was a longitudinal population study carried out in 17 countries with high, medium and low income levels. It aimed to see whether muscle strength, measured by grip, could predict someone's chances of illness or death from many different causes. As this was an observational study, it can't tell us whether grip strength is a cause of illness or death, but it can show us whether the two things are linked.
What did the research involve?
Researchers recruited 142,861 people from households across the 17 countries included in the study. They tested their grip strength and took other measurements, including their weight and height, and asked questions about their:
- activity levels
- general health
They followed up with participants every year for an average of four years, to find out whether they were still alive and whether they had developed certain illnesses. After four years, the researchers used the data to calculate whether grip strength was linked to a higher or lower risk of having died or developed an illness.
The researchers aimed to get an unbiased sample of people from across the countries involved. They tried to get documentary evidence about cause of death, if people had died. However, if that was not available, they asked a standard set of questions of the people in their household to try to ascertain the cause of death. Most people in the study had their grip strength tested in both hands, although some at the start of the study had only one hand tested.
The data was analysed in a number of different ways, to check whether the results were consistent across different countries and within the same country.

One big problem with this type of study is reverse causation. This means that the thing being measured – in this case, grip strength – could be either a cause or a consequence of illness. So someone with a weak grip might have weak muscles because they are already ill with a fatal disease. To try to get around this, the researchers ran one analysis excluding everyone who had died within six months of being enrolled in the study, and another excluding everyone with cardiovascular disease or cancer at the start of the study.

The results were adjusted to take account of age and gender, because older people and women, on average, have weaker muscle strength than younger people and men.

What were the basic results?
The researchers had follow-up data for 139,691 people, of whom 3,379 (2%) died during the study. After adjusting their figures, researchers found that people with lower grip strength were more likely to have died during the study, whether from any cause, cardiovascular disease or non-cardiovascular disease. People with low grip strength were also more likely to have had a heart attack or stroke. There was no link between grip strength and diabetes, hospital admission for pneumonia or chronic obstructive pulmonary disease (COPD), injury from falling, or breaking a bone. The results did not change significantly when excluding people who died within six months, or who had cancer or cardiovascular disease at the start.
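As an aside on the arithmetic behind these findings: the study reports risk per fixed unit of grip strength (a hazard ratio of 1.16 per 5kg reduction), and under the proportional hazards model commonly used in analyses like this, such ratios multiply rather than add as the reduction grows. This is a general property of that model, not a calculation reported by the researchers; the sketch below is illustrative only.

```python
# Illustrative only: how a per-5kg hazard ratio scales under the
# proportional hazards assumption. The 1.16 figure is the study's
# reported value; the scaling rule is standard Cox-model arithmetic.

HR_PER_5KG = 1.16  # hazard ratio per 5kg reduction in grip strength

def hazard_ratio(reduction_kg: float) -> float:
    """Hazard ratio for a given reduction in grip strength (kg),
    assuming the log-hazard is linear in grip strength."""
    return HR_PER_5KG ** (reduction_kg / 5.0)

print(round(hazard_ratio(5), 2))   # 1.16, as reported
print(round(hazard_ratio(10), 2))  # 1.35 - risks multiply, not add
```

So a 10kg reduction corresponds to roughly a 35% higher hazard, not 32%, because the two 16% increases compound.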
Grip was measured in kilograms (kg) and adjusted for age and height. Average values for men ranged from 30.2kg in low-income countries to 38.1kg in high-income countries. On average, across all study participants, a 5kg reduction in grip strength was associated with a 16% increase in the chance of death (hazard ratio 1.16, 95% confidence interval 1.13 to 1.20). Grip strength alone was more strongly associated with the chance of dying from cardiovascular disease (such as a heart attack or stroke) than systolic blood pressure – a more commonly used measurement. However, blood pressure was better at predicting whether someone was going to have a heart attack or stroke.

How did the researchers interpret the results?
The researchers said their findings showed that muscle strength is a strong predictor of death from cardiovascular disease and a moderately strong predictor of having a heart attack or stroke. They say that muscle strength predicts the chances of death from any cause, including non-cardiovascular disease, but not the chances of getting non-cardiovascular illness.
They go on to say that these findings suggest muscle strength may predict what happens to people who get illnesses, rather than just predicting whether they get ill. When they looked at what happened to people who got ill, whether from cardiovascular disease or other causes, those who had low grip strength were more likely to die than those with high grip strength.
They say they cannot tell from the study why there is an association between muscle strength and chances of getting cardiovascular disease. They say further research is needed to see whether improving muscle strength would cut the chances of having heart attacks or strokes.

Conclusion
These are interesting results from a range of very different countries, showing that people with low muscle strength may be at higher risk of dying prematurely than other people. Earlier studies in high-income countries had already suggested that this was the case, but this is the first study to show it holds true across countries from high to low incomes.
The study also shows that Europeans, and men from high-income countries, on average, have higher grip strength than people from lower-income countries. Interestingly, women from middle-income regions, such as China and Latin America, had slightly higher muscle strength than women from high-income countries.
What we don't know from the study is why and how muscle strength is linked to the chances of death. It might seem obvious that people who are weak and frail are more at risk of death than other people, but we don't know whether this is because they are already ill, or whether their weak muscle strength makes them more vulnerable to getting ill, or less able to survive illness if they do get sick.
Importantly, the study doesn't tell us what can be done for people with low muscle strength. Should we all be doing weight training to increase our strength, or would that make no difference? Low muscle strength may reflect lots of things, such as the amount of exercise people do, what type of diet they eat, their age and occupation. We know that muscle strength declines as we age, but we don't know the effect of trying to halt this decline.
Should doctors routinely measure people's grip strength to test their health? The researchers say it is a better predictor of cardiovascular death than blood pressure, and could be easily used in lower-income countries. But raised blood pressure and cholesterol both increase the risk of cardiovascular disease, and there are treatments available to get them under control. Simply measuring a person’s grip strength would miss this opportunity and not lead to any preventive strategies.
The "grip test" could be used in poorer countries as a quick way to identify people who might be at risk of heart attack or stroke, who could then benefit from follow-up testing.
Links To The Headlines
Palm 'holds secrets of future health'. BBC News, May 14 2015
Handshake strength 'could predict' heart attack risk. The Daily Telegraph, May 14 2015
Why testing your grip could save your life. Daily Express, May 14 2015
Links To Science
Leong DP, Teo KK, Rangarajan S, et al. Prognostic value of grip strength: findings from the Prospective Urban Rural Epidemiology (PURE) study. The Lancet. Published online May 13 2015