Christina Reith opens spreadsheets full of adverse events the way most people scroll through social media: methodically, looking for patterns that others miss. Over the past several years, she and her colleagues have compiled something extraordinary: every reported side effect, from every participant, across 23 major clinical trials of statins. That’s 154,664 people, followed for a median of nearly five years, generating millions of data points about headaches, muscle aches, memory lapses, and dozens of other complaints. The question they wanted to answer was deceptively simple: which of these problems do statins actually cause?
The timing matters. Since 2013, when prominent media coverage suggested statins cause side effects in one in five people, prescription rates have wobbled. In the UK alone, more than 200,000 patients stopped taking their pills following those reports. The researchers estimate this could have led to 2,000 to 6,000 avoidable heart attacks and strokes over the following decade. People were making decisions based on package inserts listing 66 potential side effects—everything from cognitive impairment to sleep disturbances to sexual dysfunction. But package inserts, Reith’s team realised, weren’t based on the kind of rigorous evidence you’d want when deciding whether to take a drug that might save your life.
Here’s the trick with observational studies: if you give millions of people a pill and then track what happens to them, you’ll see everything that happens to people. Headaches. Depression. Weight gain. Insomnia. The challenge is figuring out what the pill caused versus what would have happened anyway. Randomised, double-blind trials solve this problem elegantly—half the participants get the real drug, half get a placebo, and neither they nor their doctors know who got what. Compare the two groups and the difference reveals what the drug actually does.
The Cholesterol Treatment Trialists’ Collaboration, coordinated between Oxford and Sydney, gathered individual participant data from every large, long-term, double-blind statin trial they could find. Not summary statistics—the raw reports of every cough, every dizzy spell, every complaint recorded during years of treatment. They coded everything using a standardised medical dictionary, controlling for multiple testing using statistical methods designed to avoid false discoveries. Then they looked at what happened to people taking statins versus people swallowing placebos.
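The article doesn't say which correction the collaboration used, but a standard way to screen dozens of outcome comparisons while avoiding false discoveries is the Benjamini–Hochberg false-discovery-rate procedure. A minimal sketch, using made-up p-values purely for illustration:

```python
# Benjamini-Hochberg FDR control: a common method for testing many
# outcomes at once without flooding the results with false positives.
# (Illustrative only -- these p-values are invented, and the paper's
# exact statistical method is not described in this article.)

def benjamini_hochberg(p_values, alpha=0.05):
    """Return indices of hypotheses rejected at FDR level alpha."""
    m = len(p_values)
    # Sort p-values ascending, remembering their original positions.
    order = sorted(range(m), key=lambda i: p_values[i])
    threshold_rank = 0
    for rank, idx in enumerate(order, start=1):
        # The BH step-up criterion: p_(k) <= (k/m) * alpha.
        if p_values[idx] <= rank / m * alpha:
            threshold_rank = rank
    # Reject every hypothesis up to the largest rank that passed.
    return sorted(order[:threshold_rank])

# Hypothetical p-values for a handful of the 66 labelled conditions.
p = [0.001, 0.20, 0.03, 0.85, 0.004, 0.62]
print(benjamini_hochberg(p))  # -> [0, 4]: only two comparisons survive
```

With a 5 per cent false-discovery rate, only the two smallest p-values clear their rank-adjusted thresholds here, which is the flavour of filtering that leaves 4 signals standing out of 66 candidates.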
For 62 of the 66 conditions listed on statin labels, the rates were essentially identical. Cognitive impairment? Reported by 0.2 per cent of people taking statins and 0.2 per cent taking placebo each year. Sleep disturbance, erectile dysfunction, depression, peripheral neuropathy: no meaningful difference. “Among the large number of patients assessed in this well-conducted analysis, only four side effects out of 66 were found to have any association with taking statins, and only in a very small proportion of patients,” says Bryan Williams, Chief Scientific and Medical Officer at the British Heart Foundation.
Those four conditions tell a more nuanced story. Statins did increase liver function test abnormalities. About 783 people per year taking statins reported abnormal liver transaminases versus 556 on placebo. Sounds worrying until you realise that’s an absolute excess of roughly one person in a thousand. And crucially, there was no increase in actual liver disease; no hepatitis, no liver failure. The blood test changes didn’t typically lead to serious liver problems, suggesting the elevations matter less than the alarming language of “liver dysfunction” might imply. The effect appeared dose-dependent too, with higher-intensity statins showing bigger increases, particularly atorvastatin 80 milligrams.
Two other findings emerged from the analysis: small increases in urinary composition changes and oedema. But here’s where dose-response becomes informative—when they analysed trials comparing more intensive versus less intensive statin regimens, they found no relationship between dose and these outcomes. That absence of a dose-response suggests the associations might not be causal at all, despite the statistical significance. The absolute numbers remain tiny regardless: about three extra people per 10,000 per year for urinary changes, seven for oedema.
What about muscle symptoms, the side effect everyone associates with statins? Previous work by this same collaboration already established that statins cause muscle pain in about one in 100 people during the first year of treatment, with no excess after that. Serious muscle damage (rhabdomyolysis) occurs in perhaps two or three people per 100,000. Real, but rare. Meanwhile, an effective statin regimen prevents major vascular events in about 1,000 of every 10,000 people with existing heart disease treated for five years, and 500 of every 10,000 at high risk but without prior events.
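The benefit-to-harm arithmetic implied by those figures can be sketched for 10,000 people with existing heart disease treated for five years. This is back-of-envelope only: the article does not state a time window for the rhabdomyolysis rate, so it is assumed here to apply per treatment course.

```python
# Rough benefit-vs-harm tally per 10,000 people with existing heart
# disease treated with a statin for five years, using the rates quoted
# in the article. (Assumption: "two or three per 100,000" for
# rhabdomyolysis is read as ~2.5 per 100,000 per treatment course.)

N = 10_000  # treated population

vascular_events_prevented = 1_000        # 1,000 per 10,000 over five years
muscle_pain_cases = N * (1 / 100)        # ~1 in 100 in year one, no excess after
rhabdomyolysis_cases = N * (2.5 / 100_000)

print(f"Major vascular events prevented: {vascular_events_prevented}")
print(f"Excess muscle pain cases:        {muscle_pain_cases:.0f}")
print(f"Rhabdomyolysis cases:            {rhabdomyolysis_cases:.2f}")
print(f"Events prevented per muscle-pain case: "
      f"{vascular_events_prevented / muscle_pain_cases:.0f}")
```

On these numbers, every excess case of (transient) muscle pain is set against roughly ten major vascular events prevented, and serious muscle damage is a fraction of a single case in the whole treated group.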
“Statins are life-saving drugs used by hundreds of millions of people over the past 30 years,” Reith says. “However, concerns about the safety of statins have deterred many people who are at risk of severe disability or death from a heart attack or stroke. Our study provides reassurance that, for most people, the risk of side effects is greatly outweighed by the benefits of statins.”
The implications reach beyond individual prescribing decisions. Rory Collins, senior author on the paper and Emeritus Professor at Oxford, argues the findings demand action from regulators: “Statin product labels list certain adverse health outcomes as potential treatment-related effects based mainly on information from non-randomised studies which may be subject to bias… Now that we know that statins do not cause the majority of side effects listed in package leaflets, statin information requires rapid revision to help patients and doctors make better-informed health decisions.”
That revision matters because misinformation has consequences. Williams calls the new evidence “a much-needed counter to the misinformation around statins” that should “help prevent unnecessary deaths from cardiovascular disease.” The researchers couldn’t assess every possible long-term effect; the trials followed people for a median of just under five years. And some adverse effects might be too rare to detect even in 154,664 participants. But for the dozens of conditions currently listed as potential side effects despite no solid evidence they’re actually caused by the drug? Those labels mislead more than they inform. Sometimes the most important finding is what you don’t find.
Study link: https://www.thelancet.com/journals/lancet/article/PIIS0140-6736(25)01578-8/fulltext
