This 19th-century Benjamin Disraeli quote is more apropos than ever, especially since medical policy has migrated from common sense to “Evidence Based Practice.” Is this statistics run amok?
Evidence Based Medicine sounds so reassuring. Yet there are two huge problems with it.
Problem #1: Which evidence?
The human race is far too diverse for single solutions to medical problems. People will have different reactions, sometimes allergic ones, to various drugs. However, this does not stop the medical profession from trying to find common ground for all. We suppose this makes sense, but only if it is assumed that doctors are basically robots that detect a symptom and apply a standard remedy. A good example: “Over 50? Take statins.”
The grand scientific justification for all this is usually statistics.
Medicine didn’t always employ this statistical approach. Although statistics has been around for a few hundred years, randomized clinical trials didn’t really start until the 1950s. The drug companies began using them to peddle drugs that didn’t have an obvious benefit. Penicillin is a drug with an obvious benefit: it will zap almost any bacterial infection that isn’t resistant to it, and it has saved millions of lives. There was no statistical analysis done at all for penicillin. Statins are a different story. Take a statin and nothing much happens at first. The circulatory system doesn’t immediately shed all its plaque. In fact, the effect of statins is so non-apparent that proving it calls for a bigger hammer, and that bigger hammer is, of course, statistics. And this is where the mischief often begins. For any non-obvious medical assertion, no matter how often repeated or how widely believed, if statistics are needed to prove its efficacy, eyebrows should automatically go up.
Statistics
People who design clinical trials rely on statistics packages. I read a book on designing clinical trials (I am neither a doctor nor a clinical researcher) and was frankly appalled. Much of the trial size and construction is dictated by whatever statistics package is being used. There was little discussion of “why.” This seems odd, rather like the statistical fox guarding the henhouse. Further, anyone with knowledge of these packages knows that by selecting subsets and manipulating parameters of the cohorts treated, almost any result can be obtained. Keep an eye out for this: if something in the cohort description seems irrelevant, it was probably selected to obtain a desired result. This meddling with the statistical parameters has gotten so bad that in the psychology journals, the researcher is often forbidden to modify the statistical analysis methodology after the experiments begin.
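To see how easily subgroup selection alone can manufacture a result, consider a purely hypothetical simulation (nothing here is from any real trial): a treatment with zero true effect, sliced into enough arbitrary subgroups, will usually hand back at least one “statistically significant” benefit.

```python
# Hypothetical illustration: a "drug" with zero true effect, examined across
# many arbitrary subgroups, will usually yield at least one p < 0.05 result.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 2000                                   # participants: half treated, half control
treated = rng.normal(0.0, 1.0, n // 2)     # outcome scores, no true difference
control = rng.normal(0.0, 1.0, n // 2)

# Look at 100 arbitrary "subgroups" (age bands, regions, and so on).
significant = 0
for subgroup in range(100):
    idx_t = rng.choice(len(treated), size=100, replace=False)
    idx_c = rng.choice(len(control), size=100, replace=False)
    _, p = stats.ttest_ind(treated[idx_t], control[idx_c])
    if p < 0.05:
        significant += 1

print(f"{significant} of 100 null subgroups reached p < 0.05 by chance")
```

Roughly one in twenty null subgroups is expected to clear p < 0.05 by chance alone, which is precisely why the journals now insist the analysis plan be fixed before the data come in.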
John von Neumann, a major contributor to 20th-century mathematics, had this to say about adjusting parameters to obtain desired results: “With four parameters I can fit an elephant, and with five I can make him wiggle his trunk.” It is all too easy. Even genuinely honest researchers will inadvertently be prone to selection bias. It’s just human nature.
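Von Neumann’s elephant is not just a quip. A toy sketch (the numbers are made up, which is the point): a cubic has four free parameters, so it will pass exactly through any four data points, however meaningless they are.

```python
# Four parameters fit four arbitrary points exactly; the perfect "fit" proves nothing.
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([3.1, -0.4, 7.7, 2.2])          # made-up data with no underlying pattern

coeffs = np.polyfit(x, y, deg=3)             # cubic: 4 free parameters
print(np.allclose(np.polyval(coeffs, x), y))  # True: every point matched exactly
```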
Furthermore, statistics need only be used if the result is indefinite. If 50% of a cohort benefits from a treatment, that’s a done deal, at least as far as efficacy is concerned. The other side of the coin is the side effects: how many people is the treatment hurting, and how much? If the treatment seems safe, it might take a lot of testing to elicit the correct answer, and statistics could help there. Unfortunately, side effects are often swept under the carpet with phrases like “well tolerated” or “no obvious side effects.”
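Rough numbers make the distinction concrete. In the sketch below (the effect sizes are illustrative assumptions, not trial data), a benefit that reaches half the cohort is visible in a handful of patients, while a couple of percentage points of benefit needs thousands of subjects before statistics can separate it from noise.

```python
# Rough sample sizes needed to detect a benefit at 80% power, alpha = 0.05.
# The response rates below are illustrative assumptions, not from any trial.
from statsmodels.stats.proportion import proportion_effectsize
from statsmodels.stats.power import NormalIndPower

power = NormalIndPower()
for base, treated in [(0.10, 0.60),   # "obvious": 50-point absolute improvement
                      (0.10, 0.12)]:  # "non-obvious": 2-point absolute improvement
    es = proportion_effectsize(treated, base)
    n = power.solve_power(effect_size=es, alpha=0.05, power=0.8)
    print(f"{base:.0%} -> {treated:.0%}: about {n:.0f} patients per arm")
```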
Problem #2: I am not a statistic
And neither are you. Medicare, for instance, won’t pay for an inexpensive PSA blood test. This is tragic, because prostate cancer is one of the few cancers that will exhibit early warning signs in a blood test. And caught early, prostate cancer is seldom fatal. How could this seemingly illogical decision have been made? The answer is statistical. Statistically, men whose PSA is measured and, if indicated, subsequently treated, live no longer than men whose PSA is not measured at all. So why bother with PSA? This defies common sense to such an extreme extent that it seems to have blown right by the high priests of medicine who make such decisions. What is the problem here? Is it detection? Or is it unnecessary or inappropriate treatment? Over 27,000 men in the U.S. die annually from this cancer, and without a doubt, most of these men did not get early treatment and would be alive if they had. Still, it’s Evidence Based Medicine, so that’s OK. Right?
Men: the hell with statistics. Use common sense. Get that PSA test. Insist. Pay for it yourself if necessary. This way, you can avoid being number 27,001. And if you do have prostate cancer, DON’T PANIC. The odds that you have it at all are roughly your age, expressed as a percentage, and most likely it won’t kill you. Most men go to their graves with prostate cancer, but not because of it. If a close watch is kept, the PSA fluctuations will indicate when and if treatment is necessary. Or would you rather skip the PSA test and just hope for the best?
Statins are the drug industry’s solution to just about everything, including the bottom line. Statins lower LDL cholesterol. Why would we want to do this? Common wisdom: LDL is said to cause heart attacks. Common sense: our bodies regulate LDL cholesterol at a certain level for some very good reason, and it is folly to arbitrarily tamper with that level. Not nice to fool Mother Nature.
So what are the net results for statins? That depends on the statistics. If we select the right statistics, the ones from trials the drug industry sponsored, statins are beneficial. If we look at other statistics, there appears to be no benefit: any reduction in heart attacks is offset by deaths from other causes. It may turn out that statins are quite harmful, as the long-term effects aren’t yet fully known. It is entirely possible that the dementia and diabetes epidemics are exacerbated by statins. Unfortunately we don’t have any statistics to support this (yet).
What about statins and common sense? Let’s start with LDL cholesterol. Why does it cause heart attacks? Actually, only certain types of LDL cholesterol are dangerous at all. LDL particles come in different sizes: the smaller ones are dangerous, the larger ones aren’t. Some people naturally have the larger particles, but anyone can shift their LDL particles toward the larger, harmless sizes. People with larger cholesterol particles certainly do not need their cholesterol lowered. Want to make sure you have these large particles? Amazingly, you don’t get there by eating low-cholesterol food or eliminating saturated fat. The safe larger particles are obtained by cutting sugar and starches. So, for starters, why doesn’t standard practice look into these nuances? It certainly makes no sense to give statins to someone who is unlikely to ever have a heart attack. But au contraire: for a huge segment of the medical profession, it’s statins if you are over 50, or have slightly high cholesterol, or blood pressure over 140, and so on.
Will common sense ever return to medicine?
There is a place for statistics, make no mistake. Without further knowledge, whichever treatment has worked best in the past for others would be the appropriate first choice, and that is statistical inference. But statistics today has taken on a grand, exalted power it simply doesn’t deserve. Even obvious medical choices aren’t undertaken without elaborate statistical justification, and this broad application is often used to make individual recommendations like “cut out the saturated fat.” These recommendations would almost always be far more effective if the actual patient’s likely response to the proposed intervention were examined instead. If one is really worried about, say, the healthiness of saturated fats, do an experiment: measure triglycerides, fasting glucose, and average glucose. We want all three lower; lower is healthier. Then significantly increase or decrease the saturated fat for a three-month period and measure again, as in the sketch below. Do whichever makes you, the individual you, not the statistical you, get healthier.
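A minimal sketch of the bookkeeping for that self-experiment (the values are hypothetical placeholders; only the comparison logic matters):

```python
# Hypothetical n-of-1 comparison: the same three markers measured before and
# after a three-month change in saturated-fat intake. Lower on all three wins.
baseline = {"triglycerides": 140, "fasting_glucose": 98, "average_glucose": 105}
after    = {"triglycerides": 110, "fasting_glucose": 92, "average_glucose": 100}

for marker in baseline:
    change = after[marker] - baseline[marker]
    direction = "improved" if change < 0 else "worsened"
    print(f"{marker}: {baseline[marker]} -> {after[marker]} ({direction})")

if all(after[m] < baseline[m] for m in baseline):
    print("All three markers lower: keep the dietary change.")
else:
    print("Mixed result: revert, or try the opposite change and re-measure.")
```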
Generic statistical advice never applies to any…one.
“If your experiment needs statistics, you ought to have done a better experiment.” Ernest Rutherford