Before I start talking about that vitamin study you all want to know about, I want to say a few words about MSNBC and FOX NEWS.
Trust me, it’s relevant.
No matter what side of the political fence you’re on, I’m sure you’ll agree that cable news has become extremely shrill and highly partisan. MSNBC and FOX may agree on the facts they’re reporting but then spin them in entirely different ways to reach entirely different conclusions. Each political argument is founded on certain “ifs, ands, and maybes”; i.e., this policy will lower (or raise) the debt assuming certain projections (such as medical costs or unemployment) turn out to be true. Different researchers come up with very different projections (just read the Wall Street Journal stock advice columns!). Depending on whose projections and figures you use, even well-intentioned, honest people can come to very different conclusions.
So why am I talking about cable news in a story about women and vitamins? Because, sadly, the same thing that happens on cable news happens in nutrition science. The problem is everyone knows it’s happening in cable news, but people naively think science is always “objective” and reporting about science is actually accurate.
Neither is true.
Take the latest scary study that’s got everybody all atwitter about how, if you’re an older woman taking some common vitamins, you might die.
The Media’s Take: Fair and Balanced, Anyone?
Let’s start with the reporting. One typical headline I saw about this story shrieked, “More Bad News About Vitamins!” Now if you read that without shaking your head, go back and think for a minute about what’s implied in that headline. We’re talking one study with a very mildly (and very questionable) negative result (we’ll get to that in a minute).
Now compare that one study to the dozens and dozens of studies that come out on a regular basis showing the benefits of vitamin K, vitamin D, vitamin C, minerals like selenium and magnesium, fats like omega-3s, and even, in several studies, the lowly multivitamin. A writer or newspaper or television station with a different slant might easily have titled this story, “A Surprising Negative Study on Vitamins Amidst a Sea of Positive Ones.” “More Bad News About Vitamins!”? Serious? (Yes, I used “serious?” instead of “seriously” on purpose. I feel like it gives me street cred. Please humor me.)
OK, now let’s get to the study itself, and what it found. Which isn’t very much. But let’s take a look.
“Let’s Go To The Videotape”
The study was titled “Dietary Supplements and Mortality Rate in Older Women: The Iowa Women’s Health Study.”
The researchers took the database of the Iowa Women’s Health Study and examined the records of 38,772 older women (average age 61.2 at the start of the study), looking specifically at their use of dietary supplements.
Well, they didn’t exactly look at the women at all, since it was not a clinical study. No one was given supplements and monitored, supplement use wasn’t confirmed by any outside source, nothing like that. No, they assessed supplement use with three… count ‘em, three… self-reporting questionnaires given to the women at three different points during the 18-year study, which began in 1986 and continued through 2004. (No one was asked about doses, brands, combinations, nothing. Just “Did you use a supplement?” “Yes: vitamin C, vitamin B, vitamin E, multivitamin, calcium, iron.”)
OK, cool, see you in 11 years or so!
The researchers then examined the death records through the State Health Registry of Iowa and through the National Death Index. They checked on all of the original 38,772 women and found that by Dec. 31, 2008, 15,594 of them had indeed died. (Which was approximately 40 percent of the women. But remember, at baseline, in 1986, they were pushing 62, and this was 22 years later. An optimistic way to look at it is that 60 percent of these ladies were living into their mid-eighties! But I digress, and this really has nothing to do with the story.)
But that’s OK, because the study itself is pretty boring and doesn’t have very much to tell us. Although you’d never know it from the media attention it got (see above). First let’s look at the conclusions of the study, then we’ll talk about what they mean. (Spoiler alert: they mean next to nothing. I’ll show you why.)
The conclusions of the study (in the researchers’ words): “In older women, several commonly used dietary vitamin and mineral supplements may be associated with increased total mortality risk; this association is strongest with supplemental iron. In contrast to the findings of many studies, calcium is associated with decreased risk.” Since the words “associated” or “association” are used three times in the above paragraph, let’s take a minute and look at what an association (observational) study actually is.
What Exactly Is An “Observational” Study?
In an observational study from which many associations are generated, you take a whole bunch of people—thousands of them—and you gather data about a zillion different things.
Maybe it’s blood pressure and cholesterol, maybe it’s heart disease, maybe it’s what they ate for breakfast, how often they brush their teeth, how many of their parents had diabetes, how many of them own television sets, practice the rhumba, love Lady Gaga, take antidepressants, or pop a Centrum now and then.
OK, now you’ve got a statistician’s version of heaven— tons and tons of data. Eighty gazillion gigabytes of numbers from thousands of people, and it’s your job to see if there’s any pattern, to determine which things are “associated,” meaning “found together.” If two things are said to be associated, that means there is some relationship between these two things that’s unlikely to be an accident.
Which brings us to “yellow finger syndrome.”
Correlation, Cause and “Yellow Finger Syndrome”
Interestingly, people with lung cancer are more likely to have yellow, stained fingers. So yellow stained fingers are positively “associated” (correlated) with lung cancer. In any given group, the more cases of yellow fingers you see, the more cases of cancer are likely.
Hmm…so who would have yellow fingers?
Let me guess. Smokers?
You can see in this case how wrong it would be to assume that because two things are associated, there is a cause and effect relationship. An association is not proof of cause. Yellow fingers don’t cause lung cancer, and lung cancer doesn’t cause yellow fingers. They’re found together because they’re both associated with a third variable, namely smoking. Smoking causes lung cancer, and yellow fingers are a kind of irrelevant by-product of the real cause. (This kind of mistake is made all the time in cholesterol studies where high cholesterol is “associated” with heart disease except it’s not a cause even though everyone thinks it is. But I digress.)
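Since the yellow-finger example is really a statistics lesson in disguise, here’s a toy simulation of it. Every probability below is made up purely for illustration: smoking drives both yellow fingers and lung cancer, and yellow fingers have zero causal effect on anything. Yet the crude numbers show a strong “association” with cancer that evaporates once you stratify by smoking status.

```python
import random

random.seed(0)

# Made-up probabilities: smoking causes both yellow fingers and lung
# cancer; yellow fingers themselves cause nothing.
people = []
for _ in range(100_000):
    smoker = random.random() < 0.3
    yellow = random.random() < (0.6 if smoker else 0.02)
    cancer = random.random() < (0.10 if smoker else 0.005)
    people.append((smoker, yellow, cancer))

def cancer_rate(group):
    """Fraction of a group with lung cancer."""
    return sum(c for _, _, c in group) / len(group)

# Crude comparison: yellow fingers look strongly "associated" with cancer.
yellow_group = [p for p in people if p[1]]
clean_group = [p for p in people if not p[1]]
print("yellow fingers:", cancer_rate(yellow_group))
print("clean fingers: ", cancer_rate(clean_group))

# Stratify by the real cause: among smokers, yellow fingers tell you
# nothing extra -- the association was all confounding.
smokers = [p for p in people if p[0]]
print("smokers, yellow:", cancer_rate([p for p in smokers if p[1]]))
print("smokers, clean: ", cancer_rate([p for p in smokers if not p[1]]))
```

Run it and the crude cancer rate in the yellow-finger group comes out several times higher than in the clean-finger group, even though the simulation contains no causal link between fingers and cancer at all.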
So one thing we might ask is, what else might be true of women who are taking vitamins? Remember this started in 1986, and vitamin usage wasn’t what it is now. Maybe these people were a bit sicker at baseline and were seeking out vitamins as a way of not getting sicker? Maybe they were people who were eating a particularly bad diet and told themselves that vitamin caps would make up for it? Who knows?
You always have to ask yourself, with any association, what else might be going on here? What else might be “interfering” or “confounding” the results? Were all the vitamin takers, for example, also soccer players? (Of course not, but there’s a wild example of how an uncontrolled variable can have a huge effect on the results without anyone noticing.)
The Confounding Variable Issue
Researchers are very aware of confounding variables, so they try to adjust for these influences with statistical techniques (“adjusting for possible confounding variables”), but they don’t always adjust for the right ones. Or they can over-adjust and wind up with an “association” that’s a pure statistical fluke. I’ll come back to this “adjusting” thing in a minute. It’s very relevant to our little story, and wait till you hear how it relates to this study.
Though you’d never know it in a million years from any newspaper article or television story about this study, here’s what was true of the supplement-using women at the beginning of the study. (This is taken directly from the actual research paper in the Archives of Internal Medicine.)
At baseline, compared to nonusers, supplement users:
- had a lower prevalence of diabetes
- had a lower prevalence of high blood pressure
- smoked less
- had lower average BMI
- had lower average waist to hip ratio
- had higher educational levels
- were more physically active
- were more likely to be on estrogen replacement therapy
Then, get this (you’re going to love this one!): Adjusted for age and calorie intake, use of vitamin B complex, vitamins C, D, and E, and calcium was associated with a significantly lower risk of total mortality compared to non-use.
Wait, I thought the study concluded vitamin takers had a higher risk of total mortality?
Patience, grasshopper. We aren’t finished with the data.
OK, the researchers must have thought, age and calories are important, glad we adjusted for those, but there are probably a few other things to adjust for, so they did just that. “With further adjustment only the use of calcium retained a significantly lower risk of mortality,” they explain.
So none of the vitamins (except calcium) had a protective effect, which was exactly the hypothesis they set out to prove. (Their words: “Our hypothesis, based on the findings of a previous study by some of us, was that the use of dietary supplements would not be associated with a reduced rate of total mortality.”) Great, hypothesis confirmed, vitamins suck, we can all go home now, right?
Ah, what the heck. Let’s squeeze the data a little more, throw in some more things, see what we come up with. Uh oh. Squeeze that data even more and presto: those three-times-in-18-years self-reports of vitamin use are now “associated” with a higher rate of mortality. Do I have to tell you they were serving champagne that day in every marketing department of every pharmaceutical company in America?
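To see how re-slicing the same data can flip an association’s sign, here’s a Simpson’s-paradox toy example. The numbers are invented and have nothing to do with the Iowa data: within every stratum the supplement users do better, yet pooled together they look worse, simply because more of the users happen to sit in the higher-risk stratum.

```python
# Invented counts: (people, deaths) per group per stratum.
# Within EACH stratum, users have half the death rate of nonusers.
strata = {
    "younger": {"users": (100, 1), "nonusers": (900, 18)},   # 1% vs 2%
    "older":   {"users": (900, 90), "nonusers": (100, 20)},  # 10% vs 20%
}

def pooled_rate(group):
    """Death rate for a group with the strata collapsed together."""
    n = sum(strata[s][group][0] for s in strata)
    d = sum(strata[s][group][1] for s in strata)
    return d / n

# Pooled, the direction flips: users look WORSE (9.1% vs 3.8%),
# because 90% of the users are in the high-mortality "older" stratum.
print("users pooled:   ", pooled_rate("users"))
print("nonusers pooled:", pooled_rate("nonusers"))
```

Same data, two opposite conclusions, depending entirely on whether (and how) you stratify. That is why each round of “further adjustment” in an observational study can swing an association from protective to harmful without anyone fudging a single number.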
So What’s the Risk?
The real punch line is that with all that hoopla, what “increased risk” are we talking about? Depending on the vitamin, maybe 6 percent to 15 percent. But let’s look at what that means, since it sounds way worse than it is. If non-vitamin users died at a rate of 15 per 1,000, vitamin users would be expected to die at a rate of between 15.9 women per 1,000 (a 6 percent increase in risk) and 17.25 women per 1,000 (a 15 percent increase in risk). Now that’s no small thing if you happen to be among the extra one or two women per 1,000 affected, but let’s keep it in perspective. It’s a tiny association of questionable meaning, not exactly “the death toll for the multivitamin,” as Dr. David Katz solemnly proclaimed it on the Huffington Post.
I mean, come on.
Look, I’m not dismissing this study completely. But I am saying that there’s very little likelihood there’s anything to it. Put enough data into the mix and you can come up with associations to make almost any case. (The China Study, T. Colin Campbell’s book about the China Project, a massive study of diet and health in rural China, is a perfect example of this kind of data selecting. Out of the 8,000 associations generated in the original China Project, Campbell picked just those that supported his pro-vegan hypothesis and put them in the book, conveniently omitting the many associations that refuted his theory. But don’t get me started.)
Now if I were preparing a scholarly rebuttal to this study, I could go back and search out the many, many studies showing how low folic acid is a risk factor for cancer, how folic acid helps prevent neural tube birth defects, how vitamin D affects mood, physical performance, obesity, and cancer, how vitamin C increases phagocytosis (a function of the immune system); indeed, how virtually every vitamin tested in the study has been shown in other studies to perform vitally important functions essential to your health.
But honestly, I give speeches, write books and columns, and run a health website for a living; I don’t have research assistants. I don’t have graduate-student interns who can look all this stuff up and find the references.
So what I’m hoping is that one of the more brilliant health bloggers like Denise Minger or Chris Masterjohn, avowed and self-described data-nerds, will spend a week sitting up all night with the research and will come up with their usual brilliant, referenced, unimpeachable, “just the facts, ma’am” rebuttals to the findings in this study.
Meanwhile, let me just say this: it’s a tempest in a teapot. Does it make any logical sense that in a study of nearly 39,000 women spanning more than two decades, with eight gazillion other factors involved, popping the equivalent of a Centrum or One-A-Day (or saying that you did on the three questionnaires you filled out over the course of the study) made you more likely to die?
That just doesn’t pass the smell test for me.