A Small Rant About the Meaning of Significant vs. “Significant”



Jim Manzi has a long blog post today about the Oregon Medicaid study that got so much attention when it was released a couple of weeks ago. Along the way, I think he mischaracterizes my conclusions, but I’m going to skip that for now. Maybe I’ll get to it later. Instead, I want to make a very focused point about this paragraph of his:

When interpreting the physical health results of the Oregon Experiment, we either apply a cut-off of 95% significance to identify those effects which we will treat as relevant for decision-making, or we do not. If we do apply this cut-off…then we should agree with the authors’ conclusion that the experiment “showed that Medicaid coverage generated no significant improvements in measured physical health outcomes in the first 2 years.” If, on the other hand, we wish to consider non-statistically-significant effects, then we ought to conclude that the net effects were unattractive, mostly because coverage induced smoking, which more than offset the risk-adjusted physical health benefits provided by the incremental utilization of health services.

I agree that we should either use the traditional 95 percent confidence standard or we shouldn’t, and if we do, we should use it for all of the results of the Oregon study. The arguments for and against a firm 95 percent cutoff can get a little tricky, but in this case I’m willing to accept the cutoff, and I’m willing to apply it consistently.

But here’s what I very much disagree with. Many of the results of the Oregon study failed to meet the 95 percent standard, and I think it’s wrong to describe this as showing that “Medicaid coverage generated no significant improvements in measured physical health outcomes in the first 2 years.”

To be clear: it’s fine for the authors of the study to describe it that way. They’re writing for fellow professionals in an academic journal. But when you’re writing for a lay audience, it’s seriously misleading. Most lay readers will interpret “significant” in its ordinary English sense, not as a term of art used by statisticians, and therefore conclude that the study positively demonstrated that there were no results large enough to care about.

But that’s not what the study showed. A better way of putting it is that the study “drew no conclusions about the impact of Medicaid on measured physical health outcomes in the first 2 years.” That’s it. No conclusions. If you’re going to insist on adhering to the 95 percent standard—which is fine with me—then that’s how you need to describe results that don’t meet it.
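To make that concrete, here’s a toy calculation with made-up numbers (nothing below comes from the Oregon data; the point estimate and standard error are purely illustrative). A 95 percent confidence interval that spans zero is compatible both with no effect and with a sizable one, which is exactly why the honest summary is “no conclusion”:

```python
# Hypothetical illustration, NOT the Oregon study's actual numbers:
# a 95% confidence interval that spans zero says "no conclusion,"
# not "no effect."
import math

effect = 0.20    # assumed point estimate of improvement
std_err = 0.15   # assumed standard error

# 95% confidence interval: estimate +/- 1.96 standard errors
lo = effect - 1.96 * std_err
hi = effect + 1.96 * std_err

print(f"95% CI: ({lo:.2f}, {hi:.2f})")  # prints: 95% CI: (-0.09, 0.49)
# The interval includes 0 (no effect) but also 0.49 (a large effect).
# The data are consistent with both, so the study simply can't tell.
```

A result like that fails the 95 percent cutoff, but nothing about it “positively demonstrates” that the true effect is zero or trivial.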

Next up is a discussion of why the study showed no statistically significant results. For now, I’ll just refer you back to this post. The short answer is: it was never in the cards. This study was almost foreordained not to find statistically significant results from the day it was conceived.
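For a feel of why, here’s a minimal power simulation with assumed numbers (the sample size, effect size, and variance below are illustrative stand-ins, not the study’s): when the true effect is small relative to the noise and the sample, most simulated experiments fail to clear the 95 percent bar even though the effect is real in every single one of them.

```python
# A minimal power simulation under assumed numbers (not the study's):
# a real but small treatment effect, a modest sample, and the 95%
# cutoff together make a "significant" finding unlikely from the start.
import math
import random
import statistics

def one_experiment(n=500, true_effect=0.05, sd=1.0):
    """Simulate one two-arm trial; return True if p < 0.05 (two-sided z-test)."""
    control = [random.gauss(0.0, sd) for _ in range(n)]
    treated = [random.gauss(true_effect, sd) for _ in range(n)]
    diff = statistics.mean(treated) - statistics.mean(control)
    se = math.sqrt(statistics.variance(control) / n +
                   statistics.variance(treated) / n)
    return abs(diff / se) > 1.96

random.seed(1)
hits = sum(one_experiment() for _ in range(1_000))
print(f"'significant' in {hits / 10:.1f}% of 1,000 simulated runs")
# Under these assumptions only roughly 12% of runs clear the bar, even
# though the effect is real in every one of them: that's low statistical
# power, not evidence of a null effect.
```

If the deck is stacked like that before a single patient is enrolled, a “no significant improvements” headline tells you more about the study’s design than about Medicaid.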
