Why you should flag estimates in your health stories

Published on May 27, 2016

This is the latest post in an ongoing series called “The Power of Small Data.” You can find earlier installments here. — Ed.

Presidential candidates started talking about prescription drug deaths on the campaign trail last year.

Then, after Prince died and people close to him started saying he had been prescribed Percocet, stories about overdose deaths in the United States started to pop up again.

The numbers being used in those stories — 2 million Americans addicted, 44 dying a day — are all estimates.

The latest CDC report on opioid deaths declares that from “1999 to 2014, more than 165,000 people have died in the U.S. from overdoses related to prescription opioids.” Those numbers, too, are estimates.

Most numbers you see in health stories, in economic stories, in stories about the environment or war or a natural disaster, are estimates.

Yet very few stories acknowledge that.

Here are a few ways that you can help illuminate the estimation process for your audience.

Name estimates when you use them. Most reporters present each number as a concrete fact, as if every person who ever died from a prescription drug overdose was given an autopsy, had the requisite amount of the drug found in their system, and was then logged into some national database of drug deaths. Even if that were the case, we’d still have the problem of human error, which means the resulting number would still be an estimate.

Pointing out that a number is an estimate serves two purposes in news stories. First, it gives your audience an accurate reflection of the evidence. Second, over time it will reduce frustration and cynicism among readers who feel that one study is telling them one thing and another is telling them something completely different. Steven Reinberg at HealthDay helpfully used the term “estimates” recently in a piece about the impact of diabetes on overall health. The emphases in the text are mine.

The researchers estimated life expectancy and years lived with disability using data from Australian diabetes and death registries. At age 50, a diabetic man can expect to live another 30 years, on average — about 17 of them with disability. A woman that age with diabetes will likely live about 34 years, but she will be burdened with disabilities for roughly 21 of those years, the study authors estimated.

Explain the source of the estimates. Reinberg does this in the paragraph above, too. He says that the data came from “Australian diabetes and death registries.” Even better if you can be more specific, link back to the studies involved, or link directly to the data. Suzanne Perez Tobias at the Wichita Eagle gave a tidy list of sources for a story on health insurance rates in Kansas.

The estimates are based on statistical models combining data from a variety of sources, including the American Community Survey, Census Bureau population estimates, federal tax returns, Medicaid participation records and 2000 Census statistics. For [Susette] Schwartz, the Hunter Health Clinic executive, all those numbers add up to more people at the clinic’s five sites in Wichita.

Get into the science. We treat our audiences as if they know far less about math and science than they actually do. I think we could afford to talk to them at a higher level, especially when we are asking them to believe health findings that may have a direct impact on their lives. Instead of worrying that we might scare them away, bore them or confuse them, we should do what Tobias started to do in that paragraph: explain a bit more about what went into making the estimates we are asking them to put faith in.

Here’s an example from business writing. Business writers have always been more comfortable than health writers with showing the math behind their numbers, perhaps because there is a regular stream of estimates coming out about labor, sales, prices and the like, which are routinely revised upward or downward in the next iteration. Nelson Schwartz, then at Fortune and now at The New York Times, wrote a piece about future scenarios for oil prices, as predicted by an investment fund executive:

To come up with some likely scenarios in the event of an international crisis, his team performed what's known as a regression analysis, extrapolating the numbers from past oil shocks and then using them to calculate what might happen when the supply from an oil-producing country was cut off in six different situations. The fall of the House of Saud seems the most far-fetched of the six possibilities, and it's the one that generates that $262 a barrel. … Regression analysis may be mathematical but it's an art, not a science. And some of these scenarios are quite dubious, like Venezuela shutting the spigot.

Notice that last bit? It’s always a good idea to bring your own good judgment into the writing. If there are reasons to be doubtful about the estimates, tell your audience. They will trust you more over time, learn more, and make better decisions as a result.
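For readers curious what “extrapolating the numbers from past oil shocks” actually looks like, here is a minimal sketch of that kind of regression-and-extrapolation exercise. Every number below is invented purely for illustration; a real analysis like the one Schwartz describes would use far more data and variables.

```python
# Toy illustration of regression-based extrapolation.
# All figures are made up for demonstration only.

def fit_line(xs, ys):
    """Ordinary least-squares fit of y = a + b*x."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    b = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
    a = mean_y - b * mean_x
    return a, b

# Hypothetical past "shocks": share of world supply cut off (%)
# versus the resulting price (dollars per barrel). Invented data.
supply_cut = [1, 3, 5, 8]
price = [60, 95, 130, 185]

a, b = fit_line(supply_cut, price)

# Extrapolate to a scenario well outside the observed range --
# exactly the step where judgment matters more than math.
scenario_cut = 12
predicted = a + b * scenario_cut
print(round(predicted))  # prints 256
```

Note that the scenario value sits outside anything in the historical data, which is why the analyst in Schwartz’s piece calls the exercise “an art, not a science”: the math is mechanical, but the confidence you place in the answer is a judgment call.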

[Photo by Paul Sableman via Flickr.]