Report smarter: Five lessons from Health News Review

Published on May 12, 2014

When Gary Schwitzer recently announced that his funding had run out for HealthNewsReview.org, it caused a surprising amount of angst among health reporters.

If you know Gary, you were saddened, of course, because he’s such a generous spirit and powerful voice for positive change. But if you don’t know Gary personally, you are likely to have had one overriding impression of him. He and his team of expert reviewers assessed one of your stories for HealthNewsReview.org and probably found some problems you either didn’t know were there or were hoping that no one noticed.

Gary is a pioneer. He created the first-ever self-regulated, expert-based, scientifically rigorous assessment of health reporting in the U.S. It has made journalism better. It has made individual reporters better. I have been one of his reviewers for years, and it certainly has made me better. If it is not rescued by an angel investor, we all are going to miss it.

Fortunately for all of us, Gary analyzed 1,889 stories reviewed for the site and published the results online in JAMA Internal Medicine. Here are five lessons from that analysis.

Cover the basics. Gary founded HealthNewsReview.org on the premise that there are 10 essential elements to every news story about a new drug, device, procedure, or treatment.

In JAMA’s summary, reviewers analyzed stories based on whether the story:

1. adequately discussed the costs of the intervention;
2. adequately quantified the benefits of the intervention;
3. adequately quantified the harms of the intervention;
4. evaluated the quality of the evidence;
5. widened the diagnostic boundaries of illness and promoted public awareness of these widened boundaries, which may expand the market for treatments, a practice that has been termed disease mongering;
6. quoted independent sources and identified the conflicts of interest of sources;
7. compared the new approach with existing alternatives;
8. established the availability of the intervention;
9. established whether the approach was truly novel; and
10. appeared to rely solely or largely on a news release as the source of information.

Uncover the true risk. Most news stories quoting risk statistics are talking about the relative reduction in risk, and that can make the benefits of a treatment look bigger than they really are. I’ve written about how to combat this phenomenon before. Gary offers one good example from André Picard at Toronto’s The Globe and Mail, who wrote an article titled, “Be skeptical about the Herceptin hype”:

Herceptin, according to the studies, cut the death rate by one-third. That sounds impressive, but relative risk reductions always do. In reality, the difference in the death rate between the Herceptin and non-Herceptin groups was 2% after three years, and 4% after four years.
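The arithmetic behind that distinction is worth internalizing. Here is a minimal sketch in Python, using invented numbers (not figures from the Herceptin trials), of how the same data yield a modest absolute risk reduction and a headline-friendly relative one:

```python
# Hypothetical illustration: the event rates below are invented for the
# sketch, not taken from any actual trial.

def risk_reductions(control_rate, treated_rate):
    """Return (absolute, relative) risk reduction for two event rates."""
    arr = control_rate - treated_rate   # absolute risk reduction
    rrr = arr / control_rate            # relative risk reduction
    return arr, rrr

# Suppose 12% of untreated patients die versus 8% of treated patients.
arr, rrr = risk_reductions(0.12, 0.08)
print(f"Absolute risk reduction: {arr:.0%}")  # 4 percentage points
print(f"Relative risk reduction: {rrr:.0%}")  # 33% -- "cut deaths by a third"
```

A press release will usually lead with the 33% figure; a careful story reports the 4-point absolute difference alongside it.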

Mark a study’s limits. All studies have limitations: small sample sizes, new and therefore untested methods, a lack of controls. But reporters too often skip over those limitations, often failing to note that an association between two things does not necessarily mean that one caused the other. Gary wrote about repeated stories linking coffee to good and bad health outcomes, almost always without noting the limitations of the study in question:

Each story used language suggesting cause and effect had been established, although it had not. Examples of such language are as follows: “coffee can kill you,” “2 cups of coffee lowers uterine cancer risk,” “one or more cups a day reduces stroke risk,” coffee “radically reduces the risk of colon and rectal cancer,” and “coffee fights skin cancer.”

Get a second opinion. Perhaps the most surprising finding in Gary’s JAMA analysis was this: “Half of all stories reviewed relied on a single source or failed to disclose the conflicts of interest of sources.”

If you polled journalism students around the country on the lessons they learned in their first year in school, the top answer surely would be: Always find more than one source. Yet the stories Gary and his team of experts reviewed too often lifted text directly from news releases or relied on a single person – or company – for every fact in the piece. This can be dangerous for a number of reasons, not the least of which is the reliability of the information. Gary noted that David Brown from The Washington Post documented a case where a biotech CEO was criminally convicted for “willfully overstating in a press release the evidence for benefit of a drug his company made.”

Recognize the difference between signs and destinations. Reporters often write about cholesterol levels, blood pressure, blood sugar, and other measures that are often called “surrogate markers” as if they were the same thing as ischemic heart disease, stroke, or diabetes. These markers are important ways to monitor changes in someone’s health status and sometimes the only way for clinicians to understand whether a person’s condition is improving or deteriorating. But the simple fact of cholesterol going up or down doesn’t mean that a person will have a greater or lesser chance of dying from a heart attack. Gary wrote:

WebMD reported, “Beet juice may fight dementia.” Not only did this study point to a surrogate end point, it was conducted on a sample of only 14 people in just 4 days. USA Today reported, “New drug ‘may turn back the clock on heart disease.’” However, the story focused on changes in cholesterol values that may not lead to any true benefit to patients.

I hope that a wealthy health care executive reads Gary’s piece in JAMA Internal Medicine and decides to cut him a check. I also hope that Gary brings his insights and his passion for making health journalism better to Reporting on Health. It’s nearly always sunny in Los Angeles, Gary!

Photo by Gunther Eysenbach via Flickr.