SXSW Interactive: Learn from scientists for better journalism

Published on March 13, 2012

Both journalism and science are about "the quest for truth," said the presenters at the SXSW Interactive panel "What Journalism Can Learn from Science."

Journalists know something is true when two people say so, said Gideon Lichfield, media editor of The Economist, and Matt Thompson, editorial product manager at NPR and adjunct faculty at the Poynter Institute. Scientists are much more rigorous. How can we make journalism more like science?

Lichfield and Thompson wore lab coats during their presentation to make the point that certain properties of scientific research can really improve journalism. Science is collaborative; scientists know they are contributing to a huge body of work and recognize that they are building on others' work.

"There's no lone reporter in the wilderness just covering a story on their own," said Thompson.

So a story from Politico might cite a report by the Center for Public Integrity. Google News, with its grouping of reports from various sources, makes it clear that there are clusters of reporters covering a particular story or issue. Scientists have a Science Citation Index that makes clear that their work builds on others'. Thompson and Lichfield proposed a "citation index for news."

"We don't know yet what a citation index would look like, but the beginnings of it are there," said Lichfield.

Science is also replicable; scientists lay bare how they've come to their conclusions. When you read a news story, can you recreate the "trail of investigation"? Lichfield said no; we tend to protect our work. While there is a standard in news to make clear the sources of facts, the depth of attribution varies. From the presentation, for example, one slide put these citations together:

...said the official.

...the official told me.

...the official told me over drinks after work.

...said the official in a call with reporters.

Which tells you the most about how the information was obtained?

Science also uses a kind of "predictive test," which allows you to test hypotheses and come up with theories while recognizing that there is no absolute certainty. Theories "remain falsifiable," which is a trait journalism does not always have, said Lichfield.

Thompson pointed to the "Friedman Unit," the way famed columnist Thomas Friedman makes very predictive statements about what will happen over set periods of time. And this is something journalists and their sources do all the time: Are aging nuclear reactors the newest, biggest threat? Will a politician's statement lead voters away? When will the war end?

"We include all these predictions," said Thompson, "but right now, that's where we leave it."

Lichfield and Thompson said that any tool journalists use to make their work more rigorous needs to have certain characteristics. Tools need to be self-justifying, with obvious usefulness. They need interoperability, or usefulness on multiple platforms. And they should be easy to use, even for the journalists who are most resistant to change. (Storify is a good example of a news tool that meets all these criteria, the presenters said.) So what are some things news organizations are doing to add rigor and transparency to their reports?

  • ProPublica provides "reporting recipes," which explain to readers how a story was investigated and give people a chance to repeat the process.
  • The New York Times uses a tool called DocumentCloud. When you hover over certain links, it takes you to an original source and the reporters' notes.
  • Academic papers and Wikipedia use footnotes, which add context and allow for easier fact-checking and, Thompson said, could help with the flow of narratives while still providing transparency.
  • PolitiFact created the "Obameter," which tracks the President's campaign claims and whether those claims were fulfilled. This kind of tool presses politicians and sources to offer "testable hypotheses" instead of "vague predictions." When it was built, the Obameter was not the only way to evaluate the President's performance, but it offered a good example of a "prediction tracker" that encourages follow-up.
  • Add data and precision. Many organizations are beginning to do this; Thompson pointed to the efforts of StateImpact at NPR to integrate data into much of its reporting.

These kinds of changes are not always easy to implement, but they represent a shift of focus that Thompson and Lichfield can get excited about.

Read more from SXSW Interactive.

Photo credit: Horia Varlan via Flickr