Patient safety in Oregon: How I got the story

Published on March 15, 2012

My experience reporting on health care in Oregon has been mostly positive, particularly with regard to transparency. Public information is typically handed over without fuss, officials are reachable and often willing to talk, and the state, at least in my experience, has a generally favorable attitude toward the press. When I started my project on patient safety, I figured I would encounter much the same thing. I was wrong.

Oregon is far less transparent about individual hospital quality than many other states. It has no mechanism for publicly releasing data from its inpatient hospital discharge database, which is typically used to track rates of patient safety problems. In many states, that database is controlled by a public agency; in Oregon, it's controlled by the hospital association.

I had proposed my project using a dataset I knew existed, but found out during the fellowship that there were better data and better ways to measure quality than the dataset I had planned to use. So I spent the first few months trying to track down that data and gain access. The effort was frustrating and ultimately fruitless. I lucked out a bit: last fall, the Centers for Medicare and Medicaid Services published some of the data I had been looking for, covering Medicare patients. It wasn't what I wanted, but it was something I could use.

I also convinced the research branch of the state health department to do some high-level analyses for me. Their analysis wouldn't let me compare hospitals, but it would let me look at rates of errors throughout the state and in Central Oregon, the region where I report.

Next, I set about finding patients. This was much easier: I have been reporting in this community for a long time and already knew of a few people, some who had called me and others I had learned about, who had been harmed by medical care. I called around and found some who were willing to share their stories.

Once I had my main elements, I decided to modify my series somewhat. I had initially thought I might break out quality data by payer, but did not find significant correlations there. I did, however, want to highlight the lack of transparency around quality data in the state, so I decided to add that story. In the end, I published three stories: one chronicling the experiences of patients who had been harmed, another using Medicare data on individual hospital quality, and a final story on data transparency.

In the week since the stories began running, I have gotten quite a bit of feedback. Most people are grateful and say my stories exposed important issues. Patient safety advocates in the state have been universally complimentary, saying the stories brought to light information even they didn't know. A woman at the state who helped on the stories said the articles were making their way around to policymakers, and she heard people talking about them. The stories were tweeted by a number of journalists and patient safety professionals. Our paper's editorial board wrote about the series, calling for more data transparency.

The main negative feedback came from people upset with my choice of lead in the main story, the one chronicling patients' experiences. I led with a recent lawsuit in which a surgeon cut a patient's nerve, and friends of that surgeon were upset that we had used his story. Some of them agreed we had treated him fairly but were still upset that he was named at all.

In all, it was a very satisfying experience to publish. I have since learned that larger papers in the state are looking at the data we used, presumably to see whether they, too, might run stories on patient safety. I am hoping the stories lead to more transparency around the data release. We'll see.