Science, reporting, & communication

Last night, I posted about a basic research paper that has gotten some press attention. As is often the case, much of the nuance and context of the original work is missing from the press coverage or buried at the end of the stories.

I am a self-proclaimed protein science junkie, and my career has centered on proteins in immunity. However, while doing my PhD in Chemistry, I also learned a bit about toxicology. My uni had a good program in the field, and part of my PhD supervisor’s lab worked on molecular and cellular toxicology – which is really biochemistry and cell biology with a dash of pharmacology. I picked up enough to be skeptical of claims as broad as “Popcorn linked to Alzheimer’s”.

Another element added to my skepticism: my PhD mentor, well-versed in toxicology, was a popcorn fanatic. So our lab got the rundown when manufacturers announced they would be changing their microwave popcorn formulas due to safety concerns regarding diacetyl. (Note that this announcement came very shortly after the first account of a consumer experiencing adverse effects under rather rare circumstances.) So my mind jumped to diacetyl when I first heard about the current story from Jason Goldman on Twitter:

I clicked the link and found this example of how to make a hash of science reporting:

MINNEAPOLIS, Minnesota (KTLA) — Next time you’re at the movies, you may want to think twice before asking for extra butter on your popcorn. Scientists at the University of Minnesota have found that an ingredient in fake butter, known as diacetyl, may trigger Alzheimer’s disease. In addition to microwave and movie popcorn, diacetyl is also used in margarine, snack foods, candy, baked goods, pet foods and other products such as beer and chardonnay wine.

The study also found a link between the flavorant and lung damage in popcorn factory workers.

What exactly did KTLA do wrong?

  • Not only was there no link or citation for the paper, the piece didn’t even name the journal where the work was published or the lab that did it. I suppose we should be grateful that they mentioned the institution, which allowed me to find the paper via Google Scholar.
  • Only one sentence out of four actually addressed the current study, and that sentence was a substantial generalization and overstatement of the work done.
  • The last sentence is misleading – the lung-damage findings are real, but they come from medical research done by other groups more than five years ago, not from this study.
  • There is no qualification of dose – as the paper’s abstract notes, diacetyl is proposed as a potential hazard for industry workers who are chronically exposed to very high levels, not for the typical consumer.

The original press release is pretty reasonable. And other news outlets did a better job of covering the story, for example, this CBS health story. My major qualms with most media coverage are the overly broad headlines (I realize crafting statements that are both concise and accurate isn’t easy) and the fact that every lede I saw opened with the consumer angle when the research is about industry exposure.

Don’t get me wrong. Understanding the effects of industry exposure levels of chemicals is necessary and important. Such studies inform corporate and regulatory policy. But, as seen here, the media* can (and often does) misconstrue study findings, sometimes giving the impression of data that simply isn’t there. This adds to the challenge scientists face of communicating research to the public and to policy makers. It seems at times we are expected to deal in absolutes, but in science, there are subtleties and unknowns.

It reminds me of an admonishment Captain Kirk laid before Spock one time: “Insufficient data is not sufficient, Mr. Spock! You’re the science officer. You’re supposed to have sufficient data all the time.” In reality, scientists don’t have sufficient data all the time. Data are sometimes inconsistent, sometimes nonexistent. We keep pushing forward – combing the literature, designing the next experiment, looking for that next bit of data, adjusting our models – to come up with the most complete and logical answer within our power and ability. Usually it leaves us wanting more – and in some ways, part of the beauty of science is that it’s never really done.

* Clarification: Here I am referring mainly to mainstream media health and science coverage, not science journalism. It’s been noted in many conversations that bad science coverage is often the result of a journalist covering topics all over the map who has maybe 15 or 20 minutes to digest a press release and write something up. There is a lot of good science journalism out there. The case study here is simply an example of why scientists should pay attention to how the media is reporting research.

Related post: Check out Doctor Zen’s thoughts on food & trust at Neurodojo.
