UW Conversation on the Controversial H5N1 Papers
Those of you who follow science news may be familiar with the controversy surrounding two recent papers on H5N1, or avian flu. Two research groups, one in the US (led by Yoshihiro Kawaoka) and one in the Netherlands (led by Ron Fouchier), showed that with a small number of mutations, the virus could become transmissible between mammals.
This research became controversial when scientists raised concerns that it might be too dangerous to publish, because it could potentially be misused to create bioweapons or accidentally trigger a global pandemic as other researchers try to mimic the work. After initially recommending that the papers be published only in redacted form, however, the US National Science Advisory Board for Biosecurity (NSABB) reversed its decision and recommended publication in full.
This controversy has received a lot of coverage in the media, and I won’t rehash it here (though if you are interested, you can see how it progressed from this list of articles on ScienceInsider). The controversy over these papers is particularly close to home in Madison, however, because Kawaoka is a professor here at UW-Madison, and his research was conducted in a lab at the University Research Park.
Yesterday, the Wisconsin Institute for Discovery hosted a panel discussion about Kawaoka’s work. There were a couple of interesting points that came up during this discussion which have been largely absent from the media coverage, and I thought I might highlight a few of them here.
Yesterday’s panel included three speakers, each of whom gave a short presentation before the floor was opened to questions from the audience. The speakers were:
- Yoshihiro Kawaoka, principal investigator for one of the two controversial papers, and professor at UW-Madison
- William Mellon, an associate dean for research at UW-Madison and professor in the school of pharmacy
- Pilar Ossorio, a professor of law and bioethics at the university’s law school
Kawaoka focused primarily on the motivation for doing these experiments and why the NSABB changed its stance on the publication of his work.(*) All in all, I thought his presentation was informative, but for the most part he only rehashed material that has been covered elsewhere. See, for example, his comment in Nature about why the flu work is urgent, Ed Yong’s explainer about the risks and benefits of publishing the mutant flu studies, Ed Yong’s post about why the NSABB changed its mind and recommended publication, and (again) Ed Yong’s storify of the H5N1 conference at the Royal Society in April, at which I think Kawaoka gave a pretty similar presentation.
Mellon gave a similarly informative talk, though he focused on the federal and university rules and guidelines about dual-use research, a subject that I know less about. He talked a lot about the NSABB’s new recommendations about how to assess and manage dual-use research of concern (which you can read in the surprisingly short document found here).
He also gave an overview of the process for assessing this type of research at the university, starting from an initial evaluation by the researcher during the planning stage, through institutional review of the risks and risk management strategies, and up to the researcher’s responsibility to publish the research in a responsible manner.
I think his main point here was that the biosafety and biosecurity concerns are considered at EVERY step of the process. From some of the mass media coverage it might be easy to conclude that Kawaoka and Fouchier did this research without thinking through any of this stuff, but Mellon’s discussion made it clear that (for Kawaoka’s work, at least) that is not true.
To me, however, the most interesting of the three presentations was Pilar Ossorio’s discussion of the ethical aspects of conducting and regulating this type of research.
Thinking in terms of trade-offs
Ossorio was particularly critical of the dual-use ‘framing’ for this work. She claimed that thinking about research as “dual-use” implicitly forces us to think of it as an either-or decision: either we keep the research secret and stay safe, or we make it available and make ourselves vulnerable to the research being misused. We begin to think in terms of direct trade-offs between knowledge dissemination and safety and security.
But, Ossorio argued, the flip-side to this is that in many senses sharing the research and promoting further work makes us more secure, not less. Knowing how to predict and respond to pandemics makes us more secure; if we don’t do the research to help us know how to respond, we make ourselves less safe.
She also commented that thinking strictly about safety and security trade-offs also “illegitimately de-emphasizes a number of other important values” including justice, civil rights, and the legitimacy of scientific institutions. She mentioned, for example, that our focus on biosecurity shifts our public health spending toward the study of organisms that are “of high importance to us, but not necessarily the ones that have the biggest health impact [worldwide],” and perhaps this is not a particularly just way for us to direct our research plans.
Ossorio ultimately advocated for an alternative framing of the problem, which she referred to as optimization. Instead of thinking about trade-offs between two options, we should think about all of the relevant values, and find a way to balance them for the optimal outcome.
I thought this was a particularly interesting point, because much of the media coverage of Kawaoka and Fouchier’s work has focused on the direct trade-offs between knowledge dissemination and safety/security issues. So it was rather thought-provoking to be asked to think about the other types of issues we might need to consider.
The audience asked several good questions after the formal presentations. Three that I found particularly interesting were the following (paraphrased for length and clarity):
1) Why have we only heard about the US’s role in assessing this research (through the NSABB), and as we go forward, how do we get the rest of the world involved?
I’m glad someone asked this question, because I think this is a really important issue. When potential global pandemics are at issue, decisions about what should be done and what should be published really need international collaboration and consensus.
Kawaoka responded to this question by pointing out that the US is currently the only country even prepared to address these questions; as he noted, the “NSABB is the only committee of its kind.” However, he said that the EU is currently organizing an equivalent group, and other countries may follow, so hopefully this will become an international conversation before too long.
It is not entirely true that the US was the only country to “decide” whether or not Kawaoka and Fouchier’s papers should be published, by the way; the World Health Organization (WHO) also recommended, in late February, that the papers be published in full. However, I don’t think that the WHO could have forced the US to allow publication (of Kawaoka’s paper, at least) if the NSABB had not reversed its stance.
2) Should risk assessment be incorporated into the grant review process, and if so, how?
This one has received some media attention, because the US’s new guidelines for assessing dual-use research, released in late March, state that proposed research on specific toxins and viruses needs to undergo a risk assessment before funding. In many respects, this seems reasonable; if something is obviously too dangerous to pursue, it makes sense to cut it off as early as possible.
However, Mellon and Ossorio both cautioned that incorporating extensive risk assessment in the grant review stage runs the risk of hampering research. Mellon pointed out that researchers like Kawaoka are already heavily scrutinized, and that this scrutiny takes considerable time and energy away from their other activities. As he said, “I don’t know how [Kawaoka] continues to do research, because he is scrutinized very heavily.”
Ossorio brought up a slightly different, but related, concern: that researchers would be discouraged from even planning or developing research which might be subject to this sort of strict review. As she said, “the incremental gain in safety is in no way balanced by the cost… not just monetarily, but also in terms of discouraging people from doing certain science or going into certain types of research.”
So, there’s a lot of food for thought here – it seems like some sort of early review is necessary, though I got the impression that Mellon thinks scientists and institutional review boards are already doing a pretty good job of that. But at some point, we run the risk of making this into too great a burden, and we might slow science down or prevent it from going forward. It’s unclear where that point is, but it’s certainly something we need to keep in mind.
(I think the NSABB recognizes this, by the way – see for example the last paragraph on page 5 of their March recommendations. This aspect just hasn’t gotten a lot of media attention).
3) If there is research which we decide needs to be restricted, are universities (and especially public universities) the right place for this research to be carried out?
Okay, this was my question, though I had to have a friend ask it since I’d lost my voice. The reason I wanted to ask it is that in my mind universities, especially public universities, have a duty to the public to conduct research that benefits (and is communicated to) broad audiences. To me, it seems like highly restricted or classified research belongs in institutions designated for that purpose, like national labs and such.
Mellon agreed that this is a troubling problem. Particularly with the new risk assessment regulations, funding agencies have the ability to classify research or remove funding if it is deemed too risky, which throws a wrench in the works for university labs. But he also pointed out that we don’t really have a sense for whether the research would move forward as quickly if it were classified; it’s possible that advances are much slower when you limit the core group of people who can be involved.
“The reason we have been leaders” in this field, he said, “is because we are open about our research” and make it available to as broad a community of researchers as possible.
Ossorio also mentioned that faculty in a university setting need to be concerned about their graduate students and postdocs, who might spend years working on research which ultimately can’t be published or which they can’t talk about when they do job interviews. This could be a serious career impediment for many students, and is something that faculty members need to keep in mind.
I’ll end this post there, because this is getting long, but this is really just a sampling of the many diverse issues which came up during this panel discussion. Overall, I found the discussion quite thought-provoking, and I’m glad I had the chance to attend.
And I think that even though the two H5N1 papers in question here have been cleared for publication, they raise a lot of issues about oversight and research ethics which we aren’t fully prepared to address. It will be really interesting to see how the public conversation on these issues evolves in the coming years.
(*) When Kawaoka was talking about why the NSABB changed its stance on publication, he spent a lot of time discussing the biosafety and biosecurity precautions in place in his lab, complete with pictures. Now I understand Ed Yong’s tweet from early April saying that Kawaoka’s lab “looks like a submarine” – it really does! Airtight doors all over the place…