Comments on the NSABB Meeting on Gain of Function

On May 5, 2015, the National Science Advisory Board for Biosecurity (NSABB) held a meeting to review the gain-of-function deliberative process and to solicit feedback on its draft framework for that process (published April 6).

As part of that meeting, I am presenting public comment on the ethics of the deliberative process. A copy of the handout I provided to the members of the NSABB—updated to correct a couple of typographical errors—is available here.

You can also watch my comments live via the webcast. I am not sure exactly when I’ll be speaking; the public comment sessions are planned for 2:00pm-2:30pm and again at 3:30pm-3:50pm. If you want to watch me give comment (or the rest of the meeting), the webcast is available here.

A Risk-Benefit Analysis is not a Death Sentence

As Marc Lipsitch states on the Cambridge Working Group site, the CWG reflects a consensus. My personal views do not reflect the views of the group. When you build a consensus, you often don’t end up with everything you wanted. When a group of very different people forms around a common issue, the outcomes that get devised are heavily moderated by the competing priorities and backgrounds of the participants. Sometimes that leads to stagnation.[1] Other times, it leads to a more reasonable and practical set of priorities. In the case of the Cambridge Working Group, in which I participated as a founding member last month, our Consensus Statement on the Creation of Potential Pandemic Pathogens (PPPs) was the product of deliberation on the types of steps the eighteen founding members could agree on. For those of you who are just arriving, PPP studies involve the creation of a novel pathogen that could, if released, cause a disease pandemic. In my line of work, PPP studies are a type of “gain of function” study, and are associated with dual-use research: scientific research that can be used to benefit or harm humanity. When it comes to PPP studies, the CWG stated one ultimate goal:

Experiments involving the creation of potential pandemic pathogens should be curtailed until there has been a quantitative, objective and credible assessment of the risks, potential benefits, and opportunities for risk mitigation, as well as comparison against safer experimental approaches.

And one proximate goal in the pursuit of that ultimate goal:

A modern version of the Asilomar process, which engaged scientists in proposing rules to manage research on recombinant DNA, could be a starting point to identify the best approaches to achieve the global public health goals of defeating pandemic disease and assuring the highest level of safety.

In short, we want to ask a question: what are the risks and benefits of PPP studies? To ask that question, we want to convene a meeting. And though we’ve no ability to stop them, we’d really like it if scientists could just, I don’t know, not make any new and improved strains of influenza before we have that meeting. Simple, right? Well, I thought so. Which is why I was surprised when a colleague said this:

Wait what?!

Hyperbole is Not Helping

NewProf is right: *if* we shut down (all) BSL-3/4 labs, there would be nowhere (safe) for people to work on dangerous pathogens like Ebola, or train new people to do the same. The only problem is that no-one—that I know of—is saying that.

First: the CWG statement says nothing about shutting down laboratories. As a consensus statement, it is necessarily limited by pragmatic considerations. The CWG calls for a risk assessment. It calls for collecting data. That data collection is focused on PPP studies, and primarily in the context of influenza research. Even if the CWG were to be looking at Ebola, PPP studies would (I really, really hope) be a very small subset of Ebola research. Of course, NewProf is not concerned only about individual research projects, but about whole labs:

That is, NewProf claims that a CWG-inspired risk assessment would lead to labs shutting down, which would in turn leave “no scientists trained to study/treat/find cures for Ebola.” But that is equally ludicrous. A risk assessment of a small set of experiments is unlikely to leave an entire field unable to do its work. In fact, that would be a really bad outcome, and the risk of that outcome would, and ought to, inform the risk-benefit analysis of research in the life sciences. Regulation that unduly limits the progress of genuinely (or even plausibly) beneficial research, without providing any additional benefit, would be bad regulation.
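
For readers wondering what a “quantitative, objective and credible” assessment might even look like at its simplest, here is a minimal, purely illustrative sketch in Python. Every parameter name and number below is a hypothetical placeholder of mine, not an estimate from the CWG, the NSABB, or anyone else, and a real assessment would also need a parallel estimate of the benefits (expected lives saved by the research, say) before any comparison could be made.

```python
# Purely illustrative: a crude expected-harm calculation for a set of PPP
# experiments. All inputs are hypothetical placeholders, not real estimates.

def expected_fatalities(labs, years, p_infection_per_lab_year,
                        p_pandemic_given_infection, deaths_given_pandemic):
    """Crude expected fatalities from laboratory-acquired infections that
    go on to seed a pandemic, summing expected infections across lab-years."""
    lab_years = labs * years
    expected_infections = lab_years * p_infection_per_lab_year
    expected_pandemics = expected_infections * p_pandemic_given_infection
    return expected_pandemics * deaths_given_pandemic

# Hypothetical example: 10 labs running the work for 10 years, a 0.2% chance
# of a laboratory-acquired infection per lab-year, a 10% chance that such an
# infection sparks a pandemic, and 10 million deaths if one occurs.
print(expected_fatalities(labs=10, years=10,
                          p_infection_per_lab_year=0.002,
                          p_pandemic_given_infection=0.1,
                          deaths_given_pandemic=10_000_000))
```

The point of the sketch is only the shape of the reasoning: what values belong in each factor, and how uncertain they are, is precisely what a deliberative, empirical risk assessment is supposed to pin down.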

Grind Your Axe on Your Own Time

What is most frustrating, however, is how mercenary the whole thing feels. If you are concerned about the Ebola virus, you should be concerned that the public health effort to stem the tide of the virus in West Africa is failing. That a combination of poverty, civil unrest, environmental degradation, failing healthcare, traditional practices, and a (historically justified) mistrust of western healthcare workers is again the perfect breeding ground for the Ebola virus. You shouldn’t be concerned about a risk-benefit analysis that has been advocated for a particular subset of scientific experiments, with a focus on influenza, that may or may not lead to some outcome in the future. Dual-use research and the Ebola virus, right now, have very little to do with each other. If there comes a time when researchers decide they want to augment the already fearsome pathology caused by the virus with, say, a new and improved transmission mechanism, we should definitely have a discussion about that. That, I think it is uncontroversial to say, would probably be a very bad idea.

A Personal View of Moving Forward

I’ve spent the last few days talking about Ebola, primarily on Twitter (and on other platforms whenever someone asks). I’ve not had a lot of time to talk about the CWG’s statement, or my views on the types of questions we need to ask in putting together a comprehensive picture of the types of risks and benefits posed by PPP studies. So here are a few thoughts, because it is apparently weighing on people’s minds quite heavily.

I don’t know how many high-containment labs are needed to study the things we need to study in order to improve public health. I know Richard Ebright, in the recent Congressional Subcommittee Hearing on the CDC Anthrax Lab “incident”, mentioned a figure of 50, but I don’t know the basis on which he made that claim. As such, I, personally, wouldn’t back such a number without more information.

I do know that the question of the risks and benefits of PPP studies, and of other dual-use research, has been a decade in the making. The purported benefits to health and welfare of gain-of-function research, time and again, fail to withstand scrutiny. Something needs to happen. The next step is an empirical, multi-disciplinary analysis of the benefits and risks of the research. It has to be empirical because we need to ground policy in rigorous evidence. It has to be multi-disciplinary because, first, the question itself can’t be answered by any one group; and second, the values into which we are inquiring cover more than one set of interests.

That, as I understand it, is what the CWG is moving towards. That’s certainly why I put my name to the Consensus Statement. I’m coming into that risk-assessment process looking for an answer, not presuming one. I’m not looking to undermine any single field of research wholesale. And frankly, I find the use of the current tragedy in West Africa as an argumentative tool pretty distasteful.


  1. The twists and turns of consensus-building are playing out on a grand scale at the current Meeting of Experts of the Biological and Toxin Weapons Convention in Geneva. My colleagues are participating in the meeting as academics and members of NGOs, and you can follow them at #BWCMX. And yes, I’m terribly sad not to be there. Next time, folks.  ↩