Report calls for regulations around dangerous pathogen research

Researcher working in fume hood

March 7, 2024 – In recent years, concerns have been raised about the dangers of research on risky pathogens—the kind that could trigger a pandemic. In February, a task force convened by the Bulletin of the Atomic Scientists released a report that makes recommendations on strategies for conducting such research, calling for stronger oversight. Marc Lipsitch, professor of epidemiology and director of the Center for Communicable Disease Dynamics at Harvard T.H. Chan School of Public Health, served on the task force. He discusses key takeaways from the new report.

What was the impetus behind this report?

There’s a growing realization in many quarters that while most biological research is completely safe or poses risks only to the small number of individuals involved in it, there’s a very small subset of biological research that poses risks beyond the individuals involved, because it carries the potential for an accident or a deliberate misuse that could lead to widespread transmission of the agent being studied.

The issue is quite polarized: some individuals suggest that scientists should be able to regulate themselves, with no need for outside oversight, while others have proposed, and in some cases enacted, bans on whole categories of research at the state level, for example in Florida.

The task force was by design a very international, cross-disciplinary group, including virologists, epidemiologists, biosafety experts, social scientists, a philosopher, and others. The idea was to find some kind of common ground on what is the scope of research that needs to be more carefully scrutinized and what that scrutiny might be.

What was amazing about the process was that, despite different backgrounds and in some cases different starting positions, we were able to come to agreement on a set of principles that, if taken seriously by those designing policies, really has some teeth.

What are some key takeaways from the report?

The most important thing is that this group acknowledges that there is a scope for special regulation of the types of research that risk a pandemic. This sort of research is a small subset, but a particularly risky subset. We also state that this is not a matter for science to self-regulate. It’s a matter for public consideration, and the public has a right to be a part of the process.

The second key takeaway is that the report gets specific about the types of research that are of top concern: research with known pandemic pathogens; research that could somehow create a pandemic pathogen; and research with pathogens of unknown risk.

A third takeaway has to do with equity, which has been lacking in the discussion in the past. It is an unfortunate but true fact that the benefits of scientific research come first, in most cases, to the developed world, and only later, if at all, to the rest of the world. That means that research that promises potential biomedical benefits at the risk of possibly starting a pandemic is just further disadvantaging groups that have already been disadvantaged, because they don’t have first crack at the benefits and are likely to face disproportionate risks.

Another key principle articulated in the report is that researchers should look as hard as possible for safer alternatives to risky research.

Was there anything particularly controversial or tricky as you and the other task force members worked on the report?

The particulars about the kinds of risks, how big they are, and what the benefits need to be—that did require some judgment. There was a lot of discussion in the group about how strongly to assert the need for a public health benefit to offset a public health risk. My preferred interpretation is that the scientific benefit alone is not good enough—that you need a clear path to a public health benefit before you should accept any risk to public health.

How do you foresee the report’s recommendations being implemented?

You would have to have some kind of local, national, or institutional implementation. But the principles outlined in the report are global and the enforcement or the incentives should be global. Journals would play an important role. For example, if a researcher did something that didn’t get proper scrutiny, a journal should not accept it for publication.

What’s happening in the U.S. regarding oversight of risky pathogen research?

At the moment, there is a panel within the Department of Health and Human Services that reviews some of this research, but their identities and deliberations are secret. Our task force would like to see greater transparency in the process. In the meantime, the federal government is considering updating the framework that governs this review process. I hope that the update comes out soon and is consistent with the kind of recommendations we’ve outlined in this report.

What are your thoughts on the level of risk the world faces right now from a lab leak that could start another pandemic?

I don’t know the magnitude of the risk. I just know that it’s our responsibility to minimize it.

Karen Feldscher

Photo: iStock/pkujiahe