Flu scare

Publishing scientific research might help prevent the next pandemic, but there is legitimate fear that critical information could fall into the wrong hands. A moratorium on flu research is giving scientists and policymakers time to hash it out. But how long can it go on?

Popular culture has long fixated on the microbe that could lay waste to humankind. Just consider The Andromeda Strain, Contagion, even Stephen King's ode to a fictitious superbug he cheekily named Captain Trips. Consider our neurosis justified. Bubonic plague, tuberculosis, the Spanish flu of 1918, AIDS—all of them claimed millions of lives before running their courses or being controlled, to varying degrees, medically.

Of the above quartet of misery, experts believe the one most poised for a comeback is a highly contagious, potent form of influenza—a name that actually covers a broad array of viruses that enter through inhalation and initially attack the lungs. Many modern flu strains are related to the 1918 version, links that were discovered only after extensive research, including the genetic recreation of the 1918 pathogen from frozen and preserved tissue samples beginning in the late 1990s.

It was the kind of research that gave insight into how flu strains could mutate so quickly. (One theory behind the 1918 version's sudden demise after wreaking so much devastation was that it mutated to a nonlethal form.) The same branch of research concluded in 2005 that the 1918 flu started in birds before passing to humans. Parsing this animal-human interface could provide clues to stopping the next potential superflu, which already has a name: H5N1, also known as avian flu or bird flu.

This potential killer also has a number: 59 percent. According to the World Health Organization, nearly three-fifths of the people who have contracted H5N1 since 2003 have died from the virus, which was first reported in humans in Hong Kong in 1997, before a more serious outbreak occurred in Southeast Asia between 2003 and 2004. (It has since spread to Africa and Europe.) Some researchers argue that those mortality numbers are exaggerated because WHO counts only cases in which victims are sick enough to go to the hospital for treatment. Still, compare that figure to the worldwide mortality rate of the 1918 pandemic: it may have killed roughly 50 million people, but that was only about 10 percent of the number of people infected, according to a 2006 estimate.

H5N1's saving grace—and the only reason we're not running around masked up in public right now—is that the strain doesn't jump from birds to humans, or from humans to humans, easily. There have been just over 600 cases (and 359 deaths) since 2003. But given its lethality, and the chance it could turn into something far more transmissible, one might expect H5N1 research to be exploding, with labs parsing the virus's molecular components to understand how it spreads between animals and potentially to humans, and hoping to discover a vaccine that could head off a pandemic.
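For readers who want to check how those figures compare, here is a minimal arithmetic sketch in Python. It is illustrative only: the 608-case tally is an assumption chosen to match the "just over 600 cases" and 59 percent figures above, and the 1918 infection total is simply back-calculated from the article's 10 percent estimate.

    # Illustrative only: all figures come from the WHO counts and the
    # 2006 estimate cited in this article, not from an independent source.
    h5n1_cases = 608        # assumed exact tally behind "just over 600 cases"
    h5n1_deaths = 359       # deaths among those confirmed cases

    flu_1918_deaths = 50_000_000   # rough death toll cited above
    flu_1918_mortality = 0.10      # ~10 percent of those infected died

    print(f"H5N1 case fatality rate: {h5n1_deaths / h5n1_cases:.0%}")               # ~59%
    print(f"Implied 1918 infections: {flu_1918_deaths / flu_1918_mortality:,.0f}")  # ~500,000,000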

Instead, the research has come to a voluntary standstill. Thirty-nine of the world's top flu researchers issued a joint statement in the January 20, 2012, issue of Science announcing they were temporarily suspending all work involving "highly pathogenic . . . H5N1 viruses leading to the generation of viruses that are more transmissible in animals," a move applauded and supported by the U.S. government's top research funders. "Congratulations on the voluntary moratorium," Anthony Fauci, director of the National Institute of Allergy and Infectious Diseases, told hundreds of flu researchers at their July Centers of Excellence for Influenza Research and Surveillance conference, before letting them know the piggy bank was still shut. "But NIAID, NIH, and HHS [Health and Human Services] cannot go along with lifting the moratorium on studies . . . related to pathogenesis or transmission of H5N1 that is performed with U.S. government funds."

At its root, what has frozen the players in their tracks is the same emotion that surrounds the specter of a pandemic: fear. Not of discovery. Nor of knowledge. But of the potential for nefarious use of the information gained in such pursuits.

Science has been here before. The question becomes, Have we learned from the past? Johns Hopkins researcher Ruth Faden thinks we have, and that building a consensus among both the scientific and national security communities is the key to moving forward.

Since the days of Los Alamos, J. Robert Oppenheimer, and the race to build the bomb, researchers have been forced to confront the so-called dual-use research of concern conundrum: that their quest for knowledge could yield results that might end up in the hands of unscrupulous parties or enemies of the state. During the latter stages of World War II, the time line appeared so compressed, and the opponent's aggression and intent so obvious, that national security concerns demanded a full-scale effort to unlock the secrets of the nuclear bomb.

If the Los Alamos scientists had qualms about creating atomic weapons that might someday be turned on their makers, they kept such concerns to themselves—or were too caught up in the rush of discovery to overtly admit them—until it was too late. As Oppenheimer noted when the first Trinity bomb test turned the sky to fire, "A few [researchers] laughed, a few cried." Many broke into spontaneous dance. One, a Harvard physicist, summed up the mood when he shook Oppenheimer's hand and said, with no irony, "Now we're all sons of bitches."

At its heart, the dual-use question generally comes down to opposing agendas: national security interests that want to tightly control information versus scientists who believe investigation and achievement depend on open dissemination, publication, and replication of results. While the two sides can be united, as they were in World War II, there is often conflict. This is especially true for emerging science and technologies, and biotechnology has been a particularly gray area. The fear of unconventional weapons blasted onto the front page in 1995, when sarin nerve gas attacks killed 13 people in the Tokyo subway system. Concerns about bioweapons in particular escalated in the month after 9/11, when anthrax-laced letters caused five deaths and infected nearly two dozen people. Talk of genetically manipulated, "weaponized" anthrax began filling the airwaves, and it didn't take long for both defense analysts and scientists to wonder what other pathogens could be manipulated.

The conversation quickly turned to research on common infectious diseases, the type of work that has saved lives by the millions. Smallpox, polio, TB, measles—each has been either eradicated or greatly controlled, especially in developed countries, because research into its cause, prevention, and potential cure continued unabated.

The same is true of the flu. Research has shown that most modern-day strains are slight variations on strains the public has already experienced, so a good deal of the human population has some immunity. This, along with annual flu shots for common seasonal strains, limits yearly cases of influenza. But avian flu, to which humans have had relatively little exposure, has been lurking in the background for the past decade, and H5N1 researchers faced a particular dual-use challenge. They felt compelled to determine how the flu could move from animals to humans, mainly by manipulating different genes within H5N1's genome. The idea was to see whether a mutated form of H5N1 that transmitted well between animals in a lab setting resembled something already occurring in nature. If so, public health officials could test birds for the transmissible mutations and attempt to stop a pandemic before the virus entered the human population—for example, by killing off infected farm-raised poultry before the disease could spread to human workers.

Yet the very nature of such experiments raised security implications—so much so that a nearly decade-old government committee stepped in for the first time last December to voice concerns that effectively halted publication of certain H5N1 research. When virologists Ron Fouchier of Erasmus Medical Center in the Netherlands and Yoshihiro Kawaoka of the University of Wisconsin independently discovered they could inoculate ferrets—which sneeze like humans—with mutated forms of H5N1 and create a form of the virus potentially transmissible through the air, the U.S. National Science Advisory Board for Biosecurity voiced objections. In an unprecedented move, the NSABB, an independent body of science and biosecurity experts that consults with federal agencies including the Department of Health and Human Services, strongly urged Science and Nature to delay publication of the papers. The NSABB wanted the authors to alter their respective papers' language to "explain better the goals and potential public health benefits of the research" and, notably, to exclude the methodology sections that are vital for the replication of results. HHS agreed with the suggestions, with an NSABB spokesperson noting, "The recommendations were that the papers not be published in full, that the papers be modified and the results be redacted so that someone with malevolent intent could not exactly replicate the results."

There was plenty of fear of bioterrorism surrounding Fouchier's research, much of it generated by the researcher himself. In November 2011, he told Science that the mutated form of H5N1 that he injected into ferrets was "probably one of the most dangerous viruses you can make," and NSABB members agreed. Debate erupted throughout the scientific and national security communities about whether the experiments should have been conducted in the first place. In the midst of the tumult, Fouchier and Kawaoka, backed by 37 of their colleagues, gave themselves the equivalent of a time-out: the research moratorium they agreed to in January 2012.

It didn't take long for Johns Hopkins' Ruth Faden to find herself in the middle of the fray. The public first heard from Faden on the subject on NPR's nationally syndicated Diane Rehm Show last December, where she joined Anthony Fauci, Science Editor-in-Chief Bruce Alberts, and NSABB member and infectious disease policy expert Michael Osterholm. A bioethicist and head of the Johns Hopkins Berman Institute of Bioethics, Faden deftly laid out some of the perspectives of the scientific and defense communities. "We are not a zero-risk culture," she told Rehm. "We have to assume some level of risk when we're pursuing something of great [scientific] importance. The question is, Do we have the mechanisms in place to assure that we are properly assessing the benefits and risk and that we have the plans to manage the risk?"

For Faden, looking at the potential effect of an experiment before the first gene is spliced is vital for the proper handling of dual-use research of concern. That wasn't the case with the Fouchier and Kawaoka studies. In an editorial published in Science just weeks after the NPR appearance, Faden and Ruth Karron, director of the Bloomberg School of Public Health's Center for Immunization Research and the Johns Hopkins Vaccine Initiative, took the NSABB to task for having no mechanism to review controversial research while it is still in its formative stages. They also noted that, in the case of Fouchier and Kawaoka, the NSABB did not accompany its recommendations to hold back full details of the scientists' work with a proposal as to who in the scientific and public health communities should have full access to the science in order to weigh its merits and concerns.

By waiting until the ferrets were sneezing across their cages, Faden and Karron essentially argued, the NSABB was put in the difficult reactive position of having to quash information and media rumors of a "superflu" just a security breach away from becoming a terrorist weapon.

"This question [of what to do with information in papers the NSABB finds concerning] should not have caught the NSABB or NIH by surprise," the authors wrote in Science. "According to the chair of the NSABB, that committee was not given the job of developing a system for distributing sensitive information. But if that is the case, then this remit should have been given to some other identified entity when the NSABB was established."

If the authors sound annoyed, their frustration is understandable. Faden, one of the most respected bioethicists in the country, was part of the National Academy of Sciences committee convened in the wake of the 2001 anthrax attacks. The committee's charge was to balance dual-use research concerns regarding possible bioterrorism with continuing bioscience advancement.

The committee issued its findings in 2004 in what became known as the Fink Report (named after Massachusetts Institute of Technology genetics professor Gerald R. Fink, who chaired the committee), published under the ominous title Biotechnology Research in an Age of Terrorism. The report recommended establishment of the NSABB and also called for numerous other steps, including "harmonized international oversight."

"One of the things we had to acknowledge was that even if the U.S. had a perfect system—and right now we have a nonsystem—but even if it were perfect it really would not make for a secure world if other countries didn't have systems and if the systems didn't work together," says Faden. "This is about collective global action: One of the roles of the NSABB we envisioned would be as the lead entity on the U.S. side to be working with counterparts in other countries to create a global governing structure at the intersection of science and biodefense."

That has not happened. As Faden and Karron note in their essay, "in the eight years since [the Fink Report], no coordinated system for oversight of dual-use research, either national or international, has been implemented."

Faden is careful not to take sides on either the publishing holdup or the investigators' moratorium—as an ethicist without access to the NSABB's inner workings, she says, "I don't have the expertise on my own to make that call. This is where science and technical experts in this area have to be thinking this through with national security people to make the judgment." She notes, however, that scientists have used moratoriums profitably in the past, particularly during the pioneering development of recombinant DNA technology. In the early 1970s, researchers began introducing DNA from different sources—viruses, plants, or bacteria—into host cells to see what grew. The lack of institutional safeguards at the time potentially exposed lab workers—and perhaps the general public, if a genetically modified virus escaped the lab—to unknown hazards. The concerns grew great enough that a National Academy of Sciences committee in 1974 called for a halt to certain recombinant DNA experiments until a comprehensive framework for safely conducting such work could be established. That framework emerged from the landmark Asilomar Conference seven months later, and the moratorium was lifted.

The success of Asilomar isn't lost on flu investigators who agree with the current temporary work stoppage. Andrew Pekosz, an associate professor of molecular microbiology and immunology in the Bloomberg School, was not a signatory to the moratorium because his flu lab does not work with H5N1, but he nonetheless supports the move as a way of restoring public confidence in both the researchers and the research. "When [Fouchier's and Kawaoka's] research came out, there were a lot of people who didn't realize this research was being done under strict containment conditions. That unknown contributes to fear. I think this speaks to the fact that we as scientists need to do a better job of communicating not only what we are doing but how we're doing it. These issues of safety and biocontainment are important to us.

"That's why the moratorium was a good idea," Pekosz continues. "We need to send the message out to the general public that there are issues people want answers about, and rather than just going on like we are, let's all take a break and discuss this and not have pressure of continuing the research and having somebody censor the material before it's released."

Interestingly, the research moratorium may have played a role in resolving the NSABB's publishing dilemma. When Science and Nature announced they were putting a temporary hold on the Fouchier and Kawaoka papers, the move created deep divides, including some surprising responses from longtime free-speech advocates. "We nearly always champion unfettered scientific research and open publication of the results," wrote the New York Times Editorial Board. "[But] in this case it looks like the research should never have been undertaken because the potential harm [from accidental or intentional release of the virus] is catastrophic and the potential benefits from studying the virus so speculative."

Former Johns Hopkins infectious disease expert Thomas Inglesby, now at University of Pittsburgh Medical Center, bluntly agreed. He told CNN that, in dealing with the mutated virus, "we are playing with fire. . . . It could endanger the lives of hundreds of millions of persons."

Mount Sinai microbiologist Peter Palese offered a countervailing view, one worth noting because his lab helped reconstruct the 1918 influenza virus in 2005 with NSABB's approval. "During our discussions with NSABB, we explained the importance of bringing such a deadly pathogen back to life," Palese wrote in Nature. "Although these experiments may seem dangerously foolhardy, they are actually the exact opposite. They gave us the opportunity to make the world safer, allowing us to learn what makes the virus dangerous and how it can be disabled." Palese argued that publishing the work allowed other researchers to show that the 1918 flu, should it return, can be combated using "seasonal flu vaccines [and] common flu drugs"—vital information for public health workers and emergency planners.

Both of the controversial flu research papers were eventually published. Upon further review, the NSABB cleared Kawaoka's paper, and it appeared in full in the May issue of Nature. The NSABB recommended "further scientific clarification" of the Fouchier research, and that paper appeared, with WHO and NIH support, in the June issue of Science with a clarified methodology section.

For their part, Science's editors, in a magazine issue devoted entirely to H5N1, echoed Faden and Karron's concerns that the NSABB needed strengthening both domestically and internationally. The editors added that they agreed with the NSABB "mechanism," calling the eight-month delay in publishing "a 'stress test' of the systems that had been established to enable the biological sciences to deal with 'dual-use research of concern.'"

"This is about recognizing and ensuring the public's continued trust in science."
Ruth Faden

While the publishing moratorium has ended, the research moratorium goes on. Originally scheduled to last 60 days, it is now on an indefinite extension. NIAID's Anthony Fauci has announced a conference, scheduled for this month, to bring together scientists, biosafety experts, NSABB personnel, and the public to further discuss the risks and rewards of continued H5N1 research. International experts have also been invited, and their cooperation is essential to future research: Indonesian scientists, for instance, have been embroiled in an ongoing dispute, threatening to withhold vital local strains of H5N1 from global researchers unless they are given access to the data obtained from those strains.

At this point, some scientists appreciate that their colleagues are willing to tread lightly for the time being. The September/October issue of mBio, the journal of the American Society for Microbiology, was dedicated to the topic, with an editorial calling the research pause "a historic moment for science." One contributor, Stanford infectious disease expert Stanley Falkow, supported continuing the moratorium, arguing that it should have started "once the first ferret sneezed." He noted that the cloning of certain genes was held up for several years while safety and other concerns were addressed, and he suggested that the same should hold true for H5N1 research.

On the other hand, in that same issue, Fouchier and Kawaoka, along with moratorium signatory Adolfo García-Sastre, called for the immediate resumption of H5N1 research: "Now we know it is possible that these viruses could adapt to mammals, but without more data, we cannot fully assess the risk or implement appropriate containment measures," they wrote. "To contribute meaningfully to pandemic preparedness, we need to conduct more experiments . . . in a timely manner."

Perhaps there's an awareness that, going forward, what is needed is both standards of practice and, most important, transparency of process for the real stakeholders in this research. As Ruth Faden notes, "This is about recognizing and ensuring the public's continued trust in science. Being a scientist is an awesome privilege, and scientists have an individual moral obligation to reduce the likelihood that their work will bring about bad consequences for the world."

And getting there, says Faden, means opening dual-use research concerns to all and sharing the burden of their proper handling among scientists and nonscientists alike. Karron agrees that building consensus is key. "My thought is in this day and age it's important to broadly engage individuals outside your area of expertise when these questions arise," Karron says. "[NIAID's] Tony Fauci said, 'You must engage civil society.' We can learn from each other. It is important to listen, to hear about potential opportunities or threats you may not have considered."

But someone else has.

Mat Edelson is a freelance writer based in Baltimore.