Shortly before Thanksgiving science reporters and bloggers began buzzing about a newly created, genetically modified version of the deadly bird flu that could easily be transmitted between ferrets, which closely mimic the human response to flu.

"Locked up in the bowels of the medical faculty building here (Rotterdam, the Netherlands) and accessible to only a handful of scientists lies a man-made flu virus that could change world history if it were ever set free," is the lead to a Nov. 23 Science magazine blog entry, which goes on to note that the "scientists believe it's likely that the pathogen, if it emerged in nature or were released, would trigger an influenza pandemic, quite possibly with many millions of deaths."

Although the bird flu has been decimating poultry flocks overseas since the 1990s, it rarely threatens humans. That's because the virus, designated H5N1, is generally passed to people through contact with infected birds, and seldom jumps between humans. Indeed, World Health Organization figures indicate fewer than 600 people have suffered from confirmed cases of the H5N1 virus since 2003.

But of those confirmed victims, an eye-opening 59 percent have died.

While the early reports on this lab-engineered bird flu focused on the work of Ron Fouchier — the Dutch virologist who first broke the news of his discovery at a flu conference in Malta this past September — it soon was confirmed that noted University of Wisconsin-Madison researcher Yoshihiro Kawaoka had also developed an H5N1 strain that was highly transmissible between ferrets.

Although Kawaoka — whose lab is housed at UW-Madison's Influenza Research Institute located within University Research Park on Madison's west side — shied from media contact, additional details emerged and mainstream media outlets across the world started jumping on a story that read like a screenplay for a Hollywood bioterror blockbuster.

Fears were twofold. Kawaoka and Fouchier were close to publishing their findings in academic journals, and biosecurity experts sounded the alarm that if the research details were made public, rogue scientists or terrorists would have the recipe for some sort of "bioweapon." Others worried that if these agents were stolen or somehow escaped from a lab, millions could die in a devastating pandemic.

"It's interesting that this research became a concern, because from my perspective I'm not worried about it," Paul Umbeck, the director of UW-Madison's environment, health and safety department, says after outlining for a reporter a lengthy list of federal and institutional regulations and safeguards in place to help oversee such potentially dangerous experiments. "But cutting-edge is always going to be somewhat controversial to somebody."

Or, in this case, seemingly to most everybody. 

On Dec. 20, a committee that advises the federal government on biosecurity issues voted to recommend that the details of Kawaoka's and Fouchier's research not be published. It was the first time the 23-member National Science Advisory Board for Biosecurity (NSABB) had made such a recommendation. And the vote by the board — which comprises scientists, security experts and public health officials who freely admit they almost never reach a consensus — was unanimous.

The details of this bird flu research, the argument goes, are too perilous for the public to possess. 

"Scientists bristle when they hear something like that," says Vincent Racaniello, a professor of microbiology and immunology at the College of Physicians and Surgeons of Columbia University in New York, and an outspoken critic of the advisory board's recommendation to censor the bird flu research. "But I've been told it's naïve to think science can go the way it always has because we're in a different era now."

A Jan. 7 New York Times editorial then warned of "An Engineered Doomsday" and noted that the bird flu "research should never have been undertaken because the potential harm is so catastrophic and the potential benefits from studying the virus so speculative."

Multiple messages left with Kawaoka seeking comment in the past several weeks went unanswered, as did emails sent to several of his influenza research colleagues at UW-Madison.

Michael Imperiale, a professor of microbiology and immunology at the University of Michigan who sits on the biosecurity advisory board, says that, despite his vote against publishing the details, he disagrees with those who contend this bird flu research should never have been undertaken.

"These guys were asking some legitimate scientific questions," Imperiale says of Kawaoka and Fouchier seeking to learn whether the H5N1 virus could become more easily transmissible between mammals.

In late December, the researchers and scientific journals agreed that when the findings are published they will be devoid of critical details — as long as a mechanism can be put in place to give full access to other legitimate researchers. On Jan. 20, Kawaoka, Fouchier and 37 others announced they had agreed to stop their influenza research for 60 days — a timeout, of sorts — to allow for more discussion about this so-called dual-use research that has the potential to be used for good or bad.

"After World War II, the physics community realized, wait a minute, all this stuff about nuclear research, that's potentially dangerous," says Imperiale. "I think the life science community has to have, for the lack of a better term, a little bit of an awakening."


Kawaoka for years has conducted a range of groundbreaking research designed to shed light on the basic workings of potentially dangerous agents, particularly influenza viruses.

In an effort to bolster his research — and to keep Kawaoka from accepting an attractive recruitment offer from the University of Pittsburgh — UW-Madison in 2006 announced plans to launch a new Influenza Research Institute. In January 2009, Kawaoka and about 20 other scientists associated with his research moved into a $12.5 million facility, which includes high-level biosafety lab space, at University Research Park. 

In December 2008, a UW-Madison team led by Kawaoka and Tokiko Watanabe isolated the three genes that made the 1918 Spanish flu the most deadly influenza pandemic in history. In July 2009, Kawaoka and an international team of researchers provided a detailed portrait of that year's pandemic H1N1 virus and its pathogenic qualities. In November 2009, he landed a five-year, $9.5 million grant from the Bill and Melinda Gates Foundation to identify virus mutations that would serve as an early warning of potential pandemic flu viruses. And in February 2010, Kawaoka authored a study that warned of the potential for the bird flu to exchange genetic materials with human seasonal flu viruses to form a new strain that could be both highly contagious and deadly.

At FluGen, a company he helped found that's located nearby in the University Research Park, Kawaoka and others have been working on ways to improve the production and effectiveness of vaccines to combat a range of flu strains.

These efforts, and many more, have generally been hailed as essential for understanding the transmission of H5N1 and other flu viruses, aiding global influenza surveillance and furthering the development of vaccines and other therapies to counter potential pandemics. Indeed, the National Institutes of Health funded the most recent work that's coming under fire as part of its pandemic preparedness efforts.

Putting a stop to, or even censoring, such important research is "irresponsible," Kawaoka argues in a commentary published Jan. 25 by the journal Nature.

"I argue that we should pursue transmission studies of highly pathogenic avian influenza viruses with urgency," he writes. Kawaoka later adds that "because H5N1 mutations that confer transmissibility in mammals may emerge in nature, I believe that it would be irresponsible not to study the underlying mechanisms."

It's clear Kawaoka believes that the benefits of conducting such research and fully publishing the results outweigh potential risks. And there is no shortage of those who agree.

"We have this virus that we think might be dangerous to people," says Racaniello, the virologist from Columbia University. "So if we know this is possible, don't we have to work at it? The way you solve problems is to circulate information in science. You don't hold back. Stopping the work doesn't make any sense."


Racaniello argues that the NSABB and the biosecurity experts who contend the risks of publishing this bird flu research outweigh the benefits are standing on weak scientific ground, for two main reasons. He's also frustrated with how the risks are being framed in most media reports.

First, Racaniello believes it's intellectually dishonest to contend that only about 600 people have been infected with the H5N1 virus in recent years, or that more than half of those infected have died. He notes the World Health Organization figures count only the roughly 600 confirmed cases among people sick enough to be admitted to a hospital. And while it's true that 59 percent of those patients have died, he says a recent study of rural Thai villagers indicates 9 percent have antibodies against H5N1 strains — evidence that many milder infections go uncounted.

"Every press article you read, in one of the first sentences, notes a bird flu fatality rate of at least 50 percent," says Racaniello. "But if 9 or 10 percent of the rural Asian population has been infected, and we've only seen a handful of bird flu deaths, it would dramatically alter our view of how deadly this virus is."

Adds Racaniello: "Let's do some more studies about what the real mortality rate might be. And if it turns out it's, say, 0.3 percent, would we be putting as many resources into this virus as we are? I don't think that's been adequately discussed."

Imperiale, the Michigan professor and NSABB member, admits it's not clear what the bird flu mortality rate really is, but adds: "Even if the rate is one-tenth of the (World Health Organization's) figures, that's still huge. That's why it makes sense to say, 'Let's stop this research from being made publicly available.' Maybe a year from now we say, 'Whoa, maybe the case fatality rate really is closer to seasonal flu.' But it's better to be cautious now, because once that information is out there, you can't pull it back."

Racaniello also is frustrated with the assumption that if this genetically modified bird flu can be passed from ferret to ferret through the air, it will also pass easily between humans. On a side note, it's been widely reported that the bird flu strain Fouchier's lab is working with is both highly transmissible between ferrets and highly deadly, killing 75 percent of the infected animals. In his Jan. 25 Nature commentary, Kawaoka points out that while the engineered H5N1 virus his lab created passed easily from ferret to ferret, none of the infected animals died, and current vaccines and antiviral therapies were effective against it.

"Ferrets are the best animal model we have to study the flu," agrees Racaniello. "They display similar symptoms and immune responses. But ferrets can't be used to determine whether an influenza virus is a threat in humans. It's not predictive."

As for those who argue the bird flu virus could accidentally escape or be stolen from the lab, biosafety personnel on the UW-Madison campus contend that's highly unlikely.

Jim Turk, the university's biosafety officer — who is charged with scrutinizing proposed experiments to make sure they are safe and appropriate, and then ensuring that laboratory practices are in compliance with local, state and federal guidelines — notes there are seven UW-Madison lead investigators, plus staff, who are cleared to work with what federal regulatory officials call select agents. These select agents include roughly 80 viruses, bacteria, fungi, toxins and the like that have the potential to cause substantial harm to human, animal or plant health.

Those working in labs using these select agents must undergo an FBI background check, says Turk, adding, "The security and oversight for all select agents is over the top. Pretty much any select agent lab you would go to is more secure than a bank vault."

Turk explains that, at UW-Madison, researchers must clear several identity-verification barriers before they can enter such labs. He adds that surveillance cameras monitored by the police are also in place.

Kawaoka also is one of 17 UW-Madison lead investigators who work in labs classified as Biosafety Level 3 — the second highest of four levels nationally. These BSL-3 labs contain agents that have the potential for aerosol transmission or for disease that may have serious or lethal consequences. Researchers wear protective clothing and masks, when necessary, and work in self-contained lab areas that have their own ventilation systems, among other safety measures.

"We know we've done everything we can and we know it's unbelievably safe," says Turk. "But we also know that once somebody hears about certain research, it's easier and more fun to believe something scary could happen."

Nonetheless, it's clear incidents can, and do, happen.

Less than two years ago, in May 2010, UW-Madison revoked professor Gary Splitter's laboratory privileges through 2013 for unauthorized experiments. Splitter and his staff had to pass FBI background checks and fill out the usual abundance of regulatory paperwork — but that didn't guarantee safety.

The university found that between 2004 and 2007, graduate students and a postdoctoral employee generated antibiotic-resistant strains of Brucella, a highly regulated bacterium. The studies weren't approved by the university or federal regulators as required, and at least one member of Splitter's staff came down with brucellosis.

Umbeck, who was named director of UW-Madison's environment, health and safety department last summer, says the Splitter case led the university to take a hard look at its overall safety and biosafety infrastructure. In the past three years, he says the department's staff has grown from fewer than 40 to more than 60, with the biosafety office seeing its full-time staff jump from 5.5 to 14 employees.

While the university touts this increased commitment to safety, critics counter that the risk remains real.

"Is this a zero-risk game?" says Umbeck. "No, it's not. It never is. But that's the nature of progress."


The science vs. security debate is nothing new, says UW-Madison bioethicist Alta Charo.

It flared just a week after the Sept. 11, 2001, terrorist attacks, when envelopes containing powdered anthrax were mailed to several news media offices and two U.S. senators, killing five people, infecting 17 others and putting the nation further on edge. The incident got government officials talking about restricting access to biological research that could enable nefarious acts. The tragedy, in fact, ultimately led to the formation of the National Science Advisory Board for Biosecurity.

Although the NSABB didn't meet for the first time until 2005, it was shortly after the anthrax scare that the National Academy of Sciences assembled a task force that produced a 2004 report titled "Biotechnology Research in an Age of Terrorism."

And Charo — who was the Board on Life Sciences liaison to that panel — says it became quickly apparent that the national security experts and scientists weren't going to easily agree on a framework to, ideally, minimize the threat of biological terrorism while still moving along scientific progress.

"We were still in the midst of a cultural divide between the science community and the national security community," she says. "I'm using very broad brush strokes here, but the science community in some sense grew out of the rebellious, 1960s types while the national security people grew more out of short-sleeved shirt, buttoned-up types. It was very clear in those meetings that the two groups were struggling to appreciate one another's concerns because there was a tendency to immediately discredit each other."

Charo says the scientists generally believed the national security experts had a long history of over-hyping perceived dangers in the world. Conversely, she notes the security experts criticized the researchers for their lack of respect for authority, and generally discounted the need for scientific research to be open — arguing the government has conducted classified research for years, and it hasn't had trouble moving forward and getting results.

"It was just fascinating to watch this," says Charo, who is UW-Madison's Warren Knowles professor of law and bioethics.

Yet over time, Charo says those on both sides of the debate "developed a degree of mutual respect and deference." She explains that, eventually, the security experts came to understand how science progresses incrementally and began to appreciate the value of openness. "And they also began to appreciate how the research that seemed dangerous was exactly the research on basic mechanisms that would allow you to develop defenses."

At the same time, Charo says exposure to stories about national security breaches and near-breaches involving biological weapons helped the scientific community better appreciate the potential risks.

"People slowly began to realize how much you need a community-wide consensus effort on these issues," says Charo, who spent a couple years after the 9/11 terrorist attacks traveling between Madison and Washington for these meetings while carrying thick binders filled with articles about how to create a biological weapon.

Among other things, the National Academy of Sciences' 2004 report recommended that experiments should be reviewed for potential dangers before they are carried out, and identified seven experiments of concern related to pathogens, vaccines, antibiotic resistance and weaponization of biological agents.

Although the National Institutes of Health did adopt some of the recommendations — including the formation of the NSABB — it did not implement the call to review experiments of concern before they receive funding.

So, despite years of conversation, there's no comprehensive oversight system in place. That has led some to criticize federal agencies for not working harder to figure out what, exactly, should happen when government-funded experiments conducted by scientists like Kawaoka and Fouchier succeed in creating potentially dangerous agents.

"You can understand the complexity of designing a comprehensive oversight system, where you have 20-some-odd different agencies trying to come to some sort of consensus about how to deal with this," says NSABB member Imperiale. "It's a very polarizing topic and I just hope we can all stay calm and listen to each other and figure out how best to move forward."


What, exactly, happens next in this debate is anyone's guess.

Experts from around the globe are scheduled to meet Feb. 16-17 in Geneva, Switzerland, to further discuss issues raised by the two still-unpublished research studies on the transmissibility of bird flu. The World Health Organization is coordinating the gathering.

"Participants will discuss the specific circumstances and results of the two studies and will try to reach a consensus about ad hoc, practical actions to resolve the most urgent issues, particularly related to access to and dissemination of the results of this research," says a WHO press release.

Many interviewed for this story suggested there are similarities between this current debate and the one held during the Asilomar Conference on Recombinant DNA in California in February 1975. Scientists had halted that research — which combines genetic material from multiple sources to create something that otherwise wouldn't occur — due to concerns over potential safety risks. The conference, attended by more than 100 professionals, including biologists, doctors and lawyers, led to voluntary guidelines to ensure the safety of recombinant DNA technology.

Those involved in the bird flu debate, among other things, must reach a consensus about whether the research details are too dangerous for the public to see.

Paul Keim, a microbiologist at Northern Arizona University who chairs the NSABB, sounded the alarm during a Q&A with Nature that was posted Jan. 31: "No one should presume to know all the ways in which influenza virus could be misused, and the motivations for doing so, but the consequences could be catastrophic. There are many scenarios to consider, ranging from mad lone scientists, desperate despots and members of millennial doomsday cults to nation states wanting mutually assured destruction options, bioterrorists or a single person's random acts of craziness."

Kawaoka argues in his Nature commentary that censoring the details of his research paper won't prevent harm because there's already enough information publicly available for others to create such viruses.

"The redaction of our manuscript, intended to contain risk, will make it harder for legitimate scientists to get this information while failing to provide a barrier to those who would do harm," he writes.

If the bird flu papers are ultimately censored, officials still must figure out how knowledge about H5N1 can be made accessible to researchers with a legitimate need. But who makes these decisions? What criteria need to be considered? And on and on …

"These issues are certainly being hotly debated," says Bill Mellon, UW-Madison's associate dean for research policy. "There certainly is a lot to discuss and a lot of benefits and risks to weigh. One positive aspect to come out of all this is it's spurring a serious discussion in terms of how do we deal with these dual-use research issues."