Via The Guardian, a long report from Ian Sample on a momentous debate: From anthrax to bird flu – the dangers of lax security in disease-control labs. Excerpt:
When the CDC declared its anthrax incident, Marc Lipsitch, professor of epidemiology at the Harvard School of Public Health, said we should be glad it was only anthrax. He fears that scientists, Fouchier included, pose far greater risks to the public by intentionally creating dangerous pathogens. In 2011, Fouchier announced that he had mutated bird flu to make it spread easily in animals through coughs and sneezes.
Advocates for these experiments, known as gain-of-function studies, say they give scientists crucial insights into the kinds of viruses to fear in nature. To Lipsitch and many others, the irony is all too clear: in trying to prevent the next pandemic, they say, Fouchier and his colleagues make a disastrous outbreak more likely.
Earlier this week, Lipsitch convened more than a dozen researchers who shared his concerns. The result of their meeting was the Cambridge Working Group consensus statement, which calls on the US government to "curtail" experiments that create potentially pandemic pathogens until proper risk assessments have been done. While Fouchier and others say they have already undergone numerous risk assessments and operate under extremely tight security, their critics are not reassured.
One of the signatories of the Cambridge Working Group statement is Sir Richard Roberts, a British scientist and Nobel prize winner, who now works at New England Biolabs in Massachusetts. Roberts hasn't seen the risk assessments for Fouchier's experiments, but notes that even the CDC labs, "which were generally considered to be the safest labs out there", had had problems. "How can you trust anybody? Humans are human. People make mistakes."
Lipsitch's group wants to convene a meeting that brings together scientists and other experts to debate the potential risks of making dangerous pathogens, and to draw up binding guidelines to ensure that future experiments are safe. The plan mirrors the landmark Asilomar conference in California in 1975, which was largely driven by younger scientists concerned about the unknown risks of swapping genes between organisms. That meeting set ground rules that still shape genetic studies today, including the introduction of biosafety containment levels around the world.
Vincent Racaniello, a virologist at Columbia University in New York, said the Cambridge Working Group was "infuriating" because it misled people into believing that viruses made in laboratories were a serious threat to the public. Because the experiments are done in ferrets, he argues, it is impossible to know whether the bugs would spread in people, or how dangerous they might be.
He added that deciding which experiments went ahead on the basis of a risk-benefit analysis was "absurd", because it was often impossible to know the benefits of an experiment beforehand.
John McCauley, director of the WHO collaborating centre for influenza in London, largely supports the experiments at the centre of the controversy, arguing that they reveal how viruses in the wild transform from harmless strains into more dangerous forms. "I need to be able to advise people," he said. "And it makes me feel a whole lot happier knowing more."
Roberts, though, is having none of it. "The risks are enormous and the benefits, to my mind, are non-existent," he said. "If I suggested that you try to make the most virulent and dangerous virus that we can imagine, something that could kill a quarter of the world's population if it got out, does that seem a sensible thing to do? That strikes me as being absolutely ridiculous."