
A Controversial Virus Study Reveals a Critical Flaw in How Science Is Done

Discussion in 'Microbiology' started by Hadeel Abdelkariem, Oct 9, 2018.

  1. Hadeel Abdelkariem

    Hadeel Abdelkariem Golden Member

    Last year, the world learned that researchers led by David Evans from the University of Alberta had resurrected a virus called horsepox. The virus hasn’t been seen in nature for decades, but Evans’s team assembled it using genetic material they ordered from a company that synthesizes DNA.

    The work caused a huge stir. Horsepox is harmless to people, but its close cousin, smallpox, killed hundreds of millions before being eradicated in 1980. Only two stocks of smallpox remain, one held by Russia and the other by the United States. But Evans’s critics argued that his work makes it easier for others to re-create smallpox themselves, and, whether through accident or malice, release it. That would be horrific: Few people today are immunized against smallpox, and vaccine reserves are limited. Several concerned parties wrote letters urging scientific journals not to publish the paper that described the work, but PLOS One did so in January.

    This controversy is the latest chapter in an ongoing debate around “dual-use research of concern”—research that could clearly be applied for both good and ill. More than that, it reflects a vulnerability at the heart of modern science, where small groups of researchers and reviewers can make virtually unilateral decisions about experiments that have potentially global consequences, and that everyone else only learns about after the fact. Cue an endlessly looping GIF of Jurassic Park’s Ian Malcolm saying, “Your scientists were so preoccupied with whether or not they could, they didn’t stop to think if they should.”

    Except Evans did think about whether he should, and clearly came down on yes. In one of several new opinion pieces that reflect on the controversy, he and his colleague Ryan Noyce argue that re-creating horsepox has two benefits. First, Tonix, the company that funded the research, hopes to use horsepox as the basis of a safer smallpox vaccine, should that extinct threat ever be resurrected. Second, the research could help scientists more efficiently repurpose poxviruses into vaccines against other diseases, or even weapons against cancer. (Evans politely declined a request for an interview, noting that he’d “rather let [his] piece speak for itself.”)

    Tom Inglesby, a health-security expert at the Johns Hopkins Bloomberg School of Public Health, doesn’t buy it. He says these purported benefits are hypothetical, and could be achieved in safer ways that don’t involve horsepox at all. Even if you want to use that particular virus, the Centers for Disease Control and Prevention has specimens in its freezers; Evans didn’t ask for those, because he thought Tonix couldn’t have commercialized the naturally occurring strain into a vaccine, according to reporting from NPR’s Nell Greenfieldboyce.

    “I was a little surprised that the issue caused so much controversy,” says Gigi Gronvall, who has written extensively on biosecurity and also works at Johns Hopkins. Other researchers had already synthesized smaller viruses such as polio and bigger entities such as bacteria; they’ve even made a start on far larger organisms such as yeast. Given such milestones, one should just assume that all viruses are within reach—but only to those with the right expertise, equipment, and money. Evans didn’t just order horsepox in the mail; it took years to refine the process of making and assembling it. “It’s not like anybody could synthesize horsepox,” Gronvall says.


    True, says Kevin Esvelt from MIT, but that feat is now technically easier because Evans’s paper spelled out several details of how to do so. It’s conceptually easier to weaponize because his paper explicitly connected the dots to smallpox. And it will become logistically easier to carry out with time, as the underlying tech becomes cheaper. “In the long run, I’m worried about the technology being accessible enough,” Esvelt says.

    There are ways of mitigating that risk. Most groups can’t make DNA themselves, and must order sequences from companies. Esvelt thinks all such orders should be screened against a database of problematic sequences as a bulwark against experiments that are unknowingly or deliberately dangerous. Such screening already occurs, but only on a voluntary basis. A mandatory, universal process could work if publishers or funders boycott work that doesn’t abide by it, or if companies build the next generation of DNA synthesizers to lock if a screening step is skipped.
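    For readers curious what such order screening might look like in practice, here is a minimal, hypothetical sketch in Python. The sequences, names (such as SEQUENCES_OF_CONCERN), and the simple k-mer overlap check are all illustrative assumptions; real biosecurity screens rely on curated databases and alignment tools, not a toy substring comparison.

    # Illustrative sketch only: screen a customer's DNA order against a small,
    # made-up blocklist of "sequences of concern" by looking for shared k-mers.
    # Real screening pipelines use curated databases and alignment tools (e.g.
    # BLAST-style search); nothing here reflects any company's actual system.

    SEQUENCES_OF_CONCERN = {
        "example_poxvirus_fragment": "ATGCGTACGTTAGCCGATCGATTACGCGTA",
        "example_toxin_gene_fragment": "TTGACCGGTAACGTAGCTAGCTTACGGATC",
    }

    K = 12  # k-mer length for this toy check; real screens use longer windows


    def kmers(seq: str, k: int = K) -> set:
        """Return the set of all k-length substrings of a DNA sequence."""
        seq = seq.upper()
        return {seq[i:i + k] for i in range(len(seq) - k + 1)}


    def screen_order(order_seq: str) -> list:
        """Return names of blocklist entries that share any k-mer with the order."""
        order_kmers = kmers(order_seq)
        return [
            name
            for name, concern_seq in SEQUENCES_OF_CONCERN.items()
            if order_kmers & kmers(concern_seq)
        ]


    if __name__ == "__main__":
        customer_order = "GGGATGCGTACGTTAGCCGATCGATTAAAA"  # overlaps the first entry
        hits = screen_order(customer_order)
        if hits:
            print("Order flagged for manual review:", hits)
        else:
            print("No matches; order proceeds.")

    In a mandatory scheme of the kind Esvelt describes, a flagged order would go to human review rather than being refused outright, since many sequences of concern also have legitimate research uses.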

    But these technological fixes do little to address the underlying debate about how society decides what kinds of experiments should be done in the first place, let alone published. Few countries have clear procedures for reviewing dual-use research. The U.S. has perhaps the strongest policy, but it still has several loopholes. It only covers 15 big, bad pathogens, and horsepox, though related to one, isn’t one itself. It also only covers federally funded research, and Evans’s research was privately funded. He did his work in Canada, but he could just as easily have done so in the U.S.

    Absent clearer guidelines, the burden falls on the scientific enterprise to self-regulate—and it isn’t set up to do that well. Academia is intensely competitive, and “the drivers are about getting grants and publications, and not necessarily about being responsible citizens,” says Filippa Lentzos from King’s College London, who studies biological threats. This means that scientists often keep their work to themselves for fear of getting scooped by their peers. Their plans only become widely known once they’ve already been enacted, and the results are ready to be presented or published. This lack of transparency creates an environment where people can almost unilaterally make decisions that could affect the entire world.

    Take the horsepox study. Evans was a member of a World Health Organization committee that oversees smallpox research, but he only told his colleagues about the experiment after it was completed. He sought approval from biosafety officers at his university, and had discussions with Canadian federal agencies, but it’s unclear if they had enough ethical expertise to fully appreciate the significance of the experiment. “It’s hard not to feel like he opted for agencies that would follow the letter of the law without necessarily understanding what they were approving,” says Kelly Hills, a bioethicist at Rogue Bioethics.

    Science reported that he did the experiment “in part to end the debate about whether recreating a poxvirus was feasible.” And he told NPR that “someone had to bite the bullet and do this.” To Hills, that sounds like “I did it because I could do it.” “We don’t accept those arguments from anyone above age 6,” she says.

    Even people who are sympathetic to Evans’s arguments agree that it’s problematic that so few people knew about the work before it was completed. “I can’t emphasize enough that when people in the security community feel like they’ve been blindsided, they get very concerned,” says Diane DiEuliis from National Defense University, who studies dual-use research.

    The same debates played out in 2002, when other researchers synthesized poliovirus in a lab. And in 2005, when another group resurrected the flu virus behind the catastrophic 1918 pandemic. And in 2012, when two teams mutated H5N1 flu to be more transmissible in mammals, in a bid to understand how that might happen in the wild. Many of the people I spoke with expressed frustration over this ethical Möbius strip. “It’s hard not to think that we’re moving in circles,” Hills says. “Can we stop saying we need to have a conversation and actually get to the conversation?”


    The problem is that scientists are not trained to reliably anticipate the consequences of their work. They need counsel from ethicists, medical historians, sociologists, and community representatives—but these groups are often left out of the committees that currently oversee dual-use research. “The peer group who is weighing in on these decisions is far too narrow, and these experiments have the potential to affect such a large swath of society,” Lentzos says. “I’m not saying we should flood committees with people off the streets, but there are a lot of professionals who are trained to think ethically or from a security perspective. Scientists don’t have that, and it’s actually unfair that they’re being asked to make judgment calls on security issues.”


    More broadly, Hills says, there’s a tendency for researchers to view ethicists and institutional reviewers as yet more red tape, or as the source of unnecessary restrictions that will stifle progress. Esvelt agrees. “Science is built to ascend the tree of knowledge and taste its fruit, and the mentality of most scientists is that knowledge is always good,” he says. “I just don’t believe that that’s true. There are some things that we are better off not knowing.” He thinks the scientific enterprise needs better norms around potentially dangerous information. First: Don’t spread it. Second: If someone tells you that your work represents an information hazard, “you should seriously respect their call.”

    Lentzos adds that scientists should be trained on these topics from the earliest stages of their career. “It needs to start at the undergrad level, and be continually done for active researchers,” she says. There is a lot of talk about educating society about science. Perhaps what is more needed is educating scientists about society.

    Source
     
