Shortly after the end of World War II, Albert Einstein, referring to the new global danger of nuclear weapons, uttered his now famous warning: “Everything has changed, save the way we think.” Accordingly, he and Robert Oppenheimer established the Bulletin of the Atomic Scientists to help warn the public about the dangers of nuclear war.
Perhaps the most visible face of the bulletin—for which I am currently co-chair of the board of sponsors—is the “Doomsday Clock.” Created in 1947, the clock graphically reflects how close humanity might be to human-induced apocalypse, in terms of the “number of minutes to midnight”—at which time, presumably, time itself will no longer matter.
In total, the clock has been adjusted 20 times, moving as close as two minutes to midnight in 1953, after the United States and the Soviet Union each first tested thermonuclear devices, and as far as 17 minutes to midnight in 1991, after the two countries signed the Strategic Arms Reduction Treaty. Currently, it is set at five minutes to midnight.
Nuclear weapons continue to be the most urgent global threat to humanity: Recent developments in Iran, the continued tension between Pakistan and India, and the United States’ consideration of developing a new generation of nuclear weapons are all cause for great concern. But in the 60-odd years since the creation of the Doomsday Clock, the world has changed, in no small part because of technological and scientific advances, making it even more dangerous. Unfortunately, there is little evidence that our way of thinking about global catastrophes has evolved for the 21st century. That’s why the bulletin decided, in 2007, to factor other threats to humanity into the Doomsday Clock.
Since then, we have run three “Doomsday Symposia,” during which key scientists and policymakers assess ongoing global threats to humanity in three areas: nuclear proliferation and nuclear weapons, climate change, and biotechnology and bioterrorism. The last issue has generated considerable heat in the media in recent years, and the specter of new lethal viruses that might wipe out whole populations suggested to us that there might be compelling new reasons to move the clock forward again.
Indeed, over the past 35 years biotechnology has undergone the same explosive growth that physics underwent in the preceding decades, and with it the possibility of biologically engineered weapons has grown. We now have the ability to artificially recreate genetic sequences, including viruses. DNA “hacking” has become a pastime at institutions such as MIT, among the same kind of people who used to be so enamored with computer hacking. Finally, the holy grail of genetic manipulation now involves the frontiers of synthetic biology, wherein researchers are attempting not merely to build up genetic sequences base-pair by base-pair, but also to explore the possibility of building novel life forms from scratch.
These developments are thrilling for scientists and technologists who love to take things apart and put them back together. But there remains the terrifying prospect that smart pranksters, DIYers, a careless laboratory, or more sinister groups could, whether by accident or design, create a new supervirus with the potential to wipe out all other life on Earth. (Hence the furious debate that has surrounded experiments into artificially developing forms of the avian flu virus H5N1 that are transmittable between mammals.) Indeed, just this week, a host of external watchdog organizations called for a moratorium on synthetic biology.
We should encourage the vigilance and rigorous discussion that have accompanied these developments. Happily, however, the bulletin’s experts, including Harvard biologist Matthew Meselson and human genome pioneer and synthetic biologist Craig Venter, suggest that the above scenarios are, in the near term, unlikely at best and pure fiction at worst.
In the first place, the synthetic-biology industry is well aware of the dangers of unmonitored genetic hacking and is responding on its own. Reassured by the industry’s self-policing thus far, the Presidential Commission for the Study of Bioethical Issues determined in 2010 that “there is no reason to endorse additional federal regulations or a moratorium on work in this field at this time.”
In the second place, while manufacturing dangerous biological compounds may be possible, weaponizing them is not so easy. While it might be possible to inflict significant terror locally, dispersing biological agents over broad regions to create global crises is far more challenging.
Next, there is the difficulty of reproducing the necessary technology. The field is as much an art as a science, and it is hard to reliably reproduce results in a field where the financial stakes are so high that proprietary technology is not readily shared.
We can all (at least those of us who, unlike some of the dominant presidential candidates, accept the reality of both evolution and an old earth) take solace in the robustness of life itself, evolved over 4.5 billion years in the presence of remarkably ingenious viruses, which have also competed for survival. It is unlikely that a new organism, without the benefit of all of this “learned experience,” could outmaneuver all the mechanisms that life has developed to outwit constant biological invaders.
All of this suggested to those of us who have the unenviable task of regularly revisiting the possibility of Doomsday, in order to help humanity adjust its thinking appropriately, that the current revolution in biotechnology is, for the moment, more likely to benefit humankind than destroy it.