What can we do about garage bioweapons?

Ever since I read Why the Future Doesn’t Need Us, probably some time in the early 2000s, I’ve been concerned about the idea that biotechnology will eventually be available to amateurs, and that a few of those amateurs will be psychopaths or suicide terrorists willing to take down civilization along with themselves. My only real idea for preventing this has been heavy-duty government surveillance, somehow tempered by political institutions that keep the government itself from becoming evil. And of course, most of us don’t want that surveillance, and I don’t really want it either, but I haven’t seen clear alternatives.

Here are some suggestions from others. I am not sure I am convinced, but these are ideas worth thinking about. First, Dario Amodei of Anthropic in his recent TLDR essay The Adolescence of Technology (where he tries to pull a Bill Joy and doesn’t quite pull it off, in my view). Keep in mind that although he seems to be motivated by ethical views, heavy government surveillance is not in the interest of private companies, so his suggestions tend toward self-regulation by industry. Anyway, his suggestions are:

  • “AI companies can put guardrails on their models to prevent them from helping to produce bioweapons.”
  • “transparency requirements, which help society measure, monitor, and collectively defend against risks without disrupting economic activity in a heavy-handed way” – this would be government regulation, but it would constrain all competitors equally
  • “try to develop defenses against biological attacks themselves. This could include monitoring and tracking for early detection, investments in air purification R&D (such as far-UVC disinfection), rapid vaccine development that can respond and adapt to an attack, better personal protective equipment (PPE), and treatments or vaccinations for some of the most likely biological agents.” – One imagines scientists using AI extensively to develop these defenses, or possibly “AI scientists” developing them with some human mediation.

Now, here is an article from Big Think called How to deter biothreats in the age of gene synthesis. What they come up with is screening all orders of biological materials (like synthesized DNA) placed with the companies that make and sell them to researchers: screen the orders, screen the customers themselves, and screen for particularly dangerous DNA sequences.
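To make the proposal concrete, here is a minimal sketch of what such order screening might look like in code. This is a toy illustration only: the flagged customer ID and the "sequence of concern" are made-up placeholders, and real screening systems (for example, those following the International Gene Synthesis Consortium's protocol) compare orders against curated pathogen databases using alignment tools rather than the simple k-mer matching shown here.

```python
# Toy sketch of gene-synthesis order screening. All names and data below
# are hypothetical placeholders, not real customers or hazardous sequences.

FLAGGED_CUSTOMERS = {"acme-unverified"}        # hypothetical: customers who failed vetting
SEQUENCES_OF_CONCERN = ["ATGGCGTTTAACCG"]      # hypothetical: a placeholder motif

K = 12  # window size for approximate substring matching


def kmers(seq: str, k: int = K) -> set:
    """All length-k substrings of a sequence."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}


def screen_order(customer_id: str, sequence: str) -> list:
    """Return a list of reasons to hold an order for human review.

    An empty list means the order passes both the customer check
    and the sequence check.
    """
    reasons = []
    if customer_id in FLAGGED_CUSTOMERS:
        reasons.append("customer failed verification")
    order_kmers = kmers(sequence.upper())
    for concern in SEQUENCES_OF_CONCERN:
        if kmers(concern) & order_kmers:  # any shared k-mer triggers review
            reasons.append("sequence matches a sequence of concern")
            break
    return reasons
```

The design point is that screening produces reasons for a human reviewer rather than an automatic refusal, since false positives (legitimate researchers ordering sequences that resemble flagged ones) are expected and need adjudication.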

This has to be done internationally, which adds to the difficulty. Even if it is done effectively by most countries most of the time, enterprising countries and criminal organizations will set up a black market, just as they do with drugs and nuclear materials, because it will be profitable to do so. The more effective the clampdown, the higher the risk and the potential profit become, which is why the war on drugs has not been and probably cannot be won, other than by massive and intrusive surveillance of individuals that most people in most countries agree they don’t want.

Drugs are a tragedy for addicts and for people caught up in the violence fueled by the black market their prohibition creates. But they are not going to spread like fire through uninvolved and unprepared populations and cause the extinction of our species. Which is why I wonder whether accepting some level of intrusive surveillance of individuals is the least evil option, as the technology gets more accessible and more dangerous all the time.

Nonetheless, it is probably good to take an “all of the above” approach. If I were feeling optimistic, I might say that the existential nature of the threat will prompt international cooperation and the formation of more functional institutions for cooperation than we have seen over the past decade or so. But maybe it will take some sort of near miss or partially contained disaster for this to happen. (Covid-19 was not bad enough, apparently.)

Let’s close with a few quotes from Why the Future Doesn’t Need Us, which, if you have not read it, you should drop what you are doing and read now:

The 21st-century technologies – genetics, nanotechnology, and robotics (GNR) – are so powerful that they can spawn whole new classes of accidents and abuses. Most dangerously, for the first time, these accidents and abuses are widely within the reach of individuals or small groups. They will not require large facilities or rare raw materials. Knowledge alone will enable the use of them…

I think it is no exaggeration to say we are on the cusp of the further perfection of extreme evil, an evil whose possibility spreads well beyond that which weapons of mass destruction bequeathed to the nation-states, on to a surprising and terrible empowerment of extreme individuals.

Joy’s proposed solutions included “relinquishment” of certain technologies we all agree are too dangerous (as we have, to some extent up until now, with nuclear, chemical, and biological weapons at the nation-state level) and strong protections for whistleblowers in the scientific and technical communities.
