Rare events can still cause catastrophic accidents. The concern that experts have raised over the years is that the more of these experiments and the more labs there are, the more opportunities there are for a rare event to occur — that the right pathogen is involved and infects somebody in one of these labs, or is released in some way from these labs.

And what I chronicle in Pandora’s Gamble is that there have been these previous outbreaks that have been associated with various kinds of lab accidents. So this is not a theoretical thing that can happen: it has happened in the past.

Alison Young

In today’s episode, host Luisa Rodriguez interviews award-winning investigative journalist Alison Young on the surprising frequency of lab leaks and what needs to be done to prevent them in the future.

They cover:

  • The most egregious biosafety mistakes made by the CDC, and how Alison uncovered them through her investigative reporting
  • The Dugway Life Sciences Test Facility case, where live anthrax was accidentally sent to labs across the US and several other countries over a period of many years
  • The time the Soviets had a major anthrax leak, and then hid it for over a decade
  • The 1977 influenza pandemic caused by a vaccine trial gone wrong in China
  • The last death from smallpox, caused not by the virus spreading in the wild, but by a lab leak in the UK
  • Ways we could get more reliable oversight and accountability for these labs
  • And the investigative work Alison’s most proud of

Producer and editor: Keiran Harris
Audio Engineering Lead: Ben Cordell
Technical editing: Simon Monsour and Milo McGuire
Additional content editing: Katy Moore and Luisa Rodriguez
Transcriptions: Katy Moore


The case of the found smallpox vials

Alison Young: Around the same time the CDC was having all kinds of incidents in 2014, in the middle of all of that, there was a cold storage room on the campus of the National Institutes of Health, just north of Washington DC, where they were moving around some old cardboard boxes. And they look inside and they see all of these little, tiny, very fragile vials from decades ago that are labelled in typewriter print with various pathogens’ names on them. And it’s powdered material. And as they’re going through these glass vials, they see some that are labelled as variola.

Luisa Rodriguez: Which, just to be totally clear, variola is the pathogen that causes smallpox. So go on: they found vials of smallpox in a box in a storage room?

Alison Young: Exactly. In an unlocked storage room. So this should have been incredibly concerning, because smallpox is incredibly deadly. It has been eradicated from the planet and smallpox virus is only supposed to be found under treaties in two labs in the world: one is in Russia, and the other is a specific lab on the campus at the Centers for Disease Control and Prevention in Atlanta. So these vials shouldn’t have been in this cold storage room at NIH.

What was also concerning was how they responded to it when they found these vials. Ultimately, it was one scientist, by themselves, who basically picked up the cardboard box and walked it down the corridors of this building at the NIH and across the street and into another building. All the while, they’re hearing this clink, clink of these fragile old vials hitting each other as they’re walking along.

The FBI report that I read of the incident criticised the scientist and just the whole handling of this box, because when it was properly catalogued, in the end, there was a vial that had broken inside this box — and once again, the world got lucky, and it was not smallpox virus, it was some sort of a tissue sample. But as the FBI report noted, had that been the freeze-dried smallpox specimen, there was nothing really protecting the person who was carrying it.

You would hope that everyone working around really dangerous pathogens like smallpox, which should not get into other people’s hands, would be careful. Part of the concern that was raised is that you shouldn’t have one single person, by themselves, carrying a box that contains smallpox virus.

Luisa Rodriguez: And that’s because it’s one of the few pathogens that a single person could use as a bioweapon, basically?

Alison Young: Correct.

The Soviet anthrax leak

Alison Young: There was an accident at a lab that was believed to actually be a bioweapons facility by US intelligence, but a lab nonetheless, that was working with large quantities of anthrax. It appears that it spewed a giant plume of anthrax spores over a town. And people downwind were sickened, animals were killed, about 60 people in that case died. Initially, the authorities sought to claim that there was no airborne anthrax — that this was ultimately a result of anthrax food poisoning, possibly from black market meat or some sort of contaminated cattle feed or agricultural feed. And that was sort of where it was.

Over time, because it was such a huge and deadly outbreak, there was intense scientific community interest. And eventually, there was a group of scientists who invited officials from these former Soviet communities to come to the United States and give a presentation at the US National Academies of Sciences. There, they produced all kinds of slides and charts and told compelling stories of racing up into the mountains and how they were there to help save these people, and they showed all kinds of information that really was making the case that this was a foodborne anthrax outbreak. Coming out of that meeting, there are news clippings in The Washington Post and The New York Times and elsewhere where prominent US scientists say they’ve been incredibly transparent and they’ve made quite the case — it looks like this really was gastrointestinal anthrax, and not some sort of an airborne release.

Then it took many more years, until 1992, when then Russian President Boris Yeltsin came out and made this very surprising statement in a Russian newspaper that in fact that outbreak was the result of a military lab accident.

Luisa Rodriguez: So this case absolutely shocks me. One, it’s just horrific: 60 people died. Two, there was this extremely successful coverup by the Soviets, particularly because they were violating the Biological Weapons Convention and wanted to hide that. And then three, just bizarrely, Boris Yeltsin later admitted, unprompted, that this was caused by military bioweapons research.

But I wanted to talk about what happened after all of that, which was this joint effort by American and Russian scientists to find out exactly what happened. I just found this extremely moving. Can you explain what they did?

Alison Young: Yeah, it’s fascinating. Here were these Russian scientists who, at the time all of this occurred, were incredibly brave and basically hid away evidence to keep the KGB from taking it away. So they hid away their notes. They had samples from the people who died, and they kept them in jars — but they put them out in the open, almost hiding them in plain sight, so that they wouldn’t be confiscated. They had kept these for all of these years, and so, as the political situation changed in Russia, it became possible for them to actually disclose that they had this information.

And they did some remarkable investigations, where they even went and looked at other records that weren’t destroyed, such as who got compensated. They went to graveyards and looked at the death records. And ultimately, even some of the main US scientists who had been the biggest proponents of the idea that this was not some sort of bioweapons accident, who absolutely believed the initial cover story that this was a meat problem, came around. Some of them even assisted with the Russians’ research showing that this was a huge anthrax plume, and there was plenty of documentation for it.

And I think the thing that is so instructive is it took 15 years to get to that point from when the accident happened, and all of the years of coverup, and all of the years of many international scientists believing the cover story, to ultimately getting to the truth.

A culture of martyrdom

Alison Young: One of the challenges is the idea of establishing safety culture within organisations. Part of my book goes way back into the history of biological safety, and I spent a lot of time reading the papers of a man by the name of Dr Arnold Wedum, who is considered the father of modern biosafety. And part of the reason the book goes into depth about Arnold Wedum’s findings is that many of his concerns about the lack of safety culture in microbiology, and the difficulty of getting certain scientists to accept the importance of following safety protocols — the resistance to safety culture that he saw way back in the 1950s — play out in the same kinds of ways in these incidents today.

Arnold Wedum talked quite a lot about this idea of being a martyr to science. Obviously, the people who went into microbiology over time are people who are very dedicated to the study of science, to trying to improve the lives of people around the planet.

One of the things that’s important to remember is that microbiology is a relatively new science. It’s young compared to chemistry and the radiological sciences, and Dr Wedum said that those scientists seemed much more open to scrutiny of their practices than those working in microbiology labs — who, for much of the history of microbiology, because there were not ways to keep them safe, were often catching the diseases they were experimenting with. Dr Wedum also talked about how — again, this is back many years ago — some of these scientists took great pride in how many times they had become infected, because they were doing this for the greater good.

Luisa Rodriguez: I remember finding it striking in the book, reading about these cases where scientists, working before a bunch of better safety practices, would basically brag, as you said — like, “I’ve gotten TB four times already” — and it was almost a battle scar that they wore with pride.

Maybe there, the takeaway is this field is coming from this initial foundation of getting these diseases is a norm and is even kind of a good thing. It’s like a badge of honour. So when you try to throw all these safety practices on top, they’re resistant because they’re used to this; they don’t regard it as a terrible thing. And that’s part of what’s made making safety a norm a much harder problem. Does that sound right?

Alison Young: Some of that, I think, is very much the case. Also, there’s just not a culture of tracking these kinds of infections. There never has been a culture of that. To this day, there are no universal tracking systems for these kinds of illnesses in labs or accidents.

I think part of the challenge as well is that nobody likes having to do things that make it harder to do your job. And one of the realities of the kinds of safety procedures and equipment that are required, depending on the pathogen, they can make doing your work slower and more cumbersome. It can be more expensive. There may be limited access to certain kinds of equipment. All of those kinds of things — at least over time, in what Dr Wedum wrote about — created a culture where there were questions about whether any of it was necessary.

And that’s where that idea of the “martyr to science” culture comes from. So that was back in the ’50s, ’60s, and ’70s, when Dr Wedum was really writing about those kinds of things. Here we are in 2023: What is the culture inside individual labs? It’s hard to say, but you can see in incident after incident that there are individuals and institutions that are not paying the attention to safety that they should be.

No one wants to regulate biolabs

Alison Young: This is a topic I’ve been now covering for 15 years, and it’s important to know that going back at least 10 years ago, the US Government Accountability Office started issuing reports raising concern that as more of these kinds of biological research facilities are built and doing more experiments with more risky pathogens, there is this increase in the aggregate risk of a catastrophic accident. So I’ve been covering hearings in Congress going back over time, and back then, there was not one political party or another that was interested in this: this was a bipartisan concern.

And as I wrote Pandora’s Gamble, it was a huge reminder as I went back and read through some of the transcripts of hearings that I’d sat in on as a reporter, and saw both Democrats and Republicans asking really important questions about the policy issues of how we deal with the safety of these labs. There was a recognition of the importance of conducting biological research. I mean, we all need this — I don’t want lost in any of this the idea that this world has benefited greatly from the COVID-19 vaccines and from all kinds of work that these labs do. But we also need that work to be done safely. And how many labs do we actually need?

And Congress was holding hearings and looking at this stuff closely. There were pushes in the 2014–2015 timeframe — when I was writing about a bunch of accidents at the Centers for Disease Control and Prevention and at Dugway, as we’ve discussed — and there were even more hearings raising the question of whether there needed to be a single federal entity overseeing lab safety. And then it went nowhere. And that has played out over and over, over the years.

Part of it is that nobody who operates a lab wants more regulation: nobody wants more scrutiny, nobody wants more red tape. And the federal agencies that Congress and the public rely on to advise on what we need to do in these arenas all have potential conflicts of interest. Take the National Institutes of Health: it’s one of the largest funders of biomedical research in the world. It conducts its own research, and it is often funding the research at the very labs that are having the accidents that are of concern. Then you have the Centers for Disease Control and Prevention: it is one of the two primary regulators of the limited subset of these labs that are actually subject to any safety regulation, and the CDC’s own labs have had their own series of safety problems.

So it is something that every few years, at least in my coverage of it, you see interest in Congress and then it dies back down again. And now with COVID-19, obviously this is back in Congress and being discussed again, but the whole political climate in Washington has become so toxic that that is now adding a new layer to the whole debate.

Nobody is tracking how many biosafety level 3 labs there are

Alison Young: One of the things that just is so frustrating in this arena is that nobody is even tracking how many of these labs there are. One of the biggest surprises for me when I started covering this is that the US Government Accountability Office, which is the nonpartisan investigative arm of Congress, produced reports going back more than a decade ago that said even the US government doesn’t know how many biosafety level 3 labs there are.

Part of the issue here is that it is such a fragmented area. If you are a privately funded lab, and you’re not taking government money and you are not working with a select agent pathogen, the government may not really know that you exist as a lab. They may know piecemeal — like, you might have to have workers’ compensation, or you might have to have some OSHA things, or you might have to have a wastewater permit. But you don’t have a lab permit, and so there’s no chronicling of where all these labs are.

So one of the things we did when I was a reporter on USA Today‘s national investigative team is we set out to find out how many biosafety level 3 labs we could even identify. And it was incredibly difficult. We identified a couple hundred of these labs across the country, but what it took to do that was literally googling “biosafety level 3 lab” so we could find the places that advertised it. Or we looked at government grant records where they mentioned that they were using a biosafety level 3 lab or a BSL-3 lab. Or we looked at LinkedIn, where people promoted the fact that they’d worked in these labs. But this was cobbling it together from an incredible number of records, for something you would think the government would know.

And that’s just in the United States. I have a Google alert that is set up for BSL-3 and BSL-4 labs, so I see the press releases that go out when various countries or various universities are announcing that they’re building a BSL-3 or a BSL-4 lab. But there is no one place that policymakers or the public can go to see where these labs are, or how many there are.

Luisa Rodriguez: How can we not be tracking those?

Alison Young: There just is no mechanism. There’s a case right now that has gotten some recent attention out in California, where there was a biotechnology lab in Reedley, California. It came to light because literally a code enforcement officer in this small city discovered that there was this lab, and they had 1,000 mice, and they had −80°C freezers out there, they had all sorts of biological materials. And ultimately — I’ve been working on some reporting in this area — what the local officials have said is that because the lab was privately funded, didn’t receive any government grant money, and wasn’t obviously working with any select agent pathogens, the only way they were able to address it was to cobble together local code enforcement and other piecemeal regulations. There was no lab authority to go to to address the biohazards of the facility.

And this issue has come up over and over, over the years, but it’s not one that policymakers have so far addressed. There has been a lot of talk, and it has been known for a long time that there are gaping holes in the oversight because of the fragmented nature of how we look at these biolabs.

There’s one other aspect of the proposed legislation that is worth pointing out: It does include a provision that asks for a biosecurity board in the US government to evaluate the effectiveness of the current Federal Select Agent Program in overseeing biorisks in this country. And it asks for proposals to, in its words, “harmonize” the various fragmented pieces — whether it’s at the NIH; the NIH Guidelines; the Select Agent Program; and the recommendations in something called the BMBL, basically the biosafety manual, its recommendations (but not regulations) of safety practices. But what’s interesting in how it is written is it sounds like harmonising, but leaving in place the fragmented system of multiple agencies being responsible for this kind of work.

Articles, books, and other media discussed in the show

Upcoming fellowship opportunity:

The Astra Fellowship pairs fellows with experienced advisors to collaborate on an AI safety research project of two to three months. Apply by November 17 for the January–April 2024 programme — you could be paired with past guests of the show like Richard Ngo from OpenAI, Ajeya Cotra and Tom Davidson from Open Philanthropy, or Robert Long from the Center for AI Safety.

Alison’s work:

Other journalists’ coverage of lab accidents:

Biosafety research and regulations:

Related episodes

About the show

The 80,000 Hours Podcast features unusually in-depth conversations about the world's most pressing problems and how you can use your career to solve them. We invite guests pursuing a wide range of career paths — from academics and activists to entrepreneurs and policymakers — to analyse the case for and against working on different issues and which approaches are best for solving them.

The 80,000 Hours Podcast is produced and edited by Keiran Harris. Get in touch with feedback or guest suggestions by emailing [email protected].

What should I listen to first?

We've carefully selected 10 episodes we think it could make sense to listen to first, on a separate podcast feed:

Check out 'Effective Altruism: An Introduction'

Subscribe here, or anywhere you get podcasts:

If you're new, see the podcast homepage for ideas on where to start, or browse our full episode archive.