
Nobody is in favor of the power going down. Nobody is in favor of all cell phones not working. But an election? There are sides. Half of the country will want the result to stand and half the country will want the result overturned; they’ll decide on their course of action based on the result, not based on what’s right.

Bruce Schneier

November 3, 2020, 10:32PM: CNN, NBC, and FOX report that Donald Trump has narrowly won Florida, and with it, re-election.

November 3, 2020, 11:46PM: The New York Times, Washington Post, and Wall Street Journal report that some group has successfully hacked electronic voting systems across the country, including Florida’s. The malware has spread to tens of thousands of machines and deletes any record of its activity, so Florida’s election officials concede they actually have no idea who won the state, and don’t see how they can figure it out.

What on Earth happens next?

Today’s guest — world-renowned computer security expert Bruce Schneier — thinks this scenario is plausible, and the ensuing chaos would sow so much distrust that half the country would never accept the election result.

Unfortunately, the US has no recovery system for a situation like this, unlike parliamentary democracies, which can simply rerun the election a few weeks later.

The Constitution says the state legislature decides, and it can do so however it likes; one tied local election in Texas was settled by playing a hand of poker.

Elections serve two purposes. The first is the obvious one: to pick a winner. The second, but equally important, is to convince the loser to go along with it — which is why hacks often focus on convincing the losing side that the election wasn’t fair.

Schneier thinks we need to agree on how a situation like this should be handled before it happens; otherwise America will fall into severe infighting as everyone tries to turn the crisis to their political advantage.

And to fix our voting systems, we urgently need two things: a voter-verifiable paper ballot and risk-limiting audits.

He likes the system in Minnesota: you fill in ovals on a paper ballot, which is then fed into a computerized reader. The computer reads the ballot, and the paper falls into a locked box that’s available for recounts. That gives you the speed of electronic voting with the security of a paper ballot.

On the back end, he wants risk-limiting audits that are automatically triggered based on the margin of victory: a large margin of victory calls for only a small audit, while a small margin calls for a large one.
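
To make that scaling concrete, here’s a minimal Python sketch of a ballot-polling audit in the spirit of the BRAVO risk-limiting audit method. This isn’t something walked through in the episode: the function names, risk limit, and vote counts below are all illustrative assumptions. The idea is that the audit keeps drawing paper ballots at random until the evidence for the reported winner clears a threshold set by the risk limit, so the narrower the reported margin, the more ballots it has to examine.

```python
# Illustrative sketch only -- not a certified audit procedure.
import random

def ballot_polling_audit(ballots, reported_winner_share, risk_limit=0.05):
    """Sample ballots until the reported outcome is confirmed at the risk limit.

    ballots: list of 'W' (for the reported winner) and 'L' (for the loser)
    reported_winner_share: the winner's reported share of the two-candidate vote
    risk_limit: maximum allowed chance of certifying a wrong outcome
    Returns the number of ballots sampled, or None to signal a full hand count.
    """
    # Wald-style sequential test: the null hypothesis is a tie (true share 0.5),
    # the alternative is the reported share. Stop once the likelihood ratio
    # exceeds 1/risk_limit.
    likelihood_ratio = 1.0
    for n, ballot in enumerate(random.sample(ballots, len(ballots)), start=1):
        if ballot == 'W':
            likelihood_ratio *= reported_winner_share / 0.5
        else:
            likelihood_ratio *= (1 - reported_winner_share) / 0.5
        if likelihood_ratio >= 1 / risk_limit:
            return n          # outcome confirmed after n sampled ballots
    return None               # never confirmed: escalate to a full hand count

if __name__ == "__main__":
    random.seed(1)
    total = 100_000
    for winner_share in (0.60, 0.55, 0.51):   # 20-, 10-, and 2-point margins
        winner_votes = int(total * winner_share)
        pile = ['W'] * winner_votes + ['L'] * (total - winner_votes)
        sampled = ballot_polling_audit(pile, winner_share)
        print(f"margin {2 * winner_share - 1:.0%}: sampled {sampled} ballots")
```

Run on a simulated pile of 100,000 ballots, a 20-point margin typically confirms the result after only on the order of a hundred sampled ballots, while a 2-point margin can push the sample into the tens of thousands — exactly the small-audit-for-large-margin trade-off described above.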

Those two things would do an enormous amount to improve voting security, and we should move to that as soon as possible.

According to Schneier, computer security experts look at current electronic voting machines and can barely believe their eyes. But voting machine designers never understand the security weaknesses of what they’re designing, because they have a bureaucrat’s rather than a hacker’s mindset.

The ideal computer security expert walks into a shop and thinks, “You know, here’s how I would shoplift.” They automatically see where the cameras are, whether there are alarms, and where the security guards aren’t watching.

In this impassioned episode we discuss this hacker mindset, and how to use a career in security to protect democracy and guard dangerous secrets from people who shouldn’t have access to them.

We also cover:

  • How can we have surveillance of dangerous actors, without falling back into authoritarianism?
  • When if ever should information about weaknesses in society’s security be kept secret?
  • How secure are nuclear weapons systems around the world?
  • How worried should we be about deep-fakes?
  • The similarities between hacking computers and hacking our biology in the future
  • Schneier’s critiques of blockchain technology
  • How technologists could be vital in shaping policy
  • What are the most consequential computer security problems today?
  • Could a career in information security be very useful for reducing global catastrophic risks?
  • What are some of the most widely held but incorrect beliefs among computer security people?
  • And more.

Get this episode by subscribing to our podcast on the world’s most pressing problems and how to solve them: type 80,000 Hours into your podcasting app. Or read the transcript below.

The 80,000 Hours Podcast is produced by Keiran Harris.

Highlights

We need to convince organizations that they need technologists on their staff, and whether it’s organizations working on climate change, or social justice, or human rights, that technologists are critical to what they’re doing. The same goes for legislative staffs and investigative journalism: again and again, these groups need technologists. What can we do?

I think we need to push the fact that technology is intimately entwined in public policy, inseparable, and that we need more people who understand this. Maybe the way to push it is by convincing policymakers that they need technologists. Convincing that senator who questioned Facebook that maybe, if you had a tech person on your staff, you could clear your questions with them first. Or maybe you, who are working in some kind of immigration rights office… you need a technologist, because that’s how you’re going to understand what sorts of surveillance technology are being used against the people you’re trying to represent and trying to keep safe. So I think maybe that’s where we have to push.

Uncle Milton’s ant farm… it’s two pieces of plastic, and they have maybe a quarter inch of space between them, and you fill the space with some sand. Then you get ants and you put them in the ant farm and you watch them dig tunnels, right?

Super cool, if you’re like a 12-year-old boy. Now, when you buy this at a toy store, it doesn’t come with ants because, well, because. And you can do one of two things: you can get some ants out of your backyard, and they don’t have a queen, so they’re going to die soon, but you know, they’ll make tunnels. Or you can, at least back when I was a kid, there’d be a card in the ant farm. You could write your name and address and send it to the company, and they’d mail you a tube of ants. Now the normal person says, “Oh that’s kind of neat, I can get a tube of ants”. Someone who’s a hacker looks and says, “Wow, I can send a tube of ants to anybody I want”, and that’s how I characterize the mentality needed to be a computer security person, to be a hacker. The ability to look at a system and sort of naturally see how it might be misused.

The problem is not lack of ideas. I used to run a movie-plot threat contest on my blog every April 1st, and the idea was to come up with the “most scary impressive computer security everybody dies” disaster scenario, and I got email from people saying, “Don’t give the terrorists ideas!”. I was like, “Are you kidding? Ideas are the easiest thing in the world to get… Execution is hard.”

So no, I am not worried that lists of bad things that can happen will give ideas to people who want to do us harm. They’ve already got the ideas. We need the ideas out in the world so that we can think about them. Please do not, do not promulgate that myth. I think that myth is harmful and dangerous and keeps this stuff secret, and we’re all worried that we’ll talk about it, and the people who have the ideas, who are the bad guys, are the ones who are going to do all the thinking.

…I think we can invent a scenario, again a great movie plot, where this idea is so obscure and weird that you wouldn’t want to make it public. In general, in my world, we call this “security by obscurity” and we laugh at it. Right, you do not want something so fragile as the idea of the thing being what makes you secure. If that’s what makes you secure, you are not secure. Because I assure you somebody in the lab across the street or across the world is almost as good as you and will come up with the idea, if not today, in two weeks or in a month.

So you’re not going to get enough of a head start. In general, you make these things public so the good guys can think about them. We do not get security through secrecy. That is much too fragile. We get security through actual security.

Unlike other countries, the United States doesn’t have a federal bureaucracy for elections. The security we use is that we have people from each party sitting in the same room watching each other. When you go to vote, there are poll watchers from Republicans and Democrats, and they’re sitting at the table, and they’re sort of all there for this, if you think about it, very mid-1800s threat. And they’re great security against this mid-1800s threat, which is the way election stealing would happen.

We’re all going to watch each other, and if you do anything suspicious, I’m going to notice, and that’ll keep all of us honest. It doesn’t work against 21st century threats. And we are hurt by the fact that we don’t have a federal bureaucracy in charge of accuracy in elections.

About the show

The 80,000 Hours Podcast features unusually in-depth conversations about the world's most pressing problems and how you can use your career to solve them. We invite guests pursuing a wide range of career paths — from academics and activists to entrepreneurs and policymakers — to analyse the case for and against working on different issues and which approaches are best for solving them.

The 80,000 Hours Podcast is produced and edited by Keiran Harris. Get in touch with feedback or guest suggestions by emailing [email protected].

What should I listen to first?

We've carefully selected 10 episodes we think it could make sense to listen to first, on a separate podcast feed:

Check out 'Effective Altruism: An Introduction'

If you're new, see the podcast homepage for ideas on where to start, or browse our full episode archive.