
This podcast highlighted Sam Bankman-Fried as a positive example of someone ambitiously pursuing a high-impact career. To say the least, we no longer endorse that. See our statement for why.

The show’s host, Rob Wiblin, has also released some personal comments on this episode and the FTX bankruptcy on The 80,000 Hours Podcast feed, which you can listen to here.

If you really are trying to maximize your impact, then at what point do you start hitting decreasing marginal returns? Well, in terms of doing good, there’s no such thing: more good is more good. It’s not like you did some good, so good doesn’t matter anymore…

That means that you should be pretty aggressive with what you’re doing, and really trying to hit home runs rather than just have some impact — because the upside is just absolutely enormous.

Sam Bankman-Fried

If you were offered a 100% chance of $1 million to keep for yourself, or a 10% chance of $15 million — it makes total sense to play it safe. You’d be devastated if you lost, and barely happier if you won.

But if you were offered a 100% chance of donating $1 billion, or a 10% chance of donating $15 billion, you should just go with whatever has the highest expected value — that is, probability multiplied by the goodness of the outcome — and so swing for the fences.
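The arithmetic behind this comparison can be sketched in a few lines. This is a hypothetical illustration, not anything from the episode itself: the dollar amounts come from the thought experiment above, while the logarithmic utility model for personal wealth is an assumed stand-in for "barely happier if you won."

```python
import math

def expected_value(p, payoff):
    """Expected value: probability multiplied by the size of the payoff."""
    return p * payoff

# Personal wealth: a sure $1M vs. a 10% shot at $15M.
safe_personal = expected_value(1.0, 1_000_000)    # $1.0M
risky_personal = expected_value(0.1, 15_000_000)  # $1.5M

# The risky bet has the higher expected value in dollars...
assert risky_personal > safe_personal

# ...but under a concave (here, logarithmic) utility of personal wealth,
# the sure thing wins — which is why playing it safe makes sense for you.
safe_utility = math.log(1_000_000)
risky_utility = 0.1 * math.log(15_000_000)  # utility ~0 if you lose
assert safe_utility > risky_utility

# Donations: the good done scales roughly linearly in dollars, so expected
# dollars is the right yardstick and the risky option dominates.
safe_donation = expected_value(1.0, 1_000_000_000)    # $1.0B
risky_donation = expected_value(0.1, 15_000_000_000)  # $1.5B
assert risky_donation > safe_donation
```

The design point is that the same gamble flips depending on whether the thing you care about is concave in dollars (personal consumption) or roughly linear in dollars (philanthropy at these scales).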

This is the totally rational but rarely seen high-risk approach to philanthropy championed by today’s guest, Sam Bankman-Fried. Sam founded the cryptocurrency trading platform FTX, which has grown his wealth from around $1 million to $20,000 million.

Added 30 November 2022: What I meant to refer to as totally rational in the above paragraph is thinking about the ‘expected value’ of one’s actions, not maximizing expected dollar returns as if you were entirely ‘risk-neutral’. See clarifications on what I (Rob Wiblin) think about risk-aversion here.

Despite that, Sam still drives a Corolla and sleeps on a beanbag, because the only reason he started FTX was to make money to give it away. In 2020, when he was 5% as rich as he is now, he was nonetheless the second biggest individual donor to Joe Biden’s general election campaign.

In today’s conversation, Sam outlines how at every stage in FTX’s development, he and his team were able to choose the high-risk path to maximise expected value — precisely because they weren’t out to earn money for themselves.

This year his philanthropy has kicked into high gear with the launch of the FTX Future Fund, which has the initial ambition of giving away hundreds of millions a year and hopes to soon escalate to over a billion a year.

The Fund is run by previous guest of the show Nick Beckstead, and embodies the same risk-loving attitude Sam has learned from entrepreneurship and trading on financial markets. Unlike most foundations, the Future Fund:

  • Is open to supporting young people trying to get their first big break
  • Makes applying for a grant surprisingly straightforward
  • Is willing to make bets on projects it completely expects to fail, just because they have positive expected value.

Their website lists both areas of interest and more concrete project ideas they are looking to support. The hope is that these will inspire entrepreneurs to come forward, seize the mantle, and be the champions who actually make these things happen. Some of the project proposals are pretty natural, some might raise an eyebrow, and others are quirkier still.

While these ideas may seem pretty random, they all stem from a particular underlying moral and empirical vision that the Future Fund has laid out.

In this conversation, we speak with Sam about the hopes he and the Fund have for how the long-term future of humanity might go incredibly well, the fears they hold about how it could go incredibly badly, and what levers they might be able to pull to slightly nudge us towards the former.

Listeners who want to launch an ambitious project to improve humanity’s future should not only listen to the episode, but also look at the full list of the kind of things Sam and his colleagues are hoping to fund, see if they’re inspired, and if so, apply to get the ball rolling.

On top of that we also cover:

  • How Sam feels now about giving $5 million to Biden’s general election campaign
  • His fears and hopes for artificial intelligence
  • Whether or not blockchain technology actually has useful real-world applications
  • What lessons Sam learned from some serious early setbacks
  • Why he fears the effective altruism community is too conservative
  • Why Sam is as authentic now as he was before he was a celebrity
  • And much more.

Note: Sam has donated to 80,000 Hours in the past.

Get this episode by subscribing to our podcast on the world’s most pressing problems and how to solve them: type ‘80,000 Hours’ into your podcasting app. Or read the transcript below.

Producer: Keiran Harris
Audio mastering: Ben Cordell
Transcriptions: Katy Moore



Taking a high-risk approach to doing good

Sam Bankman-Fried: If your goal is to have impact on the world — and in particular if your goal is to maximize the amount of impact that you have on the world — that has pretty strong implications for what you end up doing. Among other things, if you really are trying to maximize your impact, then at what point do you start hitting decreasing marginal returns? Well, in terms of doing good, there’s no such thing: more good is more good. It’s not like you did some good, so good doesn’t matter anymore. But how about money? Are you able to donate so much that money doesn’t matter anymore? And the answer is, I don’t exactly know. But you’re thinking about the scale of the world there, right? At what point are you out of ways for the world to spend money to change?

Sam Bankman-Fried: There’s eight billion people. Government budgets run in the tens of trillions per year. It’s a really massive scale. You take one disease, and that’s a billion a year to help mitigate the effects of one tropical disease. So it’s unclear exactly what the answer is, but it’s at least billions per year probably, so at least 100 billion overall before you risk running out of good things to do with money. I think that’s actually a really powerful fact. That means that you should be pretty aggressive with what you’re doing, and really trying to hit home runs rather than just have some impact — because the upside is just absolutely enormous.

Sam Bankman-Fried: Your strategy is very different if you’re optimizing for making at least a million dollars, versus if you’re optimizing for just the linear amount that you make. One piece of that is that Alameda was a successful trading firm. Why bother with FTX? And the answer is, there was a big opportunity there that I wanted to go after and see what we could do there. It’s not like Alameda was doing well and so what’s the point, because it’s already doing well? No. There’s well, and then there’s better than well — there’s no reason to stop at just doing well.

Sam Bankman-Fried: So, if your goal is to maximize the expected value of the impact that you have, then I think it implies interesting things about how you should behave. And in particular, the expected value of how much impact you have, I think, is going to be a function sort of weighted towards upside tail cases. That’s what I think my prior would be. And if your impact is weighted towards upside tail cases, then what’s that probability distribution of impact probably look like? I think the odds are, it has decent weight on zero. Maybe majority weight.

Sam Bankman-Fried: So I think there are really compelling reasons to think that the “optimal strategy” to follow is one that probably fails — but if it doesn’t fail, it’s great. But as a community, what that would imply is this weird thing where you almost celebrate cases where someone completely craps out — where things end up nowhere close to what they could have been — because that’s what the majority of well-played strategies should end with. I don’t think that we recognize that enough as a community, and I think there are lots of specific instances as well where we don’t incentivize that.

Sam Bankman-Fried: There are all these cases where I think we give not enough attention to think about the high-upside impact you can have. Forget about the common paths and forget about even the probability of success for a sec. Just think about what would massive success look like, and what would maximize your odds of getting there — and then evaluate that path, because I think it’s a pretty plausible one.

Sam Bankman-Fried: I think that often does imply, I don’t know, should you be trying to become a US senator? That’s a question that you could ask. I think the answer’s like, “Well, maybe.” Actually, if you do the math, it seems plausible. But if you do follow that, probably you won’t be one.

Sam Bankman-Fried: But that’s not a path that we talk about very much. I think people often sort of round the odds of that to zero or something in their minds. And I think it’s like, not zero. And on the flip side, there’s too much emphasis, traditionally, on making a bit of money, without having thought hard about whether that’s what you should be doing or not. I think that’s maybe another side of this.

What people get wrong about Sam's success

Sam Bankman-Fried: I think for a lot of people, they just don’t have a model for how it happened. It’s just sort of this weird property of the world; it’s a little bit inexplicable. I don’t know, it happens sometimes: you look at someone and they have incredible success, and you’re like, “Huh. That person is really successful.” It’s sort of like when people think about why was Elon Musk so successful, or why is Jeff Bezos so successful? Most people don’t really have an answer for that, because they don’t even see it so much as a question they’re asking. It just is this weird property of the world, that they were.

Sam Bankman-Fried: But my felt sense — from having been through a lot of it — the first thing is that, to the extent there are multiplicative factors in what’s going on (and I do think there are) that your ultimate “how well you do” is a product of a lot of different things. One thing that implies is that, if it’s a product of four different things, then in order to get anywhere near the peak, you need to do well sort of at all of them. You need to be pretty good at all of them. It’s a high bar. You can’t skip leg day, so to speak. You can’t be like, “I’m going to be really good at some set of things and just ignore the others” — you just lose that multiplicative aspect of it.
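The "can't skip leg day" point can be made concrete with a toy calculation. The numbers below are invented for illustration; the only claim is the structural one, that when success is a product of factors, one weak factor drags down the whole product more than uniform mediocrity does.

```python
from math import prod

# Four hypothetical factors (each scored 0–1) that multiply together
# into overall success, per the "multiplicative factors" framing.
balanced = [0.8, 0.8, 0.8, 0.8]   # pretty good at everything
lopsided = [1.0, 1.0, 1.0, 0.1]   # excellent at three things, terrible at one

# Being merely "pretty good" across the board beats excellence
# with one glaring weakness: 0.8**4 ≈ 0.41 vs. 0.1.
assert prod(balanced) > prod(lopsided)
```

Under an additive model the lopsided profile would score higher (3.1 vs. 3.2 — nearly a tie), which is why it matters to figure out, as the passage says, which factors are actually multiplicative.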

Sam Bankman-Fried: I think it’s an important and a weird point. It’s not an absolute point. I don’t want to claim that in all cases, this is the right way to think about things or anything like that. What I’d say instead is something like, you should try and understand in which ways something is multiplicative — in which ways it is the case that, were that factor set really low, you’d be basically fucked. As opposed to, that’s just another factor among many.

Sam Bankman-Fried: What are some of those? One example of this, which I learned early on, is management. If you’re trying to scale something up big, and you’re very good at the object-level task but bad at managing people, and no one on the leadership team is good at managing people, it just becomes a mess. It almost doesn’t matter how good you are at the original thing — you’re not going to become great as a company. It’s really hard to substitute for that. It’s amazing how quickly things can go south, if organizational shit is not in a good state.

Sam Bankman-Fried: That was one example of a case where I originally didn’t particularly think of it as multiplicative, but I do think it was. And I learned that lesson eventually, that you can’t forget about that. I think there are a lot of other things like that that came up.

Sam Bankman-Fried: So we had to be good on a number of different realms. We had to be really ambitious. That was an important part of it. It was just so, so, so easy for us to fail to accomplish what we did, if we just decided our goal was a lot lower. Or in a lot of ways, just getting lazy when we started doing well and being like, “Ah, we’ve done well. No point trying anymore.”

Sam Bankman-Fried: But also, just a lot of strategic decisions, where it’s like, “Are we willing to take any risk in our trading?” If the answer is no, it’s going to really limit the amount of trading we can do, but it is a safer thing to do. That’s an example of a question that we had to face and make decisions about. Another part of this was just aiming high and remembering that — not so much aiming high, but aiming to maximize expected value, is really what I’d say.

Sam Bankman-Fried: I think the way I saw it was like, “Let’s maximize EV: whatever is the highest net expected value thing is what we should do.” As opposed to some super sublinear utility function, which is like, make sure that you continue on a moderately good path above all else, and then anything beyond that is gravy.

Sam Bankman-Fried: I do think those are probably the right choices, but they were scary. I think even more so than some chance of going bust, what they sort of entailed was that we had to have a lot of faith in ourselves almost — that they really would have had a significant chance of going bust if we didn’t play our cards exactly right. There were a lot of things that were balanced on a knife’s edge. Any amount of sloppiness would have been pretty bad. I also think it was a little bit of a thing of, could we play this really well?

The importance of taking responsibility as a CEO

Sam Bankman-Fried: I think that gets to another thing that I’ve ended up feeling is really, really important for running a company, and I think Holden [Karnofsky] was one of the people who sort of helped me realize this. If you’re running a company, and you assign Bob the task of turning the widget, and the widget doesn’t get turned, it’s very tempting for your takeaway to be like, “Fuck Bob. Bob failed.”

Rob Wiblin: And by blaming Bob, I’ve solved the problem.

Sam Bankman-Fried: Exactly. Right. Let’s put aside blaming Bob for a second. Maybe the blame isn’t helpful. Maybe it’s not. Probably it’s not, but let’s even ignore that part. It’s missing the bigger picture, which is that the widget still hasn’t been turned. The important thing is, it’s my fault if the widget ultimately doesn’t get turned. Nothing else changes that. I can do whatever sort of mental gymnastics I want, but in the end I have to make sure the widget gets turned. And my strategy of assigning it to Bob was maybe just the wrong strategy. And instead I should have assigned it to Bill, or Jill, or I don’t know.

Rob Wiblin: Two people.

Sam Bankman-Fried: Or reminded Bob, or hired somebody, or done it myself.

Rob Wiblin: Yeah. Yeah. Or motivate Bob differently.

Sam Bankman-Fried: Exactly. Who knows exactly what I should have done, but somehow, apparently I was not doing the right thing.

Rob Wiblin: Yeah. It’s a very constructive attitude. I’d love to do an episode at some point on how civil aviation became so safe, because it’s interesting from a risk management point of view. As far as I understand it, one important aspect of it was whenever they investigate a plane crash or an accident or anything like that, it’s never acceptable to have the bottom line be the pilot made a mistake.

Sam Bankman-Fried: Yep.

Rob Wiblin: Because the pilot is just a component of the plane that breaks like any other component sometimes. And you have to build the entire system around pilot failure, around human error. So if the pilot made a mistake and it caused a bad outcome, then it’s the system that’s broken, not the pilot. So you just view people like a piece of machinery in this, at least in this particular context. Not in a cruel way.

Sam Bankman-Fried: Yeah. I completely agree.

Sam's views on productive applications of blockchain and crypto

Sam Bankman-Fried: I don’t know for sure, but I do think that there will be a bunch. Some of these have to do with blockchain, some have to do with crypto, and some just have to do with market structure in a way that wouldn’t need to be crypto-specific, but I think often does turn out to be.

Sam Bankman-Fried: One thing that I feel pretty compelled by is just having equitable, direct access to financial markets. The current economic system is really difficult to get good access and outcomes from, for most people. If you want to go buy Apple stock and you’re a typical consumer, how many intermediaries do you think you’re going through, from start to finish?

Sam Bankman-Fried: It’s a few. It’s like 10. It’s a pretty impressive number. And what’s going on there, is basically that you go from the broker to a PFA firm, to an ATS, to another PFA firm, to an exchange. There’s a clearing firm, a custody firm, and then the whole thing is repeated on the other end. What that means is that your actual access to most markets is real crappy. Most people are literally not allowed to see the order books that they’re trading on.

Sam Bankman-Fried: You’re submitting orders blind. You’re trading kind of blind. You don’t see market data — that, you need to pay tens of millions of dollars a year for. That seems a little bit insane to me. One of the biggest points of markets is that you get price discovery from them. And if you’re not allowed to see the market data, that’s gating a really important piece of it behind tens of millions of dollars per year, per entity that wants to get market data. So that seems kind of fucked up to me. And I think it’s basically a serious problem with our current market structure for anyone but extremely sophisticated firms. And crypto, for a variety of reasons, is quite different in that respect.

Sam Bankman-Fried: Another piece is just payments. Payments infrastructure is really bad right now in most of the world. We casually give 3% of all of our purchases to credit card companies to cover over the fact that payments infrastructure sucks, and it takes months to clear. It’s just not a well-built system for most people. And I think, frankly, stablecoins actually just work a lot better on that front — to the point where if I want to send someone money, I would way rather send it via stablecoins than traditional systems. So I don’t feel at all conflicted about that.

Sam Bankman-Fried: So that’s one piece of this. If you want to send money back to someone in Nigeria, you’re probably paying 20% and taking a week. It’s a lot to lose on a remittance because of different payment rails in different countries, each one of which sucks. And I think blockchain stablecoins are a pretty good answer to that.

Sam Bankman-Fried: Then the last thing is an example I feel fairly compelled by: social media. So if I’m on Facebook and I want to message you on Twitter, it’s not going to pop up on your Twitter feed or in your DMs there. Those are completely non-interoperable networks. I actually think it’s a little bit weird that that’s the case. Why are there 30 social media networks, none of which can talk to each other? That’s a pretty bad user experience. And I think the one thing that we all — as a nation, as a world — can agree upon at this point, is that bad things happen when one person is the moderator for all of our content.

Sam Bankman-Fried: We tried seeing what happens when Facebook doesn’t censor, and everyone hated it. Then they tried censoring, and everyone hated it.

Rob Wiblin: Do you think the solution is some sort of pluralism in the interface or pluralism in the filtering or curation?

Sam Bankman-Fried: Right, but with the same underlying messaging protocol that everyone can draw from. So if you had on-blockchain encrypted messages, then any user experience could draw on that same set of messages — you can send someone a message from Twitter and it appears in their WhatsApp. That’s fine, so you get interoperability. And from a censorship point of view, anyone can build their own layer on top of it that does or doesn’t censor however they want, and there can be an actual competitive marketplace for it. So that’s a vision that I feel moderately compelled by for social media, as being better than the status quo.

Political giving

Rob Wiblin: Something that people have often suggested is that even though it’s only a billion, things like the presidential campaigns are a little bit saturated. They find it hard to figure out ways to spend more money, because so much of the influence is concentrated on a relatively small number of states with a relatively small number of swing voters. And so, just how many ads can you run on TV? How many times can you call these people, telling them to show up to vote? Maybe even $1 billion is actually getting you pretty close to finding it hard to spend more money.

Rob Wiblin: But then there’s tons of other political races that might be less important than the presidency, but are much less funded — where it’s very clear that your money really can shift the outcome. Do you have any thoughts on that?

Sam Bankman-Fried: I do. It’s definitely something I’ve heard. And my first response — which is not a super helpful response, but it is my first instinctual response — is that I agree one could argue that. Are you arguing that? Is that how you think the numbers turn out? It’s not how I think the numbers turn out. But I agree one could make that argument.

Sam Bankman-Fried: I feel like often when people make that argument, it’s a little motte-and-bailey sometimes, where they’re not actually trying to strongly claim that — or even weakly claim, or maybe even claim that that’s how they think the numbers turn out. But I want to drill down to like, are these people saying that they’ve done the math, and they think that it is not an effective use? Or are they just bringing up that there could be hypothetical worlds in which it was not an effective use?

Sam Bankman-Fried: My sense, when people make this argument, is that usually they are at least implicitly trying to make the argument that it is not a good use to donate. You could do both. Why not both, then? If the argument is that there are good things to do outside of the presidency, I completely agree with that. There are absolutely good things to do.

Rob Wiblin: But you don’t buy that there’s no way to spend more than $1 billion over an entire presidential campaign usefully.

Sam Bankman-Fried: That’s right. And putting aside the other things, when you look into things done by the experts in various fields — campaign operatives would be one example — do you have a sense of, in general, how impressive those things generally end up looking? I think often the state of the art is surprisingly shitty. And the answer is, oh boy, I agree it’s better than a monkey would do. It’s not literally random, but it’s not super impressive, given the stakes.

Rob Wiblin: I guess part of what might be going on here is that when people are thinking about shifting the spending from $1 billion to $2 billion on a presidential campaign, they are thinking about just scaling up exactly the things that they’re doing now. And you’re saying no, we should be thinking bigger. There’s a lot of other things that could be going on. There’s lots of ways we could improve the research, improve our understanding of what positions are good, and on and on and on. People need to expand their minds.

Sam Bankman-Fried: I think that’s right. It’s like, all right, yeah, if you do a really shit job, I agree. But what if you wanted to do a good job with that billion? Then do you think it would have impact?

Sam Bankman-Fried: And one thing to point to here is there’s some cool studies — and I don’t know how much faith to put in these — showing that at least to some extent, in some cases, the average campaign ad has net zero impact. Literally none. It’s unclear if it’s even net positive. And I think a lot of people’s takeaway from that is campaign ads don’t matter, and it’s not clear that’s the right takeaway. A different takeaway one could have is, “But what if you only look at the good campaign ads? Is it that every ad is centered around zero?” And I think the answer is basically no, that’s not what it is — they’re on both sides of zero. But what if you only did the ones on the right side of zero?

Sam Bankman-Fried: The amounts spent in primaries are small. If you have an opinion there, you can have impact. And one crazy fact is: you know which campaign almost went bankrupt in 2020, causing the candidate to drop out of the race?

Rob Wiblin: Biden?

Sam Bankman-Fried: Yeah, that’s right.

Rob Wiblin: And McCain as well, I think, back in 2008.

Sam Bankman-Fried: Yeah. It’s wild.

Rob Wiblin: That’s the margin you’re operating on sometimes.

Sam Bankman-Fried: Exactly. And so, again, I think it’s back to this “if it matters, it matters” thing: if anything matters here, then there are really impactful things to do. And I think it probably does matter. It is unlikely to be the case that the answer is that all candidates are equivalent. That’s not my best guess.

Possible Future Fund projects

Sam Bankman-Fried: There’s a lot. Really excited about them. Some of the things I’m most excited about are, on the pandemic side, I think there’s a lot of infrastructure that could be built. I think early detection is one piece of this. We didn’t know that COVID was happening until some number of months after it actually started happening. And that’s not great. So building systems to be able to detect pandemics early. Potentially building out frameworks for getting drugs to market quicker. I mean, you saw with COVID, how long was it from when we effectively had a vaccine that worked to when the first person got that vaccine? It’s like eight months or something like that. Which is kind of a while.

Rob Wiblin: I saw a draft of the website you’re putting together for the foundation, and you had a bunch of other interesting ideas that I haven’t heard promoted so much — projects where you are potentially looking for founders and you’re interested in funding them.

Rob Wiblin: One was trying to do talent scouting in the developing world: finding people who have amazing potential to become the next generation of top researchers in some area, and then pulling them out and giving them the best opportunities that they can get.

Rob Wiblin: You’re interested in starting a new newspaper that would have better integrity standards or better standards for accuracy than any existing newspaper, which would be extremely cool.

Rob Wiblin: Another one that I’ve thought about before — and I’m surprised it doesn’t exist already, because it doesn’t seem like it would be expensive — is basically just having really thorough polling of experts within lots of different domains on what their opinions are about relevant issues, things that affect people or affect policy or so on. It’s a bit surprising we don’t have a more systematic way of doing that.

Sam Bankman-Fried: Yeah, totally. This obviously starts to interface a little bit with prediction markets too, where we don’t have good infrastructure for getting consensus answers to hard but important questions. Obviously there are a lot of those. Let’s take early in COVID: what was the consensus on the infection fatality rate of COVID? There wasn’t an answer to that, right? There were just lots of incoherent, disconnected answers that differed by orders of magnitude, and clearly were not vetted across each other. That can’t be the best answer.

Sam Bankman-Fried: If only we had some consensus mechanisms. You can just have a marketplace for this. That’s what markets do: they take a lot of people’s different opinions on something and give a central order book to match those opinions against each other and see what consensus comes out. But even if you didn’t want to do that, you can just take a survey and average the results of experts — and even that we don’t really have infrastructure for. You can try to do it ad hoc, and some people have tried to do that in various cases, but it’s a mess. You’re trying to cold call people who you think might know something about it and ask their opinions. It’s bizarre that we don’t have better answers for this.
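The "just take a survey and average the experts" idea has a wrinkle worth noting: when estimates differ by orders of magnitude, as the IFR example describes, the choice of average matters. The numbers below are invented for illustration; a geometric mean is often a saner consensus than an arithmetic one for order-of-magnitude disagreements, since a single high outlier doesn't dominate it.

```python
import statistics

# Hypothetical early-pandemic IFR estimates (in %) from four experts,
# spanning more than an order of magnitude — as described in the episode.
estimates = [0.1, 0.3, 1.0, 3.0]

# Arithmetic mean is pulled toward the highest estimate.
arithmetic = statistics.mean(estimates)           # 1.1

# Geometric mean treats "3x too high" and "3x too low" symmetrically.
geometric = statistics.geometric_mean(estimates)  # ~0.55

assert geometric < arithmetic
```

A prediction market performs a version of this aggregation automatically, weighting each opinion by how much its holder is willing to stake on it.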

Rob Wiblin: Are there any other projects that might be a little bit unexpected or a little bit eclectic that you’d like to highlight for people?

Sam Bankman-Fried: You brought up, among other things, trying to recruit really promising people in, especially, developing world countries who would not otherwise have real access to opportunities, and give them those opportunities to do good in the world. That is one thing that I’d be super excited about. It’s not clear exactly what form that takes, but I think that it could just be really, really influential. That’s one that I think is super cool.

Sam Bankman-Fried: And you know what else, I think there’s a lot of requests for project and founder things, where if you have a good idea for something you want to happen, it’s just a really low bar. Come to us. And we’ll try and make it a really smooth process, like get rid of the trivial inconveniences that sometimes make something not come to fruition, see if that can help spur more things to action. That’s a type of thing that I’m pretty excited about.

Sam Bankman-Fried: What else? I think in politics there’s a lot, and in policy. It’s an enormously influential area, and it’s one that I think there aren’t enough effective altruists getting into right now, thinking about how they can have positive impact on policy in the States. I think that helping people get accustomed to it, figure out how to get involved, is something that we’ve been doing a bunch of somewhat behind the scenes, and excited to do more of. That’s a big area.

Sam Bankman-Fried: Then the last thing is just really trying to keep an open mind about what big projects might be great, and being willing to write a billion-dollar check if that turns out to be the right thing to do. If someone’s like, “For this area, here’s what’s blocking us” — like, “We’re not going to make progress until we have a great genotype-to-phenotype map, and it would cost a billion dollars to put that together.” I want to be in a position where we could say, “All right, we’ll think about it. And if that compels us and you seem like the right person to do it, then yeah, that’s a number that could be gotten.”

Sam's biggest uncertainties around his giving

Sam Bankman-Fried: One relevant factor is relative risk from bio versus AI versus nuclear, and how potentially preventable those are. I think that’s probably a factor of three uncertainty or five uncertainty, or something like that, in what the right thing to spend on is. I don’t think it drills down to one key assumption, probably; I think it’s a messy collection of them.

Sam Bankman-Fried: Maybe a bigger core thing is, as long as we don’t screw things up, we’re going to have a great outcome in the end versus how much you have to actively try as a world to end up in a great place. The difference between a really good future and the expected future — given that we make it to the future — are those effectively the same, or are those a factor of 10 to the 30 away from each other? I think that’s a big, big factor, because if they’re basically the same, then it’s all just about pure x-risk prevention: nothing else matters but making sure that we get there. If they’re a factor of 10 to the 30 apart, x-risk prevention is good, but it seems like maybe it’s even more important to try to see what we can do to have a great future. And that might be similar things, but it might be quite different things that you would prioritize.

Sam Bankman-Fried: So that’s one crucial consideration that I don’t feel confident about the answer to. I think different people have very different instincts about it, but that will have pretty important flow-through effects to all of this.

Rob Wiblin: Yeah. An example of something that might spill out of that kind of thinking is that it’s important to convince people that if humanity survives, we should do something really ambitious and great with our potential, rather than just being complacent and sitting on Earth and living our normal lives. Maybe we need to have an active advocacy movement around that.

Sam Bankman-Fried: Yep, that absolutely would be an example of it. Another key consideration here, which I know different people have different instincts on — I think I have a different instinct than much of the effective altruism community does — is how much various things have long-term flow-through effects. To give an example of that: how much does the president of the United States today impact the far future, conditional on no existential risk during that president’s term? Ignoring the effect on short-term nuclear war and things like that, how much does the general political environment have substantial flow-through effects, in expected value, on the far future? What that gets at in the end is this question of how path-dependent things are.

Sam Bankman-Fried: But other than a few very specific things like x-risk, how much is it the case that perturbations in what happens in the world are just not going to persist, versus how much is it the case that actually there’s a lot of different places we could end up, and who really knows what’s going to happen, and it really matters? And we should be really thinking hard about having a better versus worse environment today — discourse environment, intellectual environment — for diffuse long-term flow-through effects. That, I think, is one of the other crucial considerations that I’m not confident in, but I think matters quite a bit.

Should more or fewer people earn to give?

Sam Bankman-Fried: So there’s this thing where on the one hand, [my success is] evidence that maybe it’s easier to make a lot of money — and thus there’s going to be more money, and money is less needed. On the other hand, if it’s easier to make a lot of money, then maybe you should go make a lot of money, because it’s easier to do. It’s sort of two sides of the same coin.

Sam Bankman-Fried: And I think that I’m sort of compelled by both pieces of that. In the end I guess I don’t know which direction it points in more strongly. I don’t think it’s been a huge shift on net. I think what it does mean, though, is that if you’re not super excited about your earning-to-give career path, that’s a pretty bad sign for it. The thing it points most strongly against is grudging, low-upside earning to give, because you think it has to be the right thing to do. No — I think that it is a strong factor against that.

Sam Bankman-Fried: On the one hand, I think it means you should be really excited for potential massive earning-to-give opportunities — things you’re really excited about. On the other hand, it also means there’s more funding for projects, and you should be really excited to start a project that could use funding. And I don’t know exactly which is stronger. Maybe another factor here that does nontrivially lead to my feelings on this is that I think there are a lot of things to do with money. And I think I’m way on one end of that spectrum.

Sam Bankman-Fried: So in bio, how much could you usefully spend? I think it’s like a billion or two on an early detection center, maybe more over time. On fast pathways for vaccine development and release, I think you’re talking a few billion. It quickly adds up to 10 billion or something in the bio area for identifiable projects. Like a bunker: how much does that cost? Hundreds of millions.

Sam Bankman-Fried: AI things are harder to think about from a cost perspective. Not to say cost doesn’t matter there, but it’s a little weirder to think about because I think it’s a little more bimodal. I don’t know: either it’s just a question of how many servers you buy, or it’s not like that. And if it is, then that means there might be a gigantic money pit at the end for AI safety. But if not, maybe money just ends up being not super relevant.

Rob Wiblin: Ah, I see. So you’re saying that there could be this enormous money pit of tens of billions of dollars or more if it really matters who can buy lots of compute at some essential time when AI is making big advances. But if that’s not the case, then it can be a lot harder to see where you could spend tens of billions.

Sam Bankman-Fried: Exactly. AI is more of a thinking thing than a money thing, outside of that. But that might be a real factor.

Sam Bankman-Fried: Then you look at, I don’t know, politics and policy. I’m pretty compelled that if you think it matters — and again, if you don’t think it matters, then obviously the amount that you can spend on that is zero — but if you do think it matters, the kind of numbers you’re talking about are a billion every two years or something like that, that could potentially be usefully spent. And that’s a fair bit. So you should probably think of it as: a billion every few years is like the equivalent of 10 billion today, or something. I don’t know, I’m making that up.
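One way to make rough sense of that equivalence (this is an illustration, not a calculation from the episode) is to value an indefinite stream of recurring spending at its present value under an assumed discount rate. Treating "a billion every two years" as $0.5 billion per year and assuming a 5% annual discount rate, the perpetuity formula gives about $10 billion today:

```python
# Back-of-envelope present value of recurring philanthropic spending.
# Assumptions (not from the episode): a 5% annual discount rate, and
# $1 billion spent every two years, i.e. $0.5 billion per year forever.
annual_spend = 1e9 / 2    # dollars per year
discount_rate = 0.05      # assumed annual discount rate

# Present value of a perpetuity paying `annual_spend` each year: spend / rate
present_value = annual_spend / discount_rate

print(f"${present_value / 1e9:.0f} billion")
```

Under these assumptions the stream is worth roughly $10 billion in today's terms, which lines up with the ballpark figure above; a higher discount rate would shrink that number proportionally.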

Sam Bankman-Fried: But putting these together, a few billion a year is… I don’t want to say it’s a lower bound, but that certainly isn’t my upper bound on this. I certainly think it could get a lot bigger than a few billion a year that could be really usefully spent. I don’t know if you could spend 10 billion a year really usefully. It actually wouldn’t completely shock me if that turned out to be true, but it maybe would surprise me a little bit. But yeah, I think the numbers are big.


About the show

The 80,000 Hours Podcast features unusually in-depth conversations about the world's most pressing problems and how you can use your career to solve them. We invite guests pursuing a wide range of career paths — from academics and activists to entrepreneurs and policymakers — to analyse the case for and against working on different issues and which approaches are best for solving them.

The 80,000 Hours Podcast is produced and edited by Keiran Harris. Get in touch with feedback or guest suggestions by emailing [email protected].

What should I listen to first?

We've carefully selected 10 episodes we think it could make sense to listen to first, on a separate podcast feed:

Check out 'Effective Altruism: An Introduction'

If you're new, see the podcast homepage for ideas on where to start, or browse our full episode archive.