Enjoyed the episode? Want to listen later? Subscribe by searching 80,000 Hours wherever you get your podcasts, or click one of the buttons below:

If you really are trying to maximize your impact, then at what point do you start hitting decreasing marginal returns? Well, in terms of doing good, there’s no such thing: more good is more good. It’s not like you did some good, so good doesn’t matter anymore…

That means that you should be pretty aggressive with what you’re doing, and really trying to hit home runs rather than just have some impact — because the upside is just absolutely enormous.

Sam Bankman-Fried

If you were offered a 100% chance of $1 million to keep for yourself, or a 10% chance of $15 million — it makes total sense to play it safe. You’d be devastated if you lost, and barely happier if you won.

But if you were offered a 100% chance of donating $1 billion, or a 10% chance of donating $15 billion, you should just go with whatever has the highest expected value — that is, probability multiplied by the goodness of the outcome — and so swing for the fences.
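The asymmetry between the two cases can be made concrete with a toy model. A minimal sketch, where the log utility curve and the $100k baseline wealth are illustrative assumptions standing in for risk aversion, not anything stated in the episode:

```python
import math

def expected_value(outcomes):
    """Expected value of a gamble given (probability, amount) pairs."""
    return sum(p * x for p, x in outcomes)

def expected_log_utility(outcomes, baseline=100_000):
    """Expected utility under a concave (log) utility of total wealth.

    An illustrative assumption standing in for "you'd be devastated
    if you lost, and barely happier if you won."
    """
    return sum(p * math.log(baseline + x) for p, x in outcomes)

safe = [(1.0, 1_000_000)]
risky = [(0.1, 15_000_000), (0.9, 0)]

# Keeping the money: the gamble has the higher raw EV ($1.5M vs $1M)...
ev_safe, ev_risky = expected_value(safe), expected_value(risky)

# ...but under a risk-averse (concave) utility, the sure thing wins.
prefers_safe = expected_log_utility(safe) > expected_log_utility(risky)

# Donating the money: impact is roughly linear in dollars, so you just
# compare expected values, and the risky option wins at any scale.
safe_bn = [(1.0, 1_000_000_000)]
risky_bn = [(0.1, 15_000_000_000), (0.9, 0)]
prefers_risky_donation = expected_value(risky_bn) > expected_value(safe_bn)
```

The same 10%-of-15x gamble flips from a bad bet to a good one purely because donated dollars don't hit diminishing returns the way personal wealth does.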

This is the totally rational but rarely seen high-risk approach to philanthropy championed by today’s guest, Sam Bankman-Fried. Sam founded the cryptocurrency trading platform FTX, which has grown his wealth from around $1 million to $20,000 million.

Despite that, Sam still drives a Corolla and sleeps on a beanbag, because the only reason he started FTX was to make money to give it away. In 2020, when he was 5% as rich as he is now, he was nonetheless the second biggest individual donor to Joe Biden’s general election campaign.

In today’s conversation, Sam outlines how at every stage in FTX’s development, he and his team were able to choose the high-risk path to maximise expected value — precisely because they weren’t out to earn money for themselves.

This year his philanthropy has kicked into high gear with the launch of the FTX Future Fund, which has the initial ambition of giving away hundreds of millions a year and hopes to soon escalate to over a billion a year.

The Fund is run by previous guest of the show Nick Beckstead, and embodies the same risk-loving attitude Sam has learned from entrepreneurship and trading on financial markets. Unlike most foundations, the Future Fund:

  • Is open to supporting young people trying to get their first big break
  • Makes applying for a grant surprisingly straightforward
  • Is willing to make bets on projects it fully expects to fail, simply because they have positive expected value

Their website lists both areas of interest and more concrete project ideas they are looking to support. The hope is that these will inspire entrepreneurs to come forward, seize the mantle, and be the champions who actually make these things happen. Some of the project proposals are pretty natural, some might raise an eyebrow, and others are quirkier still.

While these ideas may seem pretty random, they all stem from a particular underlying moral and empirical vision that the Future Fund has laid out.

In this conversation, we speak with Sam about the hopes he and the Fund have for how the long-term future of humanity might go incredibly well, the fears they hold about how it could go incredibly badly, and what levers they might be able to pull to slightly nudge us towards the former.

Listeners who want to launch an ambitious project to improve humanity’s future should not only listen to the episode, but also look at the full list of the kind of things Sam and his colleagues are hoping to fund, see if they’re inspired, and if so, apply to get the ball rolling.

On top of that we also cover:

  • How Sam feels now about giving $5 million to Biden’s general election campaign
  • His fears and hopes for artificial intelligence
  • Whether or not blockchain technology actually has useful real-world applications
  • What lessons Sam learned from some serious early setbacks
  • Why he fears the effective altruism community is too conservative
  • Why Sam is as authentic now as he was before he was a celebrity
  • And much more.

Note: Sam has donated to 80,000 Hours in the past.

Get this episode by subscribing to our podcast on the world’s most pressing problems and how to solve them: type ‘80,000 Hours’ into your podcasting app. Or read the transcript below.

Producer: Keiran Harris
Audio mastering: Ben Cordell
Transcriptions: Katy Moore

Highlights

Taking a high-risk approach to doing good

Sam Bankman-Fried: If your goal is to have impact on the world — and in particular if your goal is to maximize the amount of impact that you have on the world — that has pretty strong implications for what you end up doing. Among other things, if you really are trying to maximize your impact, then at what point do you start hitting decreasing marginal returns? Well, in terms of doing good, there’s no such thing: more good is more good. It’s not like you did some good, so good doesn’t matter anymore. But how about money? Are you able to donate so much that money doesn’t matter anymore? And the answer is, I don’t exactly know. But you’re thinking about the scale of the world there, right? At what point are you out of ways for the world to spend money to change?

Sam Bankman-Fried: There’s eight billion people. Government budgets run in the tens of trillions per year. It’s a really massive scale. You take one disease, and that’s a billion a year to help mitigate the effects of one tropical disease. So it’s unclear exactly what the answer is, but it’s at least billions per year probably, so at least 100 billion overall before you risk running out of good things to do with money. I think that’s actually a really powerful fact. That means that you should be pretty aggressive with what you’re doing, and really trying to hit home runs rather than just have some impact — because the upside is just absolutely enormous.

Sam Bankman-Fried: Your strategy is very different if you’re optimizing for making at least a million dollars, versus if you’re optimizing for just the linear amount that you make. One piece of that is that Alameda was a successful trading firm. Why bother with FTX? And the answer is, there was a big opportunity there that I wanted to go after and see what we could do there. It’s not like Alameda was doing well and so what’s the point, because it’s already doing well? No. There’s well, and then there’s better than well — there’s no reason to stop at just doing well.


Sam Bankman-Fried: So, if your goal is to maximize the expected value of the impact that you have, then I think it implies interesting things about how you should behave. And in particular, the expected value of how much impact you have, I think, is going to be a function sort of weighted towards upside tail cases. That’s what I think my prior would be. And if your impact is weighted towards upside tail cases, then what’s that probability distribution of impact probably look like? I think the odds are, it has decent weight on zero. Maybe majority weight.

Sam Bankman-Fried: So I think there are really compelling reasons to think that the “optimal strategy” to follow is one that probably fails — but if it doesn’t fail, it’s great. But as a community, what that would imply is this weird thing where you almost celebrate cases where someone completely craps out — where things end up nowhere close to what they could have been — because that’s what the majority of well-played strategies should end with. I don’t think that we recognize that enough as a community, and I think there are lots of specific instances as well where we don’t incentivize that.

Sam Bankman-Fried: There are all these cases where I think we give not enough attention to think about the high-upside impact you can have. Forget about the common paths and forget about even the probability of success for a sec. Just think about what would massive success look like, and what would maximize your odds of getting there — and then evaluate that path, because I think it’s a pretty plausible one.

Sam Bankman-Fried: I think that often does imply, I don’t know, should you be trying to become a US senator? That’s a question that you could ask. I think the answer’s like, “Well, maybe.” Actually, if you do the math, it seems plausible. But if you do follow that, probably you won’t be one.

Sam Bankman-Fried: But that’s not a path that we talk about very much. I think people often sort of round the odds of that to zero or something in their minds. And I think it’s like, not zero. And on the flip side, there’s too much emphasis, traditionally, on making a bit of money, without having thought hard about whether that’s what you should be doing or not. I think that’s maybe another side of this.

What people get wrong about Sam's success

Sam Bankman-Fried: I think for a lot of people, they just don’t have a model for how it happened. It’s just sort of this weird property of the world; it’s a little bit inexplicable. I don’t know, it happens sometimes: you look at someone and they have incredible success, and you’re like, “Huh. That person is really successful.” It’s sort of like when people think about why was Elon Musk so successful, or why is Jeff Bezos so successful? Most people don’t really have an answer for that, because they don’t even see it so much as a question they’re asking. It just is this weird property of the world, that they were.

Sam Bankman-Fried: But my felt sense — from having been through a lot of it — the first thing is that, to the extent there are multiplicative factors in what’s going on (and I do think there are) that your ultimate “how well you do” is a product of a lot of different things. One thing that implies is that, if it’s a product of four different things, then in order to get anywhere near the peak, you need to do well sort of at all of them. You need to be pretty good at all of them. It’s a high bar. You can’t skip leg day, so to speak. You can’t be like, “I’m going to be really good at some set of things and just ignore the others” — you just lose that multiplicative aspect of it.

Sam Bankman-Fried: I think it’s an important and a weird point. It’s not an absolute point. I don’t want to claim that in all cases, this is the right way to think about things or anything like that. What I’d say instead is something like, you should try and understand in which ways something is multiplicative — in which ways it is the case that, were that factor set really low, you’d be basically fucked. As opposed to, that’s just another factor among many.

Sam Bankman-Fried: What are some of those? One example of this, which I learned early on, is management. If you’re trying to scale something up big, and you’re very good at the object-level task but bad at managing people, and no one on the leadership team is good at managing people, it just becomes a mess. It almost doesn’t matter how good you are at the original thing — you’re not going to become great as a company. It’s really hard to substitute for that. It’s amazing how quickly things can go south, if organizational shit is not in a good state.

Sam Bankman-Fried: That was one example of a case where I originally didn’t particularly think of it as multiplicative, but I do think it was. And I learned that lesson eventually, that you can’t forget about that. I think there are a lot of other things like that that came up.

Sam Bankman-Fried: So we had to be good on a number of different realms. We had to be really ambitious. That was an important part of it. It was just so, so, so easy for us to fail to accomplish what we did, if we just decided our goal was a lot lower. Or in a lot of ways, just getting lazy when we started doing well and being like, “Ah, we’ve done well. No point trying anymore.”

Sam Bankman-Fried: But also, just a lot of strategic decisions, where it’s like, “Are we willing to take any risk in our trading?” If the answer is no, it’s going to really limit the amount of trading we can do, but it is a safer thing to do. That’s an example of a question that we had to face and make decisions about. Another part of this was just aiming high and remembering that — not so much aiming high, but aiming to maximize expected value, is really what I’d say.

Sam Bankman-Fried: I think the way I saw it was like, “Let’s maximize EV: whatever is the highest net expected value thing is what we should do.” As opposed to some super sublinear utility function, which is like, make sure that you continue on a moderately good path above all else, and then anything beyond that is gravy.

Sam Bankman-Fried: I do think those are probably the right choices, but they were scary. I think even more so than some chance of going bust, what they sort of entailed was that we had to have a lot of faith in ourselves almost — that they really would have had a significant chance of going bust if we didn’t play our cards exactly right. There were a lot of things that were balanced on a knife’s edge. Any amount of sloppiness would have been pretty bad. I also think it was a little bit of a thing of, could we play this really well?

The importance of taking responsibility as a CEO

Sam Bankman-Fried: I think that gets to another thing that I’ve ended up feeling is really, really important for running a company, and I think Holden [Karnofsky] was one of the people who sort of helped me realize this. If you’re running a company, and you assign Bob the task of turning the widget, and the widget doesn’t get turned, it’s very tempting for your takeaway to be like, “Fuck Bob. Bob failed.”

Rob Wiblin: And by blaming Bob, I’ve solved the problem.

Sam Bankman-Fried: Exactly. Right. Let’s put aside blaming Bob for a second. Maybe the blame isn’t helpful. Maybe it’s not. Probably it’s not, but let’s even ignore that part. It’s missing the bigger picture, which is that the widget still hasn’t been turned. The important thing is, it’s my fault if the widget ultimately doesn’t get turned. Nothing else changes that. I can do whatever sort of mental gymnastics I want, but in the end I have to make sure the widget gets turned. And my strategy of assigning it to Bob was maybe just the wrong strategy. And instead I should have assigned it to Bill, or Jill, or I don’t know.

Rob Wiblin: Two people.

Sam Bankman-Fried: Or reminded Bob, or hired somebody, or done it myself.

Rob Wiblin: Yeah. Yeah. Or motivate Bob differently.

Sam Bankman-Fried: Exactly. Who knows exactly what I should have done, but somehow, apparently I was not doing the right thing.

Rob Wiblin: Yeah. It’s a very constructive attitude. I’d love to do an episode at some point on how civil aviation became so safe, because it’s interesting from a risk management point of view. As far as I understand it, one important aspect of it is that whenever they investigate a plane crash or an accident or anything like that, it’s never acceptable to have the bottom line be that the pilot made a mistake.

Sam Bankman-Fried: Yep.

Rob Wiblin: Because the pilot is just a component of the plane that breaks like any other component sometimes. And you have to build the entire system around pilot failure, around human error. So if the pilot made a mistake and it caused a bad outcome, then it’s the system that’s broken, not the pilot. So you just view people like a piece of machinery in this, at least in this particular context. Not in a cruel way.

Sam Bankman-Fried: Yeah. I completely agree.

Sam's views on productive applications of blockchain and crypto

Sam Bankman-Fried: I don’t know for sure, but I do think that there will be a bunch. Some of these have to do with blockchain, some have to do with crypto, and some just have to do with market structure in a way that wouldn’t need to be crypto-specific, but I think often does turn out to be.

Sam Bankman-Fried: One thing that I feel pretty compelled by is just having equitable, direct access to financial markets. The current economic system is really difficult to get good access and outcomes from, for most people. If you want to go buy Apple stock and you’re a typical consumer, how many intermediaries do you think you’re going through, from start to finish?

Sam Bankman-Fried: It’s a few. It’s like 10. It’s a pretty impressive number. And what’s going on there is basically that you go from the broker to a payment-for-order-flow firm, to an ATS, to another payment-for-order-flow firm, to an exchange. There’s a clearing firm, a custody firm, and then the whole thing is repeated on the other end. What that means is that your actual access to most markets is real crappy. Most people are literally not allowed to see the order books that they’re trading on.

Sam Bankman-Fried: You’re submitting orders blind. You’re trading kind of blind. You don’t see market data — that, you need to pay tens of millions of dollars a year for. That seems a little bit insane to me. One of the biggest points of markets is that you get price discovery from them. And if you’re not allowed to see the market data, that’s gating a really important piece of it behind tens of millions of dollars per year, per entity that wants to get market data. So that seems kind of fucked up to me. And I think it’s basically a serious problem with our current market structure for anyone but extremely sophisticated firms. And crypto, for a variety of reasons, is quite different in that respect.


Sam Bankman-Fried: Another piece is just payments. Payments infrastructure is really bad right now in most of the world. We casually give 3% of all of our purchases to credit card companies to cover over the fact that payments infrastructure sucks, and it takes months to clear. It’s just not a well-built system for most people. And I think, frankly, stablecoins actually just work a lot better on that front — to the point where if I want to send someone money, I would way rather send it via stablecoins than traditional systems. So I don’t feel at all conflicted about that.

Sam Bankman-Fried: So that’s one piece of this. If you want to send money back to someone in Nigeria, you’re probably paying 20% and taking a week. It’s a lot to lose on a remittance because of different payment rails in different countries, each one of which sucks. And I think blockchain stablecoins are a pretty good answer to that.


Sam Bankman-Fried: Then the last thing is an example I feel fairly compelled by: social media. So if I’m on Facebook and I want to message you on Twitter, it’s not going to pop up on your Twitter feed or in your DMs there. Those are completely non-interoperable networks. I actually think it’s a little bit weird that that’s the case. Why are there 30 social media networks, none of which can talk to each other? That’s a pretty bad user experience. And I think the one thing that we all — as a nation, as a world — can agree upon at this point, is that bad things happen when one person is the moderator for all of our content.

Sam Bankman-Fried: We tried seeing what happens when Facebook doesn’t censor, and everyone hated it. Then they tried censoring, and everyone hated it.

Rob Wiblin: Do you think the solution is some sort of pluralism in the interface or pluralism in the filtering or curation?

Sam Bankman-Fried: Right, but with the same underlying messaging protocol that everyone can draw from. So if you had on-blockchain encrypted messages, then any user experience could draw on that same set of messages — you can send someone a message from Twitter and it appears in their WhatsApp. That’s fine, so you get interoperability. And from a censorship point of view, anyone can build their own layer on top of it that does or doesn’t censor however they want, and there can be an actual competitive marketplace for it. So that’s a vision that I feel moderately compelled by for social media, as being better than the status quo.

Political giving

Rob Wiblin: Something that people have often suggested is that even though it’s only a billion, things like the presidential campaigns are a little bit saturated. They find it hard to figure out ways to spend more money, because so much of the influence is concentrated on a relatively small number of states with a relatively small number of swing voters. And so, just how many ads can you run on TV? How many times can you call these people, telling them to show up to vote? Maybe even $1 billion is actually getting you pretty close to finding it hard to spend more money.

Rob Wiblin: But then there’s tons of other political races that might be less important than the presidency, but are much less funded — where it’s very clear that your money really can shift the outcome. Do you have any thoughts on that?

Sam Bankman-Fried: I do. It’s definitely something I’ve heard. And my first response — which is not a super helpful response, but it is my first instinctual response — is that I agree one could argue that. Are you arguing that? Is that how you think the numbers turn out? It’s not how I think the numbers turn out. But I agree one could make that argument.

Sam Bankman-Fried: I feel like often when people make that argument, it’s a little motte-and-bailey sometimes, where they’re not actually trying to strongly claim that — or even weakly claim, or maybe even claim that that’s how they think the numbers turn out. But I want to drill down to like, are these people saying that they’ve done the math, and they think that it is not an effective use? Or are they just bringing up that there could be hypothetical worlds in which it was not an effective use?

Sam Bankman-Fried: My sense, when people make this argument, is that usually they are at least implicitly trying to make the argument that it is not a good use to donate. You could do both. Why not both, then? If the argument is that there are good things to do outside of the presidency, I completely agree with that. There are absolutely good things to do.

Rob Wiblin: But you don’t buy that there’s no way to spend more than $1 billion over an entire presidential campaign usefully.

Sam Bankman-Fried: That’s right. And putting aside the other things, when you look into things done by the experts in various fields — campaign operatives would be one example — do you have a sense of, in general, how impressive those things generally end up looking? I think often the state of the art is surprisingly shitty. And the answer is, oh boy, I agree it’s better than a monkey would do. It’s not literally random, but it’s not super impressive, given the stakes.

Rob Wiblin: I guess part of what might be going on here is that when people are thinking about shifting the spending from $1 billion to $2 billion on a presidential campaign, they are thinking about just scaling up exactly the things that they’re doing now. And you’re saying no, we should be thinking bigger. There’s a lot of other things that could be going on. There’s lots of ways we could improve the research, improve our understanding of what positions are good, and on and on and on. People need to expand their minds.

Sam Bankman-Fried: I think that’s right. It’s like, all right, yeah, if you do a really shit job, I agree. But what if you wanted to do a good job with that billion? Then do you think it would have impact?

Sam Bankman-Fried: And one thing to point to here is there’s some cool studies — and I don’t know how much faith to put in these — showing that at least to some extent, in some cases, the average campaign ad has net zero impact. Literally none. It’s unclear if it’s even net positive. And I think a lot of people’s takeaway from that is campaign ads don’t matter, and it’s not clear that’s the right takeaway. A different takeaway one could have is, “But what if you only look at the good campaign ads? Is it that every ad is centered around zero?” And I think the answer is basically no, that’s not what it is — they’re on both sides of zero. But what if you only did the ones on the right side of zero?
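Sam's point, that a zero average can hide usable variation, can be sketched with made-up effect sizes. The numbers below are purely illustrative, not from any study he cites:

```python
# Hypothetical persuasion effects (percentage-point vote shift) for ten ads.
# They straddle zero, so the average is ~0 -- not because every ad is
# worthless, but because good and bad ads cancel out.
ad_effects = [-0.4, -0.3, -0.2, -0.1, 0.0, 0.1, 0.2, 0.3, 0.35, 0.05]

average_all = sum(ad_effects) / len(ad_effects)

# If you could identify and run only the positive-effect ads, the
# average effect of what you actually air is clearly positive.
good_ads = [e for e in ad_effects if e > 0]
average_good = sum(good_ads) / len(good_ads)
```

Selecting on the right side of zero is exactly the move the "ads don't matter" takeaway misses.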


Sam Bankman-Fried: The amounts spent in primaries are small. If you have an opinion there, you can have impact. And one crazy fact is: you know which campaign almost went bankrupt in 2020, nearly forcing the candidate to drop out of the race?

Rob Wiblin: Biden?

Sam Bankman-Fried: Yeah, that’s right.

Rob Wiblin: And McCain as well, I think, back in 2008.

Sam Bankman-Fried: Yeah. It’s wild.

Rob Wiblin: That’s the margin you’re operating on sometimes.

Sam Bankman-Fried: Exactly. And so, again, I think it’s back to this “if it matters, it matters” thing: if anything matters here, then there are really impactful things to do. And I think it probably does matter. It is unlikely to be the case that the answer is that all candidates are equivalent. That’s not my best guess.

Possible Future Fund projects

Sam Bankman-Fried: There’s a lot. Really excited about them. Some of the things I’m most excited about are, on the pandemic side, I think there’s a lot of infrastructure that could be built. I think early detection is one piece of this. We didn’t know that COVID was happening until some number of months after it actually started happening. And that’s not great. So building systems to be able to detect pandemics early. Potentially building out frameworks for getting drugs to market quicker. I mean, you saw with COVID, how long was it from when we effectively had a vaccine that worked to when the first person got that vaccine? It’s like eight months or something like that. Which is kind of a while.

Rob Wiblin: I saw a draft of the website you’re putting together for the foundation, and you had a bunch of other interesting ideas that I haven’t heard promoted so much — projects where you are potentially looking for founders and you’re interested in funding them.

Rob Wiblin: One was trying to do talent scouting in the developing world: finding people who have amazing potential to become the next generation of top researchers in some area, and then pulling them out and giving them the best opportunities that they can get.

Rob Wiblin: You’re interested in starting a new newspaper that would have better integrity standards or better standards for accuracy than any existing newspaper, which would be extremely cool.

Rob Wiblin: Another one that I’ve thought about before — and I’m surprised it doesn’t exist already, because it doesn’t seem like it would be expensive — is basically just having really thorough polling of experts within lots of different domains on what their opinions are about relevant issues, things that affect people or affect policy or so on. It’s a bit surprising we don’t have a more systematic way of doing that.

Sam Bankman-Fried: Yeah, totally. This obviously starts to interface a little bit with prediction markets too, where we don’t have good infrastructure for getting consensus answers to hard but important questions. Obviously there are a lot of those. Let’s take early in COVID: what was the consensus on the infection fatality rate of COVID? There wasn’t an answer to that, right? There were just lots of incoherent, disconnected answers that differed by orders of magnitude, and clearly were not vetted across each other. That can’t be the best answer.

Sam Bankman-Fried: If only we had some consensus mechanisms. You can just have a marketplace for this. That’s what markets do: they take a lot of people’s different opinions on something and give a central order book to match those opinions against each other and see what consensus comes out. But even if you didn’t want to do that, you can just take a survey and average the results of experts — and even that we don’t really have infrastructure for. You can try to do it ad hoc, and some people have tried to do that in various cases, but it’s a mess. You’re trying to cold call people who you think might know something about it and ask their opinions. It’s bizarre that we don’t have better answers for this.
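The lightweight survey-and-average aggregation Sam describes can be sketched in a few lines. The estimates below are invented for illustration; for quantities that spread across orders of magnitude, as early IFR estimates did, a median or geometric mean is usually a saner consensus than an arithmetic mean:

```python
import math
import statistics

# Invented expert estimates of an infection fatality rate (as fractions),
# deliberately spread across an order of magnitude or more.
estimates = [0.0005, 0.001, 0.003, 0.005, 0.01, 0.03]

arithmetic_mean = statistics.mean(estimates)  # pulled up by the 3% outlier
median = statistics.median(estimates)

# Geometric mean: average in log space, then exponentiate. Far less
# dominated by the largest estimate than the arithmetic mean.
geometric_mean = math.exp(statistics.mean(math.log(e) for e in estimates))
```

Even infrastructure this simple, run systematically instead of by ad hoc cold-calling, would have produced a usable consensus number.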

Rob Wiblin: Are there any other projects that might be a little bit unexpected or a little bit eclectic that you’d like to highlight for people?

Sam Bankman-Fried: You brought up, among other things, trying to recruit really promising people in, especially, developing world countries who would not otherwise have real access to opportunities, and give them those opportunities to do good in the world. That is one thing that I’d be super excited about. It’s not clear exactly what form that takes, but I think that it could just be really, really influential. That’s one that I think is super cool.

Sam Bankman-Fried: And you know what else? I think there are a lot of requests for projects and founders, where if you have a good idea for something you want to happen, it’s just a really low bar: come to us. And we’ll try and make it a really smooth process, getting rid of the trivial inconveniences that sometimes make something not come to fruition, and see if that can help spur more things to action. That’s the type of thing I’m pretty excited about.

Sam Bankman-Fried: What else? I think in politics there’s a lot, and in policy. It’s an enormously influential area, and it’s one that I think there aren’t enough effective altruists getting into right now, thinking about how they can have positive impact on policy in the States. I think that helping people get accustomed to it, figure out how to get involved, is something that we’ve been doing a bunch of somewhat behind the scenes, and excited to do more of. That’s a big area.

Sam Bankman-Fried: Then the last thing is just really trying to keep an open mind about what big projects might be great, and being willing to write a billion-dollar check if that turns out to be the right thing to do. If someone’s like, “For this area, here’s what’s blocking us” — like, “We’re not going to make progress until we have a great genotype-to-phenotype map, and it would cost a billion dollars to put that together.” I want to be in a position where we could say, “All right, we’ll think about it. And if that compels us and you seem like the right person to do it, then yeah, that’s a number that could be gotten.”

Sam's biggest uncertainties around his giving

Sam Bankman-Fried: One relevant factor is relative risk from bio versus AI versus nuclear, and how potentially preventable those are. I think that’s probably a factor of three uncertainty or five uncertainty, or something like that, in what the right thing to spend on is. I don’t think it drills down to one key assumption, probably; I think it’s a messy collection of them.

Sam Bankman-Fried: Maybe a bigger core thing is: how much is it the case that, as long as we don’t screw things up, we’re going to have a great outcome in the end, versus how much do you have to actively try as a world to end up in a great place? The difference between a really good future and the expected future — given that we make it to the future — are those effectively the same, or are those a factor of 10 to the 30 away from each other? I think that’s a big, big factor, because if they’re basically the same, then it’s all just about pure x-risk prevention: nothing else matters but making sure that we get there. If they’re a factor of 10 to the 30 apart, x-risk prevention is good, but it seems like maybe it’s even more important to try to see what we can do to have a great future. And that might be similar things, but it might be quite different things that you would prioritize.

Sam Bankman-Fried: So that’s one crucial consideration that I don’t feel confident about the answer to. I think different people have very different instincts about it, but that will have pretty important flow-through effects to all of this.

Rob Wiblin: Yeah. An example of something that might spill out of that kind of thinking is that it’s important to convince people that if humanity survives, we should do something really ambitious and great with our potential, rather than just being complacent and sitting on Earth and living our normal lives. Maybe we need to have an active advocacy movement around that.

Sam Bankman-Fried: Yep, that absolutely would be an example of it. Another key consideration here, which I know different people have different instincts on — I think I have a different instinct than much of the effective altruism community does — is how much various things have long-term flow-through effects. And to give some example of that, how much does the president of the United States today impact the far future, conditional on no existential risk during that president’s term? Ignoring the effect on short-term nuclear war and things like that, how much does the general political environment have, in expected value, substantial flow-through effects to the far future? One thing that gets to in the end is this question of how path-dependent things are.

Sam Bankman-Fried: But other than a few very specific things like x-risk, how much is it the case that perturbations in what happens in the world are just not going to persist, versus how much is it the case that actually there’s a lot of different places we could end up, and who really knows what’s going to happen, and it really matters? And we should be really thinking hard about having a better versus worse environment today — discourse environment, intellectual environment — for diffuse long-term flow-through effects. That, I think, is one of the other crucial considerations that I’m not confident in, but I think matters quite a bit.

Should more or fewer people earn to give?

Sam Bankman-Fried: So there’s this thing where on the one hand, [my success is] evidence that it’s easier to make a lot of money maybe — and thus, there’s going to be more money and money is less needed. On the other hand, if it’s easier to make a lot of money, then maybe you should go make a lot of money because it’s easier to do. It’s sort of two sides of the same coin there.

Sam Bankman-Fried: And I think I'm sort of compelled by both pieces of that. In the end, I guess I don't know which direction it points in more strongly; I don't think it's been a huge shift on net. What I think it does mean, though, is that if you're not super excited about your earning-to-give career path, that's a pretty bad sign for it. The thing it points most strongly against is grudging, low-upside earning to give that you pursue only because you think it has to be the right thing to do. I think it's a strong factor against that.

Sam Bankman-Fried: On the one hand, I think it means you should be really excited for potential massive earning-to-give opportunities — things you’re really excited about. On the other hand, it also means there’s more funding for projects, and you should be really excited to start a project that could use funding. And I don’t know exactly which is stronger. Maybe another factor here that does nontrivially lead to my feelings on this is that I think there are a lot of things to do with money. And I think I’m way on one end of that spectrum.

Sam Bankman-Fried: So in bio, how much could you usefully spend? On an early detection center, I think it's like a billion or two, maybe more over time. On fast pathways for vaccine development and release, I think you're talking a few billion. It quickly adds up to 10 billion or something in the bio area for identifiable projects. Or a bunker: how much does that cost? Hundreds of millions.
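Sam's line items can be tallied up. The figures below are illustrative midpoints of the ranges he quotes, not his exact numbers:

```python
# Rough biosecurity line items from the conversation, in billions of dollars
bio_projects = {
    "early detection center": 1.5,             # "a billion or two"
    "fast vaccine development pathways": 3.0,  # "a few billion"
    "bunker": 0.3,                             # "hundreds of millions"
}
identified = sum(bio_projects.values())
print(f"~${identified:.1f}B in identifiable projects")  # ~$4.8B
```

Additional projects over time are what push the total toward the "10 billion or something" figure.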

Sam Bankman-Fried: AI things are harder to think about from a cost perspective. Not to say cost doesn't matter there, but it's a little weirder to think about, because I think it's a little more bimodal. I don't know: either it basically comes down to how many servers you buy, or it doesn't. And if it does, that means there might be a gigantic money pit at the end for AI safety. But if not, maybe money just ends up being not super relevant.

Rob Wiblin: Ah, I see. So you’re saying that there could be this enormous money pit of tens of billions of dollars or more if it really matters who can buy lots of compute at some essential time when AI is making big advances. But if that’s not the case, then it can be a lot harder to see where you could spend tens of billions.

Sam Bankman-Fried: Exactly. AI is more of a thinking thing than a money thing, outside of that. But that might be a real factor.

Sam Bankman-Fried: Then you look at, I don't know, politics and policy. I'm pretty compelled that if you think it matters — and again, if you don't think it matters, then obviously the amount that you can spend on that is zero — but if you do think it matters, the kind of numbers you're talking about are a billion every two years or something like that that could potentially be usefully spent. And that's a fair bit. So you should probably think of a billion every few years as the equivalent of 10 billion today or something. I don't know, I'm making that up.

Sam Bankman-Fried: But putting these together, a few billion a year is… I don’t want to say it’s a lower bound, but that certainly isn’t my upper bound on this. I certainly think it could get a lot bigger than a few billion a year that could be really usefully spent. I don’t know if you could spend 10 billion a year really usefully. It actually wouldn’t completely shock me if that turned out to be true, but it maybe would surprise me a little bit. But yeah, I think the numbers are big.

Articles, books, and other media discussed in the show

Sam’s recent interviews and testimony:

FTX Foundation’s work:

Related 80,000 Hours resources:

Other 80,000 Hours Podcast episodes:

Everything else:

Transcript

Rob’s intro [00:00:00]

Rob Wiblin: Hi listeners, this is The 80,000 Hours Podcast, where we have unusually in-depth conversations about the world’s most pressing problems, what you can do to solve them, and whose fault it is if Bob doesn’t turn the widget. I’m Rob Wiblin, Head of Research at 80,000 Hours.

Over the last two years, Sam Bankman-Fried has gone from relative obscurity to being the biggest single source of funding committed to effective altruist and longtermist projects, so it was only a matter of time before we got him on the show.

Sam is more than just a megadonor though — he's also one of the more prominent people both leading and critiquing the blockchain and decentralized finance communities.

And on top of that, he’s just as fun and forthright as he always used to be when nobody knew who the hell he was.

If you’d like to learn more about Sam’s philanthropic plans, more precisely what sort of stuff his foundation is hoping to fund, and how you could apply for funding either for yourself or a project you’re working on, you can find all of that at ftxfuturefund dot org.

It’s a very nicely designed website that packs a lot of information into just a few pages. That’s ftxfuturefund dot org.

OK, without further ado, I bring you Sam Bankman-Fried.

The interview begins [00:01:07]

Rob Wiblin: Today I’m speaking with Sam Bankman-Fried. Sam is the CEO of FTX, one of the world’s largest cryptocurrency exchanges, which he founded in 2019. His wealth fluctuates a fair bit, but it is in the tens of billions of dollars — a decent sum for someone still in their 20s, and one which lands him firmly among the 100 richest Americans. More interestingly than that, though, Sam plans to give away almost all of the money he’s made — and indeed, a desire to make money to give to charities and other projects that improve the world was the main reason he went into business in the first place.

Rob Wiblin: After studying physics at MIT, Sam first worked at the trading firm Jane Street, intending to earn to give for impactful charities. He then broke away to start his own private trading firm focused on cryptocurrencies, and then moved on to try to build the world’s best cryptocurrency trading platform. That effort turned into FTX, which now handles tens of billions of dollars in trading volume every day, and has quarterback Tom Brady and supermodel Gisele Bündchen as brand ambassadors (among others).

Rob Wiblin: On top of all that, he’s a long-time vegan, was the second-largest public donor to Joe Biden’s election campaign, took the Giving What We Can pledge in 2016, sleeps on a beanbag at his office, continues to live with roommates, and describes himself as a utilitarian. Thanks so much for coming on the podcast, Sam.

Sam Bankman-Fried: Thanks for having me.

Rob Wiblin: I hope to get to chat about your plans to give away the money you've made, and your views on developing cryptoassets as a way to do good. But first, as always, what are you working on at the moment, and why do you think it's important?

Sam Bankman-Fried: The things I'm doing day-to-day vary quite a bit. There's almost no relationship between my day-to-day a year ago and my day-to-day today. I do think it's going to keep changing, but right now probably the biggest thing I've been working on, on the crypto side, is the regulatory environment.

Sam Bankman-Fried: There is actually an enormous difference now versus a year ago in the crypto environment. A year ago there just wasn’t that much emphasis on the regulatory environment; most regulators were only just starting to think about it. Today by far the most defining feature of the crypto industry is the stance of various players from a regulatory standpoint. I’m probably spending more than half of my time on regulatory efforts. I’m going to DC basically every couple weeks to talk with regulators and policymakers. It’s just become by far the biggest thing in the industry.

Sam Bankman-Fried: So that’s one piece of it, but there’s just an enormous long tail of things that I’m working on as well. Everything from project managing various things that the company’s doing, to recruiting and hiring, to media and PR — and then overarching all of that, trying to make sure we don’t become a shitshow as a company, which is probably my single biggest job.

Rob Wiblin: Yeah. I think that’s often a huge challenge for anyone running any large organization, especially one that’s grown pretty rapidly, as FTX has. Speaking of regulation, you gave a bunch of testimony in front of the US Senate in December, right?

Sam Bankman-Fried: Yep.

Rob Wiblin: I quickly skimmed over that. It was actually pretty good.

Sam Bankman-Fried: Thank you.

Rob Wiblin: Well, your testimony was good, but also just in general the conversation seemed surprisingly sophisticated. I can remember people from the social media companies testifying in front of the Senate, and it just being a bit ridiculous how little knowledge the congresspeople had about even the most basic things to do with the internet. But here it seems like we’ve come many steps forward.

Sam Bankman-Fried: I think that’s absolutely right. It’s just a really enormous difference, frankly, between where we are today and where we were even just six months or a year ago. But I’ve also been really happy overall with how the regulatory conversations have been going. And specifically when I was testifying, the questions were good. People cared; they’re trying to make progress on the issues. It was a really constructive environment. I think that regulators in industry and Democrats and Republicans were all on at least moderately similar pages. I don’t want to say exactly the same page, but they felt like they were from the same book at least.

Rob Wiblin: It’s interesting because it’s a slightly new topic. People haven’t formed into partisan trenches on this yet. They don’t know what their ideological position is meant to be on this.

Sam Bankman-Fried: Oh yeah, yeah. That’s definitely a big part of it. They’ll figure out how to be partisan eventually, then everything will stop.

Rob Wiblin: But right now they’re just confused.

Sam Bankman-Fried: Yeah. For now. And that’s a lot of what I’m trying to do also. I think we were actually in pretty big danger, six months ago, of falling into that trap. There is this moment when all of a sudden everyone’s like, “Oh shit, it’s just being political. The party alignment is now known.” And that poses a really big risk to the industry, frankly. I’m really happy that it seems like, at least for now, we’ve avoided that and stayed decently bipartisan. I think the industry was at fault for a lot of this looking like it was going to be more politicized, just not interfacing well with policymakers and regulators. And I think especially on the left, there’s a lot of skepticism because of that. But I think things have turned around quite a bit.

Taking a high-risk approach to doing good [00:06:10]

Rob Wiblin: Yeah. Let’s back up a bit, and help to set the scene for listeners. What motivated you to take such a high-risk, high-return approach to doing good as starting your own crypto trading firm? And then also just saying, “We don’t like the exchanges we’re operating on. I’m going to start my own crypto exchange and try to compete there.”

Sam Bankman-Fried: This probably won’t be super shocking to you, but when you think about things from — taking a step back —

Rob Wiblin: Expected value?

Sam Bankman-Fried: If your goal is to have impact on the world — and in particular if your goal is to maximize the amount of impact that you have on the world — that has pretty strong implications for what you end up doing. Among other things, if you really are trying to maximize your impact, then at what point do you start hitting decreasing marginal returns? Well, in terms of doing good, there's no such thing: more good is more good. It's not like you did some good, so good doesn't matter anymore. But how about money? Are you able to donate so much that money doesn't matter anymore? And the answer is, I don't exactly know. But you're thinking about the scale of the world there, right? At what point do you run out of ways for money to change the world?

Sam Bankman-Fried: There are eight billion people. Government budgets run in the tens of trillions per year. It's a really massive scale. Take one disease: that's a billion a year just to help mitigate the effects of a single tropical disease. So it's unclear exactly what the answer is, but it's at least billions per year, probably — so at least 100 billion overall before you risk running out of good things to do with money. I think that's actually a really powerful fact. That means that you should be pretty aggressive with what you're doing, and really trying to hit home runs rather than just have some impact — because the upside is just absolutely enormous.

Rob Wiblin: Yeah. Our instincts about how much risk to take on are trained on the fact that in day-to-day life, the upside for us as individuals is super limited. Even if you become a millionaire, there’s just only so much incrementally better that your life is going to be — and getting wiped out is very bad by contrast.

Rob Wiblin: But when it comes to doing good, you don’t hit declining returns like that at all. Or not really on the scale of the amount of money that any one person can make. So you kind of want to just be risk neutral. As an individual, to make a bet where it’s like, “I’m going to gamble my $10 billion and either get $20 billion or $0, with equal probability” would be madness. But from an altruistic point of view, it’s not so crazy. Maybe that’s an even bet, but you should be much more open to making radical gambles like that.

Sam Bankman-Fried: Completely agree. I think that’s just a big piece of it. Your strategy is very different if you’re optimizing for making at least a million dollars, versus if you’re optimizing for just the linear amount that you make. One piece of that is that Alameda was a successful trading firm. Why bother with FTX? And the answer is, there was a big opportunity there that I wanted to go after and see what we could do there. It’s not like Alameda was doing well and so what’s the point, because it’s already doing well? No. There’s well, and then there’s better than well — there’s no reason to stop at just doing well.

Rob Wiblin: So Alameda was the trading firm. When you were considering moving on and instead trying to make a platform, did you formally think, “Here’s the probability that we succeed at becoming a major platform. And if we do, then this is the amount of money. So if we multiply it through, here’s the expected value that’s higher than the amount I get from sticking with this current plan. So I’m going to switch.”?

Sam Bankman-Fried: Yeah, that’s basically right. That is effectively the math that we went through. The core of it was like, what are the odds we’d be successful? I certainly can’t say with confidence, “The odds are exactly X,” but we felt pretty confident we could build a good platform. For that, I think we put like 80% at least that we could build a better platform than the existing ones.

Sam Bankman-Fried: But I had no fucking clue how to get a user. It's a consumer-facing product, right? Building a good platform isn't worth anything if no one ever uses it. I didn't even know where to start there. So I was the most optimistic person at the company when we were thinking of starting FTX, and I was at 20% that it would be at all successful. That was the most optimistic of anyone. So why do it then, if we were already successful, and 80% likely to fail at what we were going to do?

Sam Bankman-Fried: And the answer is, well, there are big numbers out there, right? And 20%? Sure, OK, you divide by five. How much are these platforms making? A few billion a year? A 20% chance of success puts you at 400 million a year or something if you became the biggest. That's if you thought that, conditional on success, we definitely were the biggest — which you probably shouldn't think, so maybe discount that some: maybe think it's 100 million a year, maybe think it's like a billion dollars of value or something. These are big numbers, right? Even if we were probably going to fail, I think in expectation it was actually still quite good.
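Sam's back-of-the-envelope math can be written out explicitly. The inputs are the rough figures he quotes; the final haircut for possibly not becoming the biggest exchange is an illustrative assumption chosen to land near his $100 million figure:

```python
p_success = 0.20                 # Sam's own (most optimistic) odds of success
annual_profit_if_biggest = 2e9   # "a few billion a year" for the top platforms

# Expected annual profit if success meant definitely becoming the biggest
ev_if_biggest = p_success * annual_profit_if_biggest
print(f"${ev_if_biggest / 1e6:.0f}M/year")  # $400M/year

# Discount for the chance that success doesn't mean being the biggest
# (a factor-of-4 haircut, assumed here purely for illustration)
ev_discounted = ev_if_biggest / 4
print(f"${ev_discounted / 1e6:.0f}M/year")  # $100M/year
```

Even with an 80% chance of total failure, the expected value comfortably exceeds what Alameda alone was worth to them.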

Signs that Sam will actually donate his money [00:11:30]

Rob Wiblin: Yeah. We’ll come back to your personal story and how you’ve made various career decisions over the course of the last decade later on. But I know what you’re like, Sam, and how maximizing the good that you do has been this driving passion for you since at least undergrad — and I expect probably before that, at high school. But I’ve seen some folks out there on Twitter and elsewhere in the media who are kind of skeptical that you’re ultimately going to follow through and give away most of the money you’ve made, as you say that you will. Is there anything you could plausibly say to the audience that might make them more likely to believe you and take you seriously?

Sam Bankman-Fried: It’s a good question. Obviously in the end, I don’t know. That’s people’s decisions to make for themselves, and they should treat this with whatever skepticism they think is important. But what can I say that maybe puts a dent [in that skepticism]? I think one piece of this is it’s something I’ve wanted to do for decades. It’s been the most important thing to me. I’ve given away something like 50 to 100 million so far. That’s still plausibly in the range where some would say, “OK, sure. That’s cool and all, but that’s just PR.”

Rob Wiblin: You’ve still got much more than that.

Sam Bankman-Fried: Yeah. I think part of my answer is, let’s check in again in a year. If in a year, that number isn’t any bigger, then I think that’ll be a bad sign for me. If in a year, it’s a lot bigger, that’ll be a good sign. That’s honestly part of my answer. It’s hard for me to prove that right now, but I think that that will change over the next year or so. I’m optimistic that it will get to a point where I’ve given away a lot more than that, and certainly over the next few years. That’s part of it, but also you can talk to the people who I’ve known for a while — obviously you’re one of them — and the people who are working with me on the EA side, and decide for yourself to some extent how serious that seems. I’m really excited about the people who’ve been working with me on this.

Rob Wiblin: Yeah. I think one reason why I don't doubt it is that I just think: what do people imagine the temptation is for you, Sam? That you're going to get into having really fast, expensive cars and then buy a yacht? You couldn't care less about that, because it would provide no benefit to you anyway.

Sam Bankman-Fried: Right. And not one fast expensive car, right? Just a fleet, 300 of them. So many cars.

Rob Wiblin: Yeah. I think that would require a pretty major shift in interests on your part.

Sam Bankman-Fried: I think so.

Rob Wiblin: Are there any fancy expensive things that you are tempted by on a selfish level? Or is it just nothing?

Sam Bankman-Fried: Yeah, let me think. A little bit. I don’t know, I kind of like nice apartments. I don’t want to say there’s literally nothing, but there’s not a lot. I’m not really that much of a consumer exactly. It’s never been what’s important to me. And so I think overall a nice place is just about as far as it gets.

Rob Wiblin: Yeah. To me that makes a ton of sense. The thing that surprises me is that there ever was a time when people got super rich in order to try to spend the money on themselves. It just seems kind of like a waste of time.

Sam Bankman-Fried: It does.

Sam’s philanthropy plans [00:14:35]

Rob Wiblin: But anyway, I’d like to focus now on something that you haven’t covered in that much detail in other interviews — and which I think listeners will be particularly excited to learn more about — which is exactly your plans for how you’re going to try to do the most good with your philanthropy. How much money do you expect to be able to give away each year over the next couple of years, if things go well?

Sam Bankman-Fried: Obviously there's a fair bit of uncertainty over exactly how this will go, but the hope is to scale into the hundreds of millions to billions over the next couple of years, and certainly to the billions per year over the next five to 10 years. And obviously a lot of this depends on exactly how well the company goes. But I would be pretty disappointed if it never reached the point of a billion a year.

Rob Wiblin: Another thing for context, as we were talking about: you're pretty risk-loving as an investor and a businessperson. So as a result, how uncertain is the amount that you're ultimately likely to be able to give? What are the odds that it's under one billion ultimately, because things really fall apart — or alternatively, over 50 billion because things go incredibly well?

Sam Bankman-Fried: I think under one billion would be pretty difficult. Something real bad would have to happen. I certainly wouldn’t say the odds are less than 1% on that, because I don’t know; I don’t even know how to think about the tail cases. But I would certainly say less than 10%. I think for the odds of more than 50 billion, 50/50, something like that. I don’t know; I’m sort of making it up, but that’s a ballpark.

Rob Wiblin: OK. So the odds of it getting really low isn’t so high, but then there’s a substantial upwards tail if FTX continues to grow into all kinds of other markets.

Sam Bankman-Fried: Yeah.

Rob Wiblin: Are there any grants that you’ve already made that you are kind of proud of and happy to talk about?

Sam Bankman-Fried: Yeah. I can talk about some of them. We’ll see how ultimately all of them end up. Many of them are still effectively in process, but I think far enough along that you can start to judge them a little bit.

Sam Bankman-Fried: One of the things I've focused on most has been pandemic preparedness. And when I say I've been doing things here, I may be being a little too generous to myself: I've been giving some money; other people have been doing the actual work. Basically, lobbying Congress and then some states to invest more in preparing for future pandemics has probably been the thing I've given the most to so far. And we'll see what ends up happening. Some of this was for things in the Senate infrastructure bill, and it now looks like the whole bill might be scuttled. Prior to that, I think it probably had an impact of a couple billion dollars of pandemic spending.

Rob Wiblin: Is this the Apollo Program that people talk about?

Sam Bankman-Fried: Yeah. One piece has been pushing for that, and I think probably had significant, nontrivial impact on that happening. There’s a similar effort in California right now, which I’m excited and optimistic about. I think the goal of these is to set up multibillion-dollar foundations dedicated to it. That’s been one side of this.

Sam Bankman-Fried: Some pro-democracy and reform things, which are still in progress and we're going to have to see how those end up, but I think they will probably end up having real, at least probabilistic, impact. Then a bunch of other things that are just way too early to judge.

Rob Wiblin: Just on the lobbying for the pandemic preparation stuff, I've said on the show before — and don't want to belabor the point — but it's just a bit crazy that after the US has suffered trillions of dollars in costs from COVID-19, it's hard to rustle up a few billion dollars to actually prevent the next pandemic from causing that kind of damage.

Sam Bankman-Fried: Completely insane.

Rob Wiblin: If you’re on the pointy end of actually trying to get the bill passed, what’s happening on the other side? What’s the case against?

Sam Bankman-Fried: There isn’t exactly a case against, which is sort of interesting, right? Who is the pro-pandemic legislator who is on the side of COVID?

Rob Wiblin: Did a virus write this bill?

Sam Bankman-Fried: Right, exactly. On the one hand you see the humans arguing for X, but on the other hand the viruses are pushing back. The basic answer is that the default is nothing happens, and so a draw is a loss. The momentum to push for something to change has to be big enough to overcome inertia.

Sam Bankman-Fried: So in particular, we’re seeing right now the whole fucking infrastructure bill might not pass. It is hard to get anything passed. When the whole bill might not pass, first of all, that’s one way it could fail: there fails to be alignment between legislators on the size of it. But another thing that can happen there is when people start to sense, “Oh shit, this might not pass,” the machetes come out real quick for things to cut. It’s like they’re trying to get it under some number, right? “If we can get the spending low enough, then we can pass this bill. And if we can’t, we get no bill.”

Rob Wiblin: So things might get cut somewhat at random.

Sam Bankman-Fried: Exactly. And the worry is maybe they didn’t cut it enough, so the bill won’t pass. So there has to be real pressure to not cut a particular part of it for that part to remain. I think that’s probably the biggest piece of this: how do you keep something in a bill when there’s so much pressure to get rid of it? And it’s not clear which person is championing it, right? Who is going to say, “No, you cannot fucking cut this part of the bill”? I mean, maybe everyone — that would be one possible answer, but it didn’t turn out to be the answer. And it’s not like, “This is the person that had COVID in their state, so they’re going to keep fighting to try and prevent the future pandemic.” Everyone kind of did. It sort of traces back to the original question of, “Why does no one care that much in the first place about future pandemics?”

Rob Wiblin: Yeah. I think to be honest, I wouldn’t have predicted this one. I would’ve thought that people would be more interested in funding the science that would stop the next COVID-19. But I don’t know. The world constantly surprises one.

FTX Foundation [00:20:22]

Rob Wiblin: What is the current state of the FTX Foundation that you’re trying to set up? And I suppose the other philanthropic infrastructure that you want to build in order to get the most out of your money?

Sam Bankman-Fried: So over the past year and a half or so, we’ve been giving without a formal foundation, just sort of ad hoc. About six months ago, we started actually setting up an actual foundation, and hiring actual people for it, and formalizing that it exists. And so that has happened now. It is now a thing that exists. This guy, Nick Beckstead — who I know you know and probably many of the listeners do — is heading it up.

Rob Wiblin: Yeah, I think he was a guest back in the very early days. Episode 12, I think, from memory. It’s been a little while, but yeah, you could hear some of his thoughts on how to do good through philanthropy in that episode. Sorry, carry on.

Sam Bankman-Fried: You have a good memory. But yeah, he’s heading it up and we have a few other people who have joined or are joining. Ramping up, I think we’re aiming to give upwards of 100 million this year. We’ll see how much it ends up exactly being; that’ll depend on a lot of details. But I’m overall really excited for it. It’s the thing that matters the most in the end, and it’s really nice to actually see it starting to come to fruition.

Rob Wiblin: Yeah. Turns out that the Nick Beckstead episode was number 10. So my memory is —

Sam Bankman-Fried: Never mind, bad memory.

Rob Wiblin: It could be worse. It could be worse. Where do you hope that the foundation will get to, in terms of staffing and capabilities, in the fullness of time?

Sam Bankman-Fried: Yeah. It’s an interesting question. In terms of staffing, we try and run relatively lean. I think often people will try to hire their way out of a problem, and it doesn’t work as well as they’re hoping. I’m definitely nervous about that. FTX is a lot bigger than it once was, but it’s a lot smaller than many of our competitors are. In terms of staffing, I think we have about 250 people globally now. Which feels like an enormous number to me, but it’s also 5% of what many of the other exchanges have. In terms of raw staffing, I’m not totally sure, because I think you can do a lot without that many people.

Rob Wiblin: With the right people. Yeah.

Sam Bankman-Fried: Yeah. But in terms of scope, I think ultimately giving away hundreds of millions a year, possibly more. But beyond that — and this is something that we’re really excited to be exploring — actually taking the initiative on some projects. Whether that’s incubating them, helping to found them, giving operational support and advice to them, or actually just spinning off some philanthropic projects ourselves. I think that funding is helpful, but it’s not the only missing ingredient. There’s a weird thing where sometimes there’ll be a project that everyone knows should happen, and that everyone thinks would make sense, including funders and founders and everyone else in the community.

Rob Wiblin: And yet it doesn’t happen.

Sam Bankman-Fried: Yeah. It doesn’t happen. Right? And the question’s like, why did it not happen? What went wrong? I don’t know exactly, but I think often you get this sort of weird thing where there was funding, but the funding is like, “OK, but where’s the founder?” And the founder’s like, “Well, is there funding?” It’s sort of a mess.

Rob Wiblin: It requires a slightly happy coincidence for all of the ingredients you need to actually start the project to come together at the right moment, when the person is free to actually start a project rather than busy with something else.

Sam Bankman-Fried: Exactly. That’s my sense of why sometimes things don’t actually come together in practice. And part of what we want to do is see if we can help fight against that a little bit, by just actually trying to make something happen ourselves and say, “Look, we have the funding. It is definitely here. Because we are the funding. We have a person to start it. There’s full knowledge between those two that the other one is here. We are declaring that this is going to start.”

Rob Wiblin: What are some of the kinds of projects that you are going to be advertising, that you want founders to apply to you to get the money to get them going?

Sam Bankman-Fried: There's a lot, and I'm really excited about them. Some of the things I'm most excited about are on the pandemic side: I think there's a lot of infrastructure that could be built. Early detection is one piece of this. We didn't know that COVID was happening until some number of months after it actually started happening, and that's not great. So: building systems to be able to detect pandemics early. Potentially building out frameworks for getting drugs to market quicker. I mean, you saw with COVID how long it was from when we effectively had a vaccine that worked to when the first person got that vaccine. It's like eight months or something like that. Which is kind of a while.

Rob Wiblin: Are there any specific ideas you have for technologies or changes that would speed that up?

Sam Bankman-Fried: Yeah, it’s a good question. I think that the technology is one piece of this, although basically having ready-made virus antidotes is one answer to this, as opposed to having to react to it. Having them already produced and manufactured, just having a giant store of them. But it’s also worth noting that in this case, we actually got kind of lucky. It turns out Pfizer and Moderna just had a machine that basically immediately produced the vaccine. They got the sample and two days later, great, we got the vaccine. And even from then it was eight months.

Sam Bankman-Fried: So part of this is around being prepared with having an actual antidote. But part of this is also, what happened to those eight months? I think the basic answer is regulatory. Operation Warp Speed is what it was called, but what does that actually mean? “Warp speed” meant eight months to get the vaccines to market. That was not good for COVID, but it would’ve been way worse if COVID were way worse, right?

Rob Wiblin: If it had a 10% fatality rate.

Sam Bankman-Fried: Exactly. You just take COVID but tack on the fatality of SARS, and that’s pretty terrifying. We don’t actually really have a process for getting things out there quickly in basically any circumstances. That’s got to be part of the answer here.

Rob Wiblin: I saw a draft of the website you’re putting together for the foundation, and you had a bunch of other interesting ideas that I haven’t heard promoted so much — projects where you are potentially looking for founders and you’re interested in funding them.

Rob Wiblin: One was trying to do talent scouting in the developing world: finding people who have amazing potential to become the next generation of top researchers in some area, and then pulling them out and giving them the best opportunities that they can get.

Rob Wiblin: You’re interested in starting a new newspaper that would have better integrity standards or better standards for accuracy than any existing newspaper, which would be extremely cool.

Rob Wiblin: Another one that I’ve thought about before — and I’m surprised it doesn’t exist already, because it doesn’t seem like it would be expensive — is basically just having really thorough polling of experts within lots of different domains on what their opinions are about relevant issues, things that affect people or affect policy or so on. There are some things like this; there’s the Chicago Booth IGM Forum survey of economists, where every month or so they ask them one question about some topical issue. It’s a bit surprising we don’t have a more systematic way of doing that. Do you want to talk about that one?

Sam Bankman-Fried: Yeah, totally. This obviously starts to interface a little bit with prediction markets too, where we don’t have good infrastructure for getting consensus answers to hard but important questions. Obviously there are a lot of those. Let’s take early in COVID: what was the consensus on the infection fatality rate of COVID? There wasn’t an answer to that, right?

Rob Wiblin: Yeah.

Sam Bankman-Fried: There were just lots of incoherent, disconnected answers that differed by orders of magnitude, and clearly were not vetted across each other. That can’t be the best answer.

Rob Wiblin: Yeah. “There has to be a better way!”

Sam Bankman-Fried: I know, right? If only we had some consensus mechanisms. You can just have a marketplace for this. That’s what markets do: they take a lot of people’s different opinions on something and give a central order book to match those opinions against each other and see what consensus comes out. But even if you didn’t want to do that, you can just take a survey and average the results of experts — and even that we don’t really have infrastructure for. You can try to do it ad hoc, and some people have tried to do that in various cases, but it’s a mess. You’re trying to cold call people who you think might know something about it and ask their opinions. It’s bizarre that we don’t have better answers for this.

Rob Wiblin: Yeah. I don’t want to act like there’s nothing like this. Academics write review papers that try to summarize what people think within a particular field, but that stuff is very slow, very gradual. And I guess it’s not always that representative — it’s not a thorough survey of the full range of opinion that you might get within some domain. So it’s just not suitable for every use.

Sam Bankman-Fried: Yeah, I agree. I think just having standing panels of experts who you could call together and quickly ask a battery of relevant questions would be potentially really valuable. That’s probably even just step one. I do think prediction markets are maybe the right way to go eventually — they use the right incentive mechanism and are a pretty good way to aggregate opinions. But there are a lot of things that need to happen for that to be a viable answer, and I don’t think we’re that close to that.

Rob Wiblin: Are there any other projects that might be a little bit unexpected or a little bit eclectic that you’d like to highlight for people?

Sam Bankman-Fried: Yeah, there are a bunch, some of which you’ve also talked about. You brought up, among other things, trying to recruit really promising people in, especially, developing world countries who would not otherwise have real access to opportunities, and give them those opportunities to do good in the world. That is one thing that I’d be super excited about. It’s not clear exactly what form that takes, but I think that it could just be really, really influential. That’s one that I think is super cool.

Sam Bankman-Fried: And you know what else, I think there are a lot of requests for projects and founders, where if you have a good idea for something you want to happen, it’s just a really low bar. Come to us.

Rob Wiblin: Bring it to you, yeah.

Sam Bankman-Fried: Yeah, exactly. And we’ll try and make it a really smooth process, like get rid of the trivial inconveniences that sometimes make something not come to fruition, see if that can help spur more things to action. That’s a type of thing that I’m pretty excited about.

Sam Bankman-Fried: What else? I think in politics there’s a lot, and in policy. It’s an enormously influential area, and it’s one that I think there aren’t enough effective altruists getting into right now, thinking about how they can have positive impact on policy in the States. I think that helping people get accustomed to it, figure out how to get involved, is something that we’ve been doing a bunch of somewhat behind the scenes, and excited to do more of. That’s a big area.

Sam Bankman-Fried: Then the last thing is just really trying to keep an open mind about what big projects might be great, and being willing to write a billion-dollar check if that turns out to be the right thing to do. If someone’s like, “For this area, here’s what’s blocking us” — like, “We’re not going to make progress until we have a great genotype-to-phenotype map, and it would cost a billion dollars to put that together.” I want to be in a position where we could say, “All right, we’ll think about it. And if that compels us and you seem like the right person to do it, then yeah, that’s a number that could be gotten.”

Rob Wiblin: Looking overall at the strategy that you’re taking and the areas that you’re interested in, they seem somewhat similar to Open Philanthropy and Longview Philanthropy and other people who are trying to give away lots of money in an effective altruist–flavored way. I guess that’s not so surprising, given that you poached Nick Beckstead to lead on your efforts, and he’d spent many years working at Open Philanthropy. What might be distinctive about your approach that will allow you to find things that all the other groups haven’t already found or are going to find?

Sam Bankman-Fried: Before I answer that, let me give a non-answer first, which is that in the end, I’m happy for them to find things too. It doesn’t matter who finds the things. I do think that’s worth just making explicit: that our goal isn’t to figure out how to maximize the amount of the impact that gets attributed to us or something like that. It doesn’t matter who it gets attributed to.

Sam Bankman-Fried: But having gotten that out of the way, I think that being really willing to give significant amounts is a real piece of this. Being willing to give $100 million and not needing anything like certainty for that. We’re not in a position where we’re like, “If you want this level of funding, you better effectively have proof that what you’re going to do is great.” We’re happy to give a lot with not that much evidence and not that much conviction — if we think it’s, in expectation, great. Maybe it’s worth doing more research, but maybe it’s just worth going for. I think that is something where it’s a different style, it’s a different brand. And we, I think in general, are pretty comfortable going out on a limb for what seems like the right thing to do.

Is the effective altruism community too conservative? [00:33:55]

Rob Wiblin: Yeah. I guess you might bring a different cultural aspect here because you come from market trading, where you have to take a whole lot of risk and you’ve just got to be comfortable with that or there’s not going to be much out there for you. And also the very risk-taking attitude of going into entrepreneurship — like double-or-nothing all the time in terms of growing the business.

Rob Wiblin: I’ve had a worry that’s been developing over the last year that the effective altruism community might be a bit too conservative about its giving at this point. Because many of us, including me, got our start when our style of giving was pretty cash-starved — it was pretty niche, and so we developed a frugal mindset, an “I’ve got to be careful” mindset.

Rob Wiblin: And on top of that, to be honest, as a purely aesthetic matter, I like being careful and discerning, rather than moving fast and doing lots of stuff that I expect in the future is going to look foolish, or making a lot of bets that could make me look like an idiot down the road. My colleague, Benjamin Todd, estimated last year that there’s $46 billion committed to effective altruist–style philanthropy — of course that figure is bouncing around all the time, but it’s probably something similar now — and according to his estimates, it had been growing at 35% a year over the last six years. So it’s been growing much faster than we’ve been able to disburse these funds to really valuable stuff.

Rob Wiblin: So I guess me and other people might want to start thinking that maybe the big risk that we should be worried about is not about being too careless, but rather not giving enough to what look like questionable projects to us now — because the marginal project in 10 years’ time is going to be noticeably more mediocre or noticeably less promising. Or alternatively, we might all be dead from x-risk already because we missed the boat.

Sam Bankman-Fried: Completely agree. That is roughly my instinct: that there are a lot of things that you have to go out on a limb for. I think it’s just the right thing to do, and that probably as a movement, we’ve been too conservative on that front. A lot of that is, as you said, coming from a place where there’s a lot less funding and where it made sense to be more conservative.

Sam Bankman-Fried: I also just think, as you said, most people don’t like taking risks. And especially, it’s often a really bad look to say you’re trying to do something great for the world and then you have no impact at all. I think that feels really demoralizing to a lot of people. Even if it was the right thing to do in expectation, it still feels really demoralizing. So I think that basically fighting against that instinct is the right thing to do, and trying to push us as a community to try ambitious things nonetheless.

Rob Wiblin: I suppose it makes sense that inasmuch as there’s a very strong trend in how abundant funding is, and how open you should be to funding high-risk things, the culture is always going to lag in that because it’s just so hard to fully catch up. You always have a culture or a mindset that was appropriate three years ago, and yet today things are very different.

Sam Bankman-Fried: I think that’s right, and that there’s also a little bit of a tendency sometimes for there to be kind of bullshit expected-value calculations — you know, a calculation that implies something might be really important, which gets a lot of skepticism. And if you just fast forward six years, those calculations are now canonical, and everyone has bought that that’s what the expected value is, but with high uncertainty. And sometimes I think you can skip that six-year period and be like, “All right, I see where this is going. Obviously super interesting arguments against it, but…”

Sam Bankman-Fried: I think AI went through something like that in some communities, where there’s early people being like, “Even if there’s only a 1% chance that AI’s going to destroy the world this century, it’s really important.” Other people are like, “That sounds like bullshit. Come on, that didn’t work.” And six years later, I think that is basically the canonical line in EA, with some more sophisticated thinking, but the core logic that was laid out a long time ago is effectively still the logic that’s used to justify it at a high level. I think there’s some things that are like that now.

Rob Wiblin: And you think people should have been open to funding stuff that is very high risk in that sense?

Sam Bankman-Fried: Yeah.

Rob Wiblin: And perhaps have been a bit more skeptical than was justified, given the general abundance of funding that we should expect?

Sam Bankman-Fried: I think that’s basically right, and there’s probably some analogs of that today. The political realm is probably one of those where my instinct is like, we don’t really know what we’re talking about there in some ways — but come on, we can make guesses. And if you’re forced to write down quantitatively, like, “What do you think the impact of X will be in the political realm? What do you think is the cost of… ?” — if you’re forced to write down those numbers, it just looks very compelling, unless you have very extreme, weird guesses.

Rob Wiblin: Yeah. A possible example of you maybe trying to be more aggressive in terms of funding stuff that urgently needs money is the nuclear security space. The MacArthur Foundation recently, for various reasons, has substantially reduced its funding towards organizations that are one way or another trying to reduce the risk of the use of nuclear weapons, and the worst-case scenario of a massive nuclear war. I think you’ve jumped into that space and been partially substituting for the funding that unfortunately they’re no longer providing.

Sam Bankman-Fried: Yeah. We’re really actively looking into what we can do there. I think it’s one of these things where I’m not an expert in it — our opinions could change over time, but we can have priors — and it seems like an important space. You can try and calculate what you think the odds are. I think it leads you to think that it’s at least a pretty plausible candidate, and that it’s something we should be acting on fairly quickly. Just every year matters to some extent — you can’t, after the world blows up, go back.

Rob Wiblin: Put it back together again? Yeah. I guess there’s actually two kinds of urgency there. One is, I suppose it’s been more salient the last couple of months, with Russia and Ukraine and so on, that there is an ongoing annual risk that things will spiral out of control, and nuclear weapons are going to end up being used. And you really don’t want to have been holding your money in reserve, expecting to use it to prevent a nuclear war at some future time, and then one happens first.

Rob Wiblin: Another thing, in this specific case, is that because there’s a funder who’s pulling out their funding — withdrawing it unfortunately quite quickly — the organizations that were relying on it, and the people whose careers were premised on that funding being available, can all decay. It can begin to rot away, and it won’t necessarily be available to fund in three years’ time.

Sam Bankman-Fried: I think that’s absolutely right, and so there is some sense of supporting the organizations when they need it — as bridge funding, if nothing else.

Rob Wiblin: What, in your mind, is the best counterargument to all of what we’ve been saying here? The best argument that actually we should keep our powder dry, be happy to have substantial amounts of funding in reserve, and not be giving it away right away?

Sam Bankman-Fried: I think there are some moderately compelling arguments in the opposite direction. I don’t want to seem like I’m implying that you just obviously should be dumping all of it right now. And so, what are some of those? One thing is: do we think we have better things to give to now than we did six years ago? And if you think they’ve gotten better over time, then is that just going to continue? Or in six years are we going to be like, “What the fuck? Obviously we should all be giving to pineapples. Why were we bothering with x-risk? There’s just not enough pineapples.”

Rob Wiblin: I guess the logic there might be that we’re just becoming more informed about these general issues over time, that more person-hours have gone into thinking about how to have the biggest impact with funding. And so maybe even though there’s more money, the opportunities will be better in five or 10 years.

Sam Bankman-Fried: Yeah. I think that’s basically right; that is part of the thought there, and we should be taking that seriously and making sure that we have enough dry powder. And what this gets me to in the end is, I think it starts to become a really hard question. If and when you get to the point where your plan is to blow 80% of the money in the next five years, that’s potentially a big problem. It’s not provably wrong, but there’s a real worry there.

Rob Wiblin: There’s a big risk that something could come along in the sixth year and you won’t be able to fund it.

Sam Bankman-Fried: Exactly. Whereas if it’s the sort of thing where your worry is that you’re going to be spending 10% of the money in the next five years, that seems like a lot less of a concern to me.

Rob Wiblin: Yeah. Philip Trammell is a thinker who’s been on the show before, talking about this issue of giving now versus giving later. It’s a very complicated intellectual exercise to try to figure out what the answer is here. We can link to the episode that we did with him 18 months ago or so. It seems like the thing that tends to spit out of the models is that you should be giving away some fixed percentage of your total assets each year — and that figure might be as low as 3%, but even 3% would imply substantial increases in total disbursement. And that’s interesting that even if you’re being quite conservative and allowing the total asset stock to grow a lot year on year, then we’re not even getting close to that.

Sam Bankman-Fried: I think that’s right, and if you think total assets in this space are 50 billion or something, what’s 3% of that? That’s one-and-a-half billion a year. I don’t think the space is giving away one-and-a-half billion a year right now. We’re probably undershooting what the right thing to do is by a fair bit right now — I think a couple billion a year is probably about the right amount for the space to be giving, absent extremely good opportunities. And if you see an extremely good opportunity, maybe a lot more than that that year.

Sam Bankman-Fried: I’m actively thinking, like, “What could happen in the next two years that I might want to drop $1 billion on?” I have like six things on that list. I think probably one is going to happen. I’d actually be slightly surprised if none of them turned out to be what we thought the right thing to do was. So it is dependent on a lot of outside factors, but I think people have not been trying nearly hard enough to find really impactful things to drop $1 billion on, and I think there are a lot of them.

Biggest uncertainties [00:43:37]

Rob Wiblin: What are your biggest uncertainties about what you want to do with the money?

Sam Bankman-Fried: Yeah, what are some things that I’d want to resolve? One relevant factor is relative risk from bio versus AI versus nuclear, and how potentially preventable those are. I think that’s probably a factor of three or five uncertainty, or something like that, in what the right thing to spend on is. I don’t think it drills down to one key assumption, probably; I think it’s a messy collection of them.

Sam Bankman-Fried: Maybe a bigger core thing is: how much is it the case that, as long as we don’t screw things up, we’re going to have a great outcome in the end, versus how much do you have to actively try as a world to end up in a great place? The difference between a really good future and the expected future — given that we make it to the future — are those effectively the same, or are they a factor of 10 to the 30 away from each other? I think that’s a big, big factor, because if they’re basically the same, then it’s all just about pure x-risk prevention: nothing else matters but making sure that we get there. If they’re a factor of 10 to the 30 apart, x-risk prevention is good, but it seems like maybe it’s even more important to try to see what we can do to have a great future. And that might be similar things, but it might be quite different things that you would prioritize.

Sam Bankman-Fried: So that’s one crucial consideration that I don’t feel confident about the answer to. I think different people have very different instincts about it, but that will have pretty important flow-through effects to all of this.

Rob Wiblin: Yeah. An example of something that might spill out of that kind of thinking is that it’s important to convince people that if humanity survives, we should do something really ambitious and great with our potential, rather than just being complacent and sitting on Earth and living our normal lives. Maybe we need to have an active advocacy movement around that.

Sam Bankman-Fried: Yep, that absolutely would be an example of it. Another key consideration here, which I know different people have different instincts on — I think I have a different instinct than much of the community does — is how much various things have long-term flow-through effects. And to give some example of that, how much does the president of the United States today impact the far future, conditional on no existential risk during that president’s term? Ignoring the effect on short-term nuclear war and things like that, how much does the general political environment have, in expected value, substantial flow-through effects to the far future? One thing that gets to in the end is this question of how path-dependent things are, and I think this actually makes it a little related to the previous question.

Sam Bankman-Fried: But other than a few very specific things like x-risk, how much is it the case that perturbations in what happens in the world are just not going to persist, versus how much is it the case that actually there’s a lot of different places we could end up, and who really knows what’s going to happen, and it really matters? And we should be really thinking hard about having a better versus worse environment today — discourse environment, intellectual environment — for diffuse long-term flow-through effects. That, I think, is one of the other crucial considerations that I’m not confident in, but I think matters quite a bit.

Rob Wiblin: On that topic, we should come back to some of your non-longtermist interests later on, but in terms of thinking about the long-term future and making that go well, there’s some folks who take a fairly narrow view of what’s important for shaping the future. There’s the classic shortlist of risks from artificial intelligence, risks from biological weapons, risks from nuclear war, maybe one or two other things — but it’s pretty narrow.

Rob Wiblin: And then by contrast, there’s other folks who think it’s really uncertain what’s going to turn out to be most important, and it could be something that we haven’t thought of at all. So they want to focus more broadly on improving society’s rationality and ability to handle any threats that might come up — just improving our institutions and improving how we think about things, so that humanity can do a better job.

Rob Wiblin: Do you fall anywhere on that narrow-versus-broad spectrum, or are you kind of agnostic?

Sam Bankman-Fried: I do, not confidently. I don’t want to express confidence about my answer here, but I do fall somewhere on the spectrum. Especially, I think, with respect to the EA community, I am more on the broad end.

Rob Wiblin: Is there a key reason for that?

Sam Bankman-Fried: So part of my answer is, I feel like there needs to be a key reason for the narrow view — it’d be a little weird if nothing else mattered, or something like that. Part of my response is: what is this mysterious force that causes long-term convergence in what the world is like? But outside of that, think about what matters in the end, and let’s just start tracing back. If in the end there’s some crucial moment with AI, let’s just say for now, then what’s going to matter is the people who are making that decision at that time: what do they think and how do they act?

Sam Bankman-Fried: And then you’re like, well, what impacts that in turn? I guess that, in turn, is probably impacted by what most people who are in crucial AI-related field positions are like, and what they think and what they value and what impacts that. And I don’t know, I think the general discourse environment has, in expected value, impact on everyone. If you said, “Here are two very different general discourse environments” — think a Trump-like world versus a Biden-like world or something like that — and I say, “I just found a random AI researcher. Talk to me about what they care about and what they’re thinking about in life.” I kind of think you would not say exactly the same thing in those two cases — I think you’d have, in expectation, some differences.

Sam Bankman-Fried: So that’s the diffuse part. But I also need to argue that there’s at least some medium-term coherence and continuity to this — that it doesn’t just all self-correct. And I think that’s probably true: there are pretty decent arguments about what it is that has a big impact on discourse today, for instance. I think World War II probably does, among other things — it seems to me that it reshaped the general political environment by a fair bit, in a way which is probably —

Rob Wiblin: Quite persistent.

Sam Bankman-Fried: Quite persistent. At least it has been persistent. At least I think the world is still quite different because of it. And do we anticipate that’s going to wash out sometime soon? I don’t know. I’m not super sure about that. What if Hitler had won World War II? Do you think we’d have the same political norms today? Do you think AI researchers would be thinking about the same types of questions? No, would be my guess.

Rob Wiblin: So the high-level point is that it seems to you there isn’t a super strong pull or attractor towards just a natural way that history is going to flow no matter what. That it seems more contingent than that, and the stuff that happens now matters in the long run.

Sam Bankman-Fried: I think that’s basically right. Or at the very least, I’m not super convinced that there is one. I don’t want to seem like I’m confident in what the answer is here, but that’s my instinct.

Rob Wiblin: Yeah. So I guess that helps to explain why on the list of projects that you’re interested in funding, there’s things like starting a new university with a different intellectual focus, and forecasting long-term trends in wellbeing and income and education and health and so on, and a newspaper whose number-one priority is just covering the most important things accurately. You’re taking this broader view on how to improve the intellectual environment and how to improve the world.

Sam Bankman-Fried: Yeah, I think that’s basically right.

Rob Wiblin: I guess another disagreement that sometimes comes up is how bad is it, for example, if there’s a medium-sized disaster? And for our purposes, a medium-sized disaster might include a war between the US and China — something where it’s like, it’ll be the worst thing that ever happened, but humanity wouldn’t necessarily be destroyed in that event, so there would be potential to recover. I noticed that there’s quite a lot of things on your list that seem addressed at that medium-sized catastrophe, like preventing war and making humanity more likely to recover quickly and effectively from a disaster like that.

Sam Bankman-Fried: Yeah, and I think it’s a similar thing as using World War II as a proxy for what that could mean. But I definitely agree with your framing of it. One thing that we are thinking about a fair bit is, what would be the impact of something like a great power war? And if the answer is very bad, then what could we do today to try to make that less likely? And I do think my answer is probably very bad — it’s just in the same way as some of these other factors make everything a lot worse in the world.

Rob Wiblin: Yeah. There’s multiple different channels that one might worry about. I suppose one will be that a great power war could escalate towards full extinction. Another one might be that eventually humanity wouldn’t recover, even given hundreds of years potentially. And another one might be that we would recover, but human culture would be permanently worse because of this horrible scarring, catastrophic experience.

Sam Bankman-Fried: Yeah. And that last part is at least a decent piece of how I think about it.

Rob Wiblin: Going through a Mad Max era might not be optimal for human cultural advancement.

Sam Bankman-Fried: Exactly. Part of this is that you might have a key x-risk moment in that period. I do think that’s part of it, but I don’t think that’s the whole story. Part of this is also that even if there’s no actual x-risk event during that Mad Max period of the world, we might just emerge with a shittier society that was less trying to build ambitious great things, and more tearing itself apart, factional, and kind of like the Middle Ages. I don’t want to express too much confidence about what it would look like, because I don’t fucking know — but I think that’s almost my point: I don’t feel compelled that we know what these things look like long term. So my best guess is that those things do matter.

Near-term giving [00:53:40]

Rob Wiblin: We’ve been mostly talking about the longtermist giving. Do you want to talk for a minute about the giving focused on more near-term concerns, like global poverty and animal welfare?

Sam Bankman-Fried: Yeah. We do some amount of giving that is more short-term focused, and some of this is in connection with our partners. I think some of this is trying to set a good standard for the fields, and trying to show that there are real ways to have positive impact on the world. I think that if there’s none of that going on, it’s easy to forget that you can have real impact, that you can definitely have a strong positive impact on the world. So that’s a piece of it. But frankly, to be straightforward about it, I also think that in the end, the longtermist-oriented pieces are the most important pieces, and are the pieces we are focusing on the most.

Sam Bankman-Fried: But just setting a standard of “We do good for the world, no matter what” is really valuable for communicating what it is we’re doing. I also think that there’s a big difference between devoting 10% of your giving to that versus 60%. And that might be a place where I feel somewhat differently than you’ll see elsewhere — I really do feel sometimes that people are sacrificing too much expected value for that, and that the arguments are a lot more compelling for doing some slice of that than for doing a massive amount of it. But sometimes people seem to be dedicating like two-thirds of their giving to what, according to them, is not the most important thing.

Rob Wiblin: Yeah. Is it possible there’s another aspect to this? Because you’ve been a vegan for a long time, for example, and I’ve been vegetarian since I was 13. I think that you did leafleting at a university campus, trying to convince people to become vegetarian or vegan at some point. I think both of us find perhaps the suffering of animals in factory farms to be among the most arresting, disturbing things in the world, as it is right now. I imagine that that has to be motivating on some level.

Sam Bankman-Fried: That’s absolutely right. And I don’t know — all the ways of saying this are a little bit cheesy — but some amount of keeping myself honest, reminding myself of what really matters in the end, and —

Rob Wiblin: I guess that shit is real.

Sam Bankman-Fried: Yeah, it’s real. It’s real and it’s massive. The numbers are fucking staggering. There’s about as many animals tortured and killed on factory farms each year as there are humans in the world. It’s not like, “there is some suffering somewhere” type of thing — the world is fucking filled with it right now. And it’s of our own creation, and we should be able to do better than that — we should be better than that. I do really feel that there’s some amount of, if we can’t stand up for anything that’s happening today, how do we trust that we actually mean it when we say that it’s all to do good in the world? Or something like that, if that makes sense.

Rob Wiblin: Yeah.

Sam Bankman-Fried: I don’t want to try and oversell this argument, but —

Rob Wiblin: It gets a bit of weight.

Sam Bankman-Fried: Yeah. And it’s something that’s important to me, and important to me feeling like I’m actually trying to do good here. That’s not just a thing I say when it’s costless. Having some sacrifice that I make on a daily basis to remind myself that that is the right thing to do, and that I shouldn’t get caught up in vagueness as an excuse to wander in terms of what my ultimate goal and impact are.

Rob Wiblin: Yeah, yeah. Are there any projects on global poverty or on animal wellbeing or any of the other non-longtermist causes where you’re particularly excited to find an entrepreneur to take them forward?

Sam Bankman-Fried: That’s a good question. On animal welfare, another interesting property of it is that it is somewhat tractable, and there are real ways to have impact that are somewhat straightforward. I basically think that lobbying corporations that buy factory-farmed goods to commit to sourcing only from suppliers that raise animals more humanely is one classic example of this that’s just been incredibly effective historically, and it’s something that is pretty tractable and people could just be doing more of. So that’s one thing that I think is potentially really exciting.

Sam Bankman-Fried: In the political realm, there’s a fair bit you can do. Convincing politicians, for instance, to support more humane methods. And I think just setting an example does have some impact there. There are a lot of organizations that are doing good work on this front, like The Humane League. I think it is just a pretty tractable problem. It sort of has to be, given that it’s a problem of our own making.

Rob Wiblin: We could fix it just by stopping doing what we’re doing. Yeah.

Sam Bankman-Fried: Yeah, I think that’s basically right. In terms of climate change stuff, which is maybe somewhere in between, I think carbon removal or other geoengineering approaches are potentially really high impact. All the straightforward decreasing carbon production things are somewhat saturated, because there’s enormous amounts of money and thought going into them from a political level right now. But I think not enough work is going into potential scientific approaches to it, and engineering approaches.

Rob Wiblin: Let me just put to you a random thought that I’ve had recently, that I’m not sure I’ve talked about on the show. When I think about, broadly speaking, the most important areas from an effective altruist point of view, and the most important ways of solving things, again and again, I’m like, “What we need is more biomedical researchers working on pandemic prevention.” And then in poverty, it’s like, “The best stuff would be more biomedical researchers figuring out how to treat diseases of poverty.” And then in climate change, “What we need is more research engineers figuring out how to do clean energy.” It’s just again and again, what we need is really smart people properly funded to do this science and tech R&D stuff.

Rob Wiblin: I think that could just be an accurate bottom line. But the problem would be that if you started funding all of these things, you’re going to run out of people who are qualified and capable of taking on these kinds of roles. That’s maybe when your talent scouting stuff becomes highly relevant. Conceivably, in coming decades, if a lot of people really get on board with this agenda, then it’s not going to be limited by funding — it’s going to be, at a global scale, limited by talent, by people who are qualified to take on all of these research projects. And that’s going to need a huge pipeline to bring in thousands, tens of thousands, possibly hundreds of thousands of top research scientists.

Sam Bankman-Fried: Yeah, I think that’s moderately compelling in that there’s people who can actually do things, who can work on the ground on these most important problems. I think particularly founder-like people is a big bottleneck: people who can create an organization from nothing in those same areas, and can manage it and run it. I think that’s a big piece of this. But also someone who really fucking knows the science behind this — and in a practical way — is really valuable and important. And more people who have in-depth biomedical knowledge working on pandemic preparedness in functional organizations is one example of that.

Rob Wiblin: Yeah, I suppose at the moment, our framing of this is our social networks — effective altruism, longtermism, and people who care about these things — those groups need more people who have both a scientific background and an entrepreneurial background. But as you scale up, the conclusion becomes not that we need that, but that the United States needs that, Europe needs that — we just need better ways of training people. And then you start converging to maybe more common-sense ideas about how to improve society, which includes improving education and building better universities and so on.

Sam Bankman-Fried: Yeah, if you can figure out how to do it.

Rob Wiblin: Yeah, it’s a challenging one. What are the biggest challenges that you foresee with your philanthropy over the next five years? Anything that we haven’t covered yet?

Sam Bankman-Fried: This does go back to something we’ve talked a little bit about. The biggest challenge that I would guess we’d run into is getting people to execute well on the things that we want to do. That we’re like, “Here’s a great thing that could happen. We have plenty of funding for it. Let’s make it happen.” And then we’re like, “Who wants to take this funding and make this thing happen?” And we can’t find that person. That is, I think, the single biggest problem that I anticipate us running into.

Rob Wiblin: Right. Do you have any preliminary thoughts on how to get around that? I suppose one thing is just spreading the word a lot about FTX Foundation potentially.

Sam Bankman-Fried: Yeah. That’s part of my hope, is that we’ll be very loud about the foundation in general and also like, “Hey, here’s a great thing. We’d love someone to do it.” Like early detection for pathogens. I would love someone to do that. I’m going to fucking build this. I’m going to take charge of this and build the most badass detection center, so we will know within a picosecond if there’s a new pathogen anywhere in the world. And if someone has the right background for that and is really excited, driven, good at leading projects and wants to do that, we have plenty of funding for that. And those people have to exist somewhere.

Sam Bankman-Fried: So yeah, shouting from the rooftops is a piece of this. Making it clear that if you’re excited about this, you can just do it. We’ll make it as painless as possible. We will provide operational support for it to the extent that we can, and hopefully that helps. We will be happy to help with entity incorporation and things like that; those things matter, are not super easy, and do sometimes just get in the way of people trying to make progress on this stuff. So that is another piece of this. And if you have other ideas, I’m all ears. I’m excited for this.

Rob Wiblin: Yeah, maybe we need some kind of organization that can get people to think about how to have more impact with their careers.

Sam Bankman-Fried: I know, right?

Rob Wiblin: It could write about problems that people… I don’t know. It would never work.

Should more or fewer people earn to give? [01:03:45]

Rob Wiblin: Anyway, a lot of listeners to this show have gone out trying to make money in order to give it away. They’ve had the same vision for their careers that you had eight years ago. But I suppose some of them now fear that huge donors like you and Open Philanthropy and a couple of others that are coming through the pipeline are going to mean that funding is going to be relatively plentiful — maybe really plentiful — and so the giving isn’t going to be helpful.

Rob Wiblin: Do you have a view on whether your success is evidence that more people who have similar values to you should go and try to make a lot of money, because you’ve succeeded so much? Or is it, on the other hand, a reason for people to instead try to start the projects that you want to fund, because funding is now more available?

Sam Bankman-Fried: So there’s this thing where on the one hand, it’s evidence that it’s easier to make a lot of money maybe — and thus, there’s going to be more money and money is less needed. On the other hand, if it’s easier to make a lot of money, then maybe you should go make a lot of money because it’s easier to do. It’s sort of two sides of the same coin there.

Rob Wiblin: Yeah.

Sam Bankman-Fried: And I think that I’m sort of compelled by both pieces of that. In the end I guess I don’t know which direction it points in more strongly. I don’t think it’s been a huge shift on net. I think what it does mean, though, is that if you’re not super excited about your earning-to-give career path, that’s a pretty bad sign for it. The thing it points most strongly against is grudging, low-upside earning to give, because you think it has to be the right thing to do. No — I think that it is a strong factor against that.

Sam Bankman-Fried: On the one hand, I think it means you should be really excited for potential massive earning-to-give opportunities — things you’re really excited about. On the other hand, it also means there’s more funding for projects, and you should be really excited to start a project that could use funding. And I don’t know exactly which is stronger. Maybe another factor here that does nontrivially lead to my feelings on this is that I think there are a lot of things to do with money. And I think I’m way on one end of that spectrum.

Rob Wiblin: I see. Yeah, could you talk about that? For quite a long time you’ve been maybe more optimistic than others, including me, about how much one might be able to spend usefully in a lot of these areas.

Sam Bankman-Fried: Yeah. So in bio, how much could you usefully spend? I think it’s like a billion or two on an early detection center, maybe more over time. On fast pathways for vaccine development and release, I think you’re talking a few billion. I think it quickly adds up to 10 billion or something in the bio area for identifiable projects: like a bunker, how much does that cost? Hundreds of millions.

Sam Bankman-Fried: AI things are harder to think about from a cost perspective. Not to say cost doesn’t matter there, but it’s a little weirder to think about because I think it’s a little more bimodal. I don’t know, either it’s just how many servers do you buy, or it’s not just like that. And if it is, then that means there might be a gigantic money pit at the end for AI safety. But if not, maybe it just ends up being not super relevant.

Rob Wiblin: Ah, I see. So you’re saying that there could be this enormous money pit of tens of billions of dollars or more if it really matters who can buy lots of compute at some essential time when AI is making big advances. But if that’s not the case, then it can be a lot harder to see where you could spend tens of billions.

Sam Bankman-Fried: Exactly. AI is more of a thinking thing than a money thing, outside of that. But that might be a real factor.

Sam Bankman-Fried: Then you look at, I don’t know, politics and policy. I’m pretty compelled that if you think it matters — and again, if you don’t think it matters, then obviously the amount that you can spend on that is zero — but if you do think it matters, the kind of numbers you’re talking about are a billion every two years or something like that, that could be potentially usefully used. And that’s a fair bit. So that probably you should think of as like, a billion every few years is like the equivalent of 10 billion today or something. I don’t know, I’m making that up.

Sam Bankman-Fried: But putting these together, a few billion a year is… I don’t want to say it’s a lower bound, but that certainly isn’t my upper bound on this. I certainly think it could get a lot bigger than a few billion a year that could be really usefully spent. I don’t know if you could spend 10 billion a year really usefully. It actually wouldn’t completely shock me if that turned out to be true, but it maybe would surprise me a little bit. But yeah, I think the numbers are big.

Rob Wiblin: Yeah, yeah. An audience member wrote in with this question: “Sam’s wealth is conceivably a third of all EA donor-committed wealth. From a worldview diversification perspective or a moral parliament pluralism perspective, that level of concentration has some downsides. Is Sam, in his giving, explicitly going to try to preserve a vibrant marketplace of ideas and prevent funding capture, where folks go out of their way to avoid antagonizing Sam?” And also, I guess, trying to please your kind of idiosyncratic beliefs and preferences to an excessive degree?

Sam Bankman-Fried: Yeah. So going out of their way to avoid angering me — I’m not going to negatively fund. You can’t go below zero. And to some extent, you can always find funding from other sources, and so you can’t get more extreme than that. But putting that aside for a second, I think that we want to fund a pretty broad array of things. And for what it’s worth, probably more so than most, or at least many, people in EA, I do think that there are a pretty broad set of things that matter. I’m definitely not a “direct action on AI x-risk is the only thing that matters” type of person.

Sam Bankman-Fried: So I’d like to think that we are excited to do a pretty broad array of things. And if you look at our early RFPs and stuff, I think they reflect that. We’re also potentially looking to do a significant regranter program, where we give basically fully discretionary funds to people in the community to regrant. The exact shape of that is still under discussion, but I think that’s one way for us to diversify to some extent. And if nothing else, we’re not going to see all the great opportunities, and we don’t want an opportunity to be missed just because it’s not directly in front of us. So this is part of our attempt to address that.

Rob Wiblin: Yeah, yeah. Thinking about this issue of how you maintain lots of diversity of opinion: because of the way that business tends to work, a small number of people end up making much more money than most others, and you tend to have a lot of concentration of funding. But at the same time, we want to have lots of ideas floating around, and not have too much of a concentration of power over opinion. If you’re someone who’s funding 1% of all effective altruism–style giving, then it seems like you can just go with your own opinions — and fund the stuff that you think is great, and not fund the stuff that you don’t like.

Rob Wiblin: But as you get bigger and bigger, once you’re a third of all of the funding as a foundation — or half of it or something like that — then you might even want to go out of your way to fund some things that you personally don’t think are good, just because other people think they’re good. That could even be good by your own lights because of the risk that you could make a mistake. And even further, you could end up wanting to spend 1% of all of your money funding people to write about how you’re making terrible mistakes, and the stuff that you’re funding is a bad use of money or possibly even harmful — because that will allow you to potentially see the errors in what you’re doing from your own point of view. Do you have any thoughts on that?

Sam Bankman-Fried: Yeah. I think we’re going to have something more public coming out about this at some point, but very explicitly, we are excited to fund people to tell us how we might be wrong. I think that that’s something that we’ve already written down as being on the stack of things that we’re going to express potential excitement to fund.

Rob Wiblin: Yeah. It’s tricky with that, because it seems like almost by definition when you’re funding people to tell you that you’re wrong, it’s not useful if you already agree with them or agree with them too easily. So you kind of have to fund stuff you think is bad in some way — or at least mistaken or misguided in some way. And yet you have to figure out which of the mistaken or somewhat misguided stuff you want to fund in order to get more of it. I guess you’ve got to think, “Well, what is challenging me? That I still think it’s wrong, but it’s challenging me and it seems very thoughtful?” Maybe that’s the least-bad option here.

Sam Bankman-Fried: Yeah, I think that’s right. I think you have to try and find things which seem like not what you think, but not provably wrong. More like, “Oh, that’s different and interesting, and I could imagine a world in which that were right. And it’s not currently my best guess, but let’s see how I feel after seeing what they have to say about it.”

AI [01:12:41]

Rob Wiblin: Yeah. Let’s talk a bit more about artificial intelligence, and how that could influence the long-term future and what one might do about that for a minute. You’ve been running FTX for the last couple of years, which I imagine has taken up a lot of your time, and so you might not have had tons of spare time to be reading on the internet and forming your own opinions about this. But do you have any thoughts to share on what might be best to fund in terms of positively shaping the development, and ultimately the deployment, of substantially more advanced AI than what we have now?

Sam Bankman-Fried: Yeah. I think it’s super important and I also don’t feel extremely confident on what the right thing to do is. Not always the best combination. But whatever, it is what it is.

Rob Wiblin: This is a podcast. You can say whatever you like.

Sam Bankman-Fried: I definitely don’t feel confident that there’s just a binary thing here of safe or not safe, and that you figure out how to do the safe thing and then you press the safe button and you’re safe. I think AI could have a lot of different impacts on the world, and there’s a pretty wide range of them, from amazing to catastrophic and everything else in between. And maybe that just sounds like, “Well, obviously that’s what you’d think,” but I don’t think that’s clearly the consensus in EA. I think the consensus is in some senses closer to, “Well, let’s make sure it’s safe.” And I feel a little bit more like, “What’s that mean exactly?”

Rob Wiblin: It’s more of a continuum.

Sam Bankman-Fried: There’s a lot of things in the world.

Rob Wiblin: Or a lot blurrier.

Sam Bankman-Fried: Yeah. And so that’s a piece of this that is probably important to how I think about it. Outside of that — well, not just outside of that; I think it’s a little bit consistent with it — I think that differential AI progress matters a fair bit. I’m fairly worried about what happens if it’s nonaligned AI labs that end up getting to really powerful AI first, and that there might be a fair bit of lock-in around that. That might be a pretty crucial period, where whatever sort of design considerations the first person to get there has will impact the long-run vision of AI. And “long run” might not mean long run in years — that’s one of the weird things here, that the long run could mean the next two days, as AI scales up massively.

Rob Wiblin: Because you can pile a lot more compute onto one system potentially very quickly? Or you could get rapid improvements in the efficiency of it?

Sam Bankman-Fried: Yeah. So I think the self-improvement factor there is a big one. Could we end up in a world where pretty rapidly we go from zero to 60? Absolutely. So it could really matter what we look like as we’re doing that. I’m certainly not the only one to be saying this, so I don’t want to claim that this is exactly revolutionary, but I do think it’s important and I don’t think it’s something that has been really fully appreciated in some cases. So that’s one thing that I feel moderately strongly about.

Rob Wiblin: On that topic, an audience member wrote in the question: “In a crunch time, would Sam be willing to spend a big part of his fortune in a relatively short amount of time — like under a couple of years — on AI safety?” So could you try to blow 10 billion in three years?

Sam Bankman-Fried: I mean, yeah. But I kind of wonder why the question is a few years and not a few weeks. I mean, sure, but same thing for a few weeks or a few days. I don’t know, whatever makes sense makes sense. And in one sense, that’s a vacuous thing to say, but I think in this context it’s not vacuous, because I think that’s legitimately like —

Rob Wiblin: It’s anticipatable.

Sam Bankman-Fried: Yeah. I do basically think that it’s really important to be willing to do that. And I’d be really sad if I didn’t.

Rob Wiblin: So in terms of if there is a crunch time or crunch moment like that, when might it occur? People who are as informed as one can be about a speculative topic like this vary quite widely. At the nearest level, some people say five years or 10 years. Looking further out, some people think it’s actually going to be 50 years before we could have an artificial general intelligence that would be transformative in this way. Do you have any kind of inside view on that timelines question?

Sam Bankman-Fried: I’m skeptical of a lot of the views that people express on it. I’m particularly skeptical of confidence in it. If you want to say that the odds are not zero of very short timelines, then yeah, I agree. They’re not zero. Is there a chance of extremely short timelines? Yes, there is a chance of that. But a chance is not the same thing as a high chance. And I think that there’s a lot of people who have what seems to be a surprisingly binary view of that. My thought is like, “Look, we don’t fucking know what we’re talking about.”

Sam Bankman-Fried: What are the odds it’s in the next year? Extremely low, but not zero. What are the odds it’s in the next five years? I don’t know. Moderately low, but not implausible. What are the odds it’s in the next 25 years? Yeah, absolutely could be. What are the odds it ever happens? I don’t know. I’d be skeptical of someone saying more than 80% that there is ever transformative AI. And so I think it’s a pretty diffuse probability distribution.

Rob Wiblin: Yeah, I have the same view, which is kind of agnosticism, or just taking the range of opinions that educated people have and just saying, “Wow, we really don’t know within that range.” And then I’m really not sure what to do with that. Because I’m like, “Should we be talking on the show a lot more about the fact that in 10 years’ time there could be a massive revolution in how society functions?” I’m just not quite sure what that implies for my actions and I guess maybe for yours.

Sam Bankman-Fried: Right. No, it’s a good question. I think that’s a problem. I think people have not necessarily come up with extremely compelling, concrete things to do about that fact. I don’t want to say there are none.

Rob Wiblin: Yeah. People who are knowledgeable about machine learning and about AI development, any of them who want to work on finding ways of making AI more reliable in how it’s used, so it’s less likely to do something super bad and unexpected — I think that they should all have the funding that they want to pursue that work, as long as it’s broadly sensible. And then beyond that, I’m kind of a bit unsure.

Sam Bankman-Fried: Yeah, I basically agree. I do think that making sure that good positive work in that field does have funding is important, and we’ve been super excited to fund things there and have already started doing a bit of that. So yeah, that’s part of it. But outside of that, it’s not totally obvious. Other than I think it is another reason to be skeptical of not spending anything right now. If you do think that short timelines have nontrivial probability, then it actually doesn’t prove you shouldn’t be spending a shit ton right now, right? Because you could imagine taking the position that all our impact is in the world where that’s not the case. But I think it is another reason to be skeptical of spending less than a few percent a year.

Rob Wiblin: Even beyond the technical side of things, I feel very nervous about AI, in part because there’s this whole other issue of deployment and proliferation. Where now we have this technology that’s very powerful in terms of what it can think about and what it might be able to autonomously do if you tell it to do that, that could be turned towards hostile uses or aggressive uses or malicious uses.

Sam Bankman-Fried: Yep.

Rob Wiblin: And as the algorithm becomes more and more efficient — as it improves, and as we get better at making more and more compute — you have this problem that more and more people are going to have access to this very potentially powerful influential technology, and it seems difficult to stop that.

Rob Wiblin: The analogy people make with proliferation is to nuclear weapons. A difference there was that nuclear weapons were very hard to make in the first place, and they didn’t get that much easier over time. It’s plausible to keep the number of actors who can build a nuclear weapon relatively narrow. But with computing technology, it’s not like that — things that get invented tend to get spread very, very fast.

Sam Bankman-Fried: Like copy-paste.

Rob Wiblin: Exactly. And I just don’t know how we deal with that. I haven’t really heard a great proposal.

Sam Bankman-Fried: Yeah, it gets to a really nasty point, which is what’s the end game here? All right, so you have some plan for stability for an AI system — then 10 years later someone develops a rogue AI system and you’re fucked anyway. What gets us out of the time of perils there?

Rob Wiblin: I know that people have thought about this a whole lot more than I have. I haven’t seen great reports on it yet, but I suppose you’ve got to have some kind of standoff between different AI systems, where they each agree to kind of allow different interest groups to maintain their own sphere of influence basically. Something like what countries have now.

Sam Bankman-Fried: Yep. I think one thing that gets to is who has the power? Offense or defense? Can you create a situation where the offense doesn’t have the power, or at least where there’s mutually assured destruction or something? My thought is, I don’t know, there’s a chance of it. I certainly wouldn’t want to say, “No fucking way.” But I also certainly wouldn’t want to say, “Oh yeah, that’s the answer. That’s what we’re going to do. I feel safe now.” It feels to me sort of like, “Maybe you’d get lucky.” I wouldn’t bet on it. I wouldn’t bet on that being a real possibility. Although it could be, I don’t know. It’s not super satisfying, like, “Oh yeah, that’s the end game.”

Rob Wiblin: “We’re going to be able to handle this one.” All right, this is too depressing. So let’s push on and talk about politics for a minute.

Political interventions [01:21:53]

Rob Wiblin: As I mentioned in the intro, you were one of the biggest public donors to Joe Biden’s presidential campaign. And I’ll just quickly run through the reasoning, because I think listeners to this show are not going to be shocked by it. During each president’s term, about $20 trillion is spent by the US government, and yet the amount of money that is spent on each presidential campaign is about $1 billion. And there do seem to be substantial differences between the candidates and what they want to do, and how they’d like to spend that $20 trillion, as well as the military and the regulatory state and all of these other things that the president has influence over.

Rob Wiblin: So this ratio of the $20 trillion, say, plus a whole bunch of other stuff, to $1 billion is about a 20,000-fold multiple. It seems like maybe there’s a lot of leverage from adding something to that $1 billion. That’s the broad reasoning. And obviously you also need to think that Joe Biden would make a better president — which I suppose is the conclusion you drew after evaluating the situation. How do you feel about that giving in retrospect?

Sam Bankman-Fried: I basically agree with that logic, and I do have some regrets — I think the regrets are mostly not having given more. But whatever, I also just did not have nearly as much to give then. It feels like just a fucking eternity ago. I guess it was a year and a half ago. But the world was different then.

Rob Wiblin: An eon in crypto time.

Sam Bankman-Fried: Yeah.

Rob Wiblin: We’ve talked a little bit about this issue of political giving on the show before, and something that people have often suggested is that even though it’s only a billion, things like the presidential campaigns are a little bit saturated. They find it hard to figure out ways to spend more money, because so much of the influence is concentrated on a relatively small number of states with a relatively small number of swing voters. And so, just how many ads can you run on TV? How many times can you call these people, telling them to show up to vote? Maybe even $1 billion is actually getting you pretty close to finding it hard to spend more money.

Rob Wiblin: But then there’s tons of other political races that might be less important than the presidency, but are much less funded — where it’s very clear that your money really can shift the outcome. Do you have any thoughts on that?

Sam Bankman-Fried: I do. It’s definitely something I’ve heard. And my first response — which is not a super helpful response, but it is my first instinctual response — is that I agree one could argue that. Are you arguing that? Is that how you think the numbers turn out? It’s not how I think the numbers turn out. But I agree one could make that argument.

Sam Bankman-Fried: I feel like often when people make that argument, it’s a little motte-and-bailey sometimes, where they’re not actually trying to strongly claim that — or even weakly claim, or maybe even claim that that’s how they think the numbers turn out. But I want to drill down to like, are these people saying that they’ve done the math, and they think that it is not an effective use? Or are they just bringing up that there could be hypothetical worlds in which it was not an effective use?

Rob Wiblin: I think the argument isn’t so much that donating to a presidential campaign isn’t a good idea, but rather that there might be other, even more neglected and valuable opportunities within politics.

Sam Bankman-Fried: Yeah. My sense, when people make this argument, is that usually they are at least implicitly trying to argue that donating there is not a good use of money. Because otherwise, you could do both. Why not both, then? So, sorry, if the argument is just that there are good things to do outside of the presidency, I completely agree with that. There are absolutely good things to do.

Rob Wiblin: But you don’t buy that there’s no way to spend more than $1 billion over an entire presidential campaign usefully.

Sam Bankman-Fried: That’s right. And putting aside the other things, when you look into things done by the experts in various fields — campaign operatives would be one example — do you have a sense of, in general, how impressive those things generally end up looking?

Rob Wiblin: Personally? Not, not really, to be honest. You think it could be better?

Sam Bankman-Fried: Yeah. I think often the state of the art is surprisingly shitty. And the answer is, oh boy, I agree it’s better than a monkey would do. It’s not literally random, but it’s not super impressive, given…

Rob Wiblin: The stakes.

Sam Bankman-Fried: Yeah, the stakes, and given that you go in thinking, “Oh yeah, this is obviously a thing that people will have thought a lot about and worked on a ton. I’m sure the state of the art is really impressive here.” And you look into it and you’re like, “Oh, that wasn’t really impressive. That was kind of mediocre, actually.”

Rob Wiblin: I guess part of what might be going on here is that when people are thinking about shifting the spending from $1 billion to $2 billion on a presidential campaign, they are thinking about just scaling up exactly the things that they’re doing now. And you’re saying no, we should be thinking bigger. There’s a lot of other things that could be going on. There’s lots of ways we could improve the research, improve our understanding of what positions are good, and on and on and on. People need to expand their minds.

Sam Bankman-Fried: I think that’s right. It’s like, all right, yeah, if you do a really shit job, I agree. But what if you wanted to do a good job with that billion? Then do you think it would have impact?

Rob Wiblin: Oh, well, in that case, yeah.

Sam Bankman-Fried: Right. And one thing to point to here is there’s some cool studies — and I don’t know how much faith to put in these — showing that at least to some extent, in some cases, the average campaign ad has net zero impact. Literally none. It’s unclear if it’s even net positive. And I think a lot of people’s takeaway from that is campaign ads don’t matter, and it’s not clear that’s the right takeaway. A different takeaway one could have is, “But what if you only look at the good campaign ads? Is it that every ad is centered around zero?” And I think the answer is basically no, that’s not what it is — they’re on both sides of zero. But what if you only did the ones on the right side of zero?

Rob Wiblin: The ones that are good, yeah. Here’s an idea for you: back in the 2020 primary for the Democrats, I was pretty open to being in favor of whichever candidate was most likely to win the election. Because in terms of the policy outcomes that you would get, I thought it was going to be determined by the Senate and the House of Representatives — so it didn’t super matter what the specific policy preferences were of the president. And apart from that, it was also just very important to win the election.

Rob Wiblin: But I didn’t really have a very strong view about which candidate was most likely to win the election, if they were nominated. I felt pretty agnostic about that — and I still do, ex post. But it didn’t seem like there was an active effort to spend $100 million or $1 billion on a research project to answer that question. People did make arguments back and forth, but it seemed very informal and not very systematic, and not very open-minded. It was usually rationalizing people’s preferences for the candidate that they liked on the policy issues.

Sam Bankman-Fried: Yeah.

Rob Wiblin: Maybe we should try to have some neutral think tank that just asks the question, “Which candidate is most likely to win the election, and should they be nominated?”

Sam Bankman-Fried: Not a crazy idea. And I basically agree. The amounts spent in primaries are small. If you have an opinion there, you can have impact. And one crazy fact is: you know which campaign almost went bankrupt in 2020, causing the candidate to drop out of the race?

Rob Wiblin: Biden?

Sam Bankman-Fried: Yeah, that’s right.

Rob Wiblin: And McCain as well, I think, back in 2008.

Sam Bankman-Fried: Yeah. It’s wild.

Rob Wiblin: That’s the margin you’re operating on sometimes.

Sam Bankman-Fried: Exactly. And so, again, I think it’s back to this “if it matters, it matters” thing: if anything matters here, then there are really impactful things to do. And I think it probably does matter. It is unlikely to be the case that the answer is that all candidates are equivalent. That’s not my best guess.

Rob Wiblin: Yeah. I mean, there are some political systems that push really hard towards centrist positions, where candidates do end up being quite similar, but the US political system, evidently, is not one of them.

Sam Bankman-Fried: Yeah. That seems to at least not consistently be true here. I think we have some evidence about that. And yeah, I basically agree; I think that’d be a really impactful thing to look into. I think there’s a ton of things there, where if you both have developed opinions and have the means to have impact, that’s a powerful combination. And I think the scale that you can have impact there is pretty substantial. And another thing is, probably it’s bad — my guess, if I had to guess, is net bad — if a US presidential election is stolen.

Rob Wiblin: I think it seems bad to me.

Sam Bankman-Fried: It seems bad, right? It seems probably destabilizing in a bunch of ways. And interestingly, I actually think maybe it’s now a bipartisan opinion that the 2020 presidential election was in danger of being stolen. I think maybe people disagree over which direction that’s in, at least some people do.

Rob Wiblin: Yeah. I guess they both think the election was close to being stolen from them, and one group is right, I suppose.

Sam Bankman-Fried: Right.

Rob Wiblin: We could be agnostic about which one. Yeah, I guess that’s a case where a lot of people talk about how it’s really important to fund races over who’s the state returning officer, and the state attorney general, and the local county vote-counting person. These elections were, until recently, very obscure — they’re not the kind of thing that’s on the national political radar. But some of these people might end up being really important in deciding whether the votes determine the outcome in some future time, and $10,000 for one of them might help.

Sam Bankman-Fried: Completely agree with that, and I think that’s basically right. That’s another interesting place to look for having impact in politics. Another thing is you can look at things like the Electoral Count Act, you can look at various federal policies for vote counting. But yeah, I hear you: your state attorneys general all of a sudden look like they kind of matter.

Rob Wiblin: Yeah. Do you worry about blowback to you being political in the way that you are, in terms of regulatory policy, or people also just hating billionaire influence in politics?

Sam Bankman-Fried: Yes.

Rob Wiblin: Anything to add to that?

Sam Bankman-Fried: At some point, if having positive impact is your goal, there’s a limit to how much it makes sense to worry about the PR of having positive impact.

Rob Wiblin: People being mad. Yeah. I suppose I’m not super psyched about the influence of billionaires in politics. It seems problematic in some ways. And I imagine you feel the same.

Sam Bankman-Fried: Yep.

Rob Wiblin: It’s just that you also really want the US to remain a country in which the votes influence the outcome of elections in a pretty clear way.

Sam Bankman-Fried: That’s right. And I certainly feel a lot more comfortable impacting things in the making-things-more-democratic direction.

Rob Wiblin: Just finally, do you have any thoughts on how listeners who are interested in giving to political causes can get more bang per buck themselves?

Sam Bankman-Fried: Yeah. I think that there are people who are doing really good and interesting work on determining the best ways to do that, and reading what they write and listening to their recommendations can be pretty impactful. I think there’s some within the EA community who are doing that.

Rob Wiblin: Well, that brings us to the end of my questions about your philanthropic giving. Is there any message you’d like to emphasize or leave with the audience?

Sam Bankman-Fried: I think that giving really is, in the end, what’s most important in the world — and shooting really high with it and trying to have as much impact with your giving as possible is just extremely important. And that’s true of how much you give, but also of where you give. We now have the FTX Foundation website that you can go to for a lot more information on what we’re doing. Hopefully, it’s useful for some of you guys thinking about what you can do as well.

Sam’s views on productive applications of blockchain and crypto [01:33:20]

Rob Wiblin: Fantastic. Pushing on, this isn’t the main focus of the interview, but I am curious to ask you a couple of questions about blockchain, cryptocurrencies, and decentralized finance — or I guess cool people call it DeFi now — things like that. For a while, I’ve been unconvinced but open-minded about how many traditionally economically productive applications of blockchain technology there would ultimately turn out to be. Do you have a view on that question?

Sam Bankman-Fried: I don’t know for sure, but I do think that there will be a bunch. Some of these have to do with blockchain, some have to do with crypto, and some just have to do with market structure in a way that wouldn’t need to be crypto-specific, but I think often does turn out to be.

Sam Bankman-Fried: One thing that I feel pretty compelled by is just having equitable, direct access to financial markets. The current economic system is really difficult to get good access and outcomes from, for most people. If you want to go buy Apple stock and you’re a typical consumer, how many intermediaries do you think you’re going through, from start to finish?

Rob Wiblin: I’m going to guess it’s a few.

Sam Bankman-Fried: It’s a few. It’s like 10. It’s a pretty impressive number. What’s going on there is basically that you go from the broker to a payment-for-order-flow (PFOF) firm, to an alternative trading system (ATS), to another PFOF firm, to an exchange. There’s a clearing firm, a custody firm, and then the whole thing is repeated on the other end. What that means is that your actual access to most markets is real crappy. Most people are literally not allowed to see the order books that they’re trading on.

Rob Wiblin: These are the bids and offers, the outstanding orders to buy and sell, basically.

Sam Bankman-Fried: Exactly. You’re submitting orders blind. You’re trading kind of blind. You don’t see market data — that, you need to pay tens of millions of dollars a year for. That seems a little bit insane to me. One of the biggest points of markets is that you get price discovery from them. And if you’re not allowed to see the market data, that’s gating a really important piece of it behind tens of millions of dollars per year, per entity that wants to get market data. So that seems kind of fucked up to me. And I think it’s basically a serious problem with our current market structure for anyone but extremely sophisticated firms. And crypto, for a variety of reasons, is quite different in that respect.

Rob Wiblin: Yeah. The exchange just holds the asset and it’s just directly transferred. And there’s only one intermediary, rather than 10.

Sam Bankman-Fried: Exactly. Everyone has that same access and all the market data is completely free and public. So I think it’s more economically efficient, in some respects, as a market structure. The fact that the exchange can directly hold and clear and custody the asset means that you’re removing some of the intermediaries there as well, which is helpful. So I think that’s one piece of this.

Sam Bankman-Fried: Another piece is just payments. Payments infrastructure is really bad right now in most of the world. We casually give 3% of all of our purchases to credit card companies to cover over the fact that payments infrastructure sucks, and it takes months to clear. It’s just not a well-built system for most people. And I think, frankly, stablecoins actually just work a lot better on that front — to the point where if I want to send someone money, I would way rather send it via stablecoins than traditional systems. So I don’t feel at all conflicted about that.

Sam Bankman-Fried: So that’s one piece of this. If you want to send money back to someone in Nigeria, you’re probably paying 20% and taking a week. It’s a lot to lose on a remittance because of different payment rails in different countries, each one of which sucks. And I think blockchain stablecoins are a pretty good answer to that.

Sam Bankman-Fried: Then the last thing is an example I feel fairly compelled by: social media. What’s your favorite social media network? What do you use the most?

Rob Wiblin: I guess Twitter.

Sam Bankman-Fried: Twitter. So if I’m on Facebook and I want to message you, it’s not going to pop up on your Twitter feed or in your DMs there. Those are completely non-interoperable networks. I actually think it’s a little bit weird that that’s the case. Why are there 30 social media networks, none of which can talk to each other? That’s a pretty bad user experience. And I think the one thing that we all — as a nation, as a world — can agree upon at this point, is that bad things happen when one person is the moderator for all of our content.

Rob Wiblin: We’ve just got to make sure that they’re a philosopher king, Sam, that they’ve got the right idea.

Sam Bankman-Fried: Right, right. We tried seeing what happens when Facebook doesn’t censor, and everyone hated it. Then they tried censoring, and everyone hated it. And the answer is, it’s a tough —

Rob Wiblin: Do you think the solution is some sort of pluralism in the interface or pluralism in the filtering or curation?

Sam Bankman-Fried: Right, but with the same underlying messaging protocol that everyone can draw from. So if you had on-blockchain encrypted messages, then any user experience could draw on that same set of messages — you can send someone a message from Twitter and it appears in their WhatsApp. That’s fine, so you get interoperability. And from a censorship point of view, anyone can build their own layer on top of it that does or doesn’t censor however they want, and there can be an actual competitive marketplace for it. So that’s a vision that I feel moderately compelled by for social media, as being better than the status quo.
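The architecture described here, one shared message layer with competing moderation layers built on top, can be sketched in miniature. This is a toy illustration, not a real protocol: the in-memory list stands in for an on-chain message log, and every name is hypothetical.

```python
# Toy sketch: one shared, append-only message log (standing in for an
# on-chain messaging protocol), with competing client "layers" that each
# apply their own moderation policy over the same underlying data.

shared_log = []  # the common layer every client reads from and writes to

def post(sender: str, recipient: str, text: str) -> None:
    shared_log.append({"from": sender, "to": recipient, "text": text})

def client_view(recipient: str, moderation_policy) -> list:
    # Any client can build its own view over the same messages, so users
    # on different apps still see each other's posts (interoperability).
    return [m for m in shared_log
            if m["to"] == recipient and moderation_policy(m)]

# Two competing moderation layers over the same data:
def allow_all(message: dict) -> bool:
    return True

def no_spam(message: dict) -> bool:
    return "spam" not in message["text"]

post("twitter_user", "whatsapp_user", "hello from another network")
post("bot", "whatsapp_user", "spam spam spam")

assert len(client_view("whatsapp_user", allow_all)) == 2
assert len(client_view("whatsapp_user", no_spam)) == 1
```

The point of the sketch is that moderation becomes a property of the client layer you choose, not of the shared protocol, so different filtering policies can compete without fragmenting the message data.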

Rob Wiblin: OK, let’s dissect some of those ideas one by one. The first one, about access to financial markets and disintermediation and so on, my initial reaction is: isn’t active trading among retail investors bad? I suppose I’m going to really antagonize a bunch of your customers and Sam Bankman-Fried fans who have come to listen to this episode, but I’m a chump, Sam. I take a bit of my salary every month and I put it into Vanguard Diversified Funds that invest in all kinds of different companies. I never sell it; I’m not going to sell until I retire. I just kind of buy and hold, and I don’t actively trade.

Rob Wiblin: I suspect that, for me, and for many people, that’s kind of the best thing to do. If I started buying and selling things all the time, I would lose money and I’d be wasting my mental energy. Do we really want to encourage more people to get into the market in that way?

Sam Bankman-Fried: Those are some valid points about extremely active trading by novices, trading back and forth without particularly adding information to markets. But let’s take another hypothetical. When you invest in Vanguard — and that’s what most people do: they’ll put in some ETF, some mutual fund, they’ll buy some stuff — what sort of stuff is it buying? What is your fund manager investing in on your behalf?

Rob Wiblin: I guess they’re going out into the stock market buying stuff, using this annoying process that you’ve described.

Sam Bankman-Fried: Right. So that’s one thing. Another thing they’re doing, by the way, is charging you fees because you can’t buy stuff yourself. But whatever, let’s put that aside for a second, although I do think that’s relevant. I think there are some other interesting things going on there. One of which is, what companies are you getting exposure to, if that’s your investing method?

Rob Wiblin: I suppose it’s publicly traded companies on stock markets.

Sam Bankman-Fried: Right. So there’s this weird thing where, also, you’re probably not going to be getting the best deals ever. The first $5 billion of value in any company goes to VC firms before it goes public, right? And by the time it’s public, it’s been so thoroughly researched by professionals that there’s no way that an individual trader could add real information to its market and make money as a result. So I think it’s sectioning off a lot of the most valuable investment opportunities from most people. And I think that’s not great.

Sam Bankman-Fried: I think it’s also the case that if you take a few steps up in sophistication — say a small trading firm. Is a small trading firm going to be willing to spend $15 million of expenses per year to get the connectivity you need to have real market access? I don’t know. That’s a lot. Maybe not. And so, even if you’re not talking about retail, but talking about —

Rob Wiblin: Professionals. It limits access. It limits the ability to do innovation and disruptive stuff as trading firms, even.

Sam Bankman-Fried: Yeah. I think that’s basically right. And so I think there’s costs there, as well.

Rob Wiblin: Yeah. OK, that’s interesting. I suppose maybe we’ll get a chance to come back to that some other time.

Rob Wiblin: On the social media and messaging thing: firstly, if everyone is sending messages to one another on a blockchain, doesn’t it mean that you have to have this insanely enormous blockchain that’s going to be thousands of terabytes of data that everyone has to be validating all the time, just to send messages?

Sam Bankman-Fried: It definitely does get more expensive. I think it’s not prohibitively expensive, but I do think this really drives the question of what requirements you have of a blockchain. So, how many social media messages are sent per second, ballpark, in the world?

Rob Wiblin: I guess I could expose something about my own use if I estimate from that. But I don’t know, a million?

Sam Bankman-Fried: Yeah. I think that’s the ballpark. Something like a million. I don’t know. You could think of it like, well, there’s 10 billion people — there aren’t actually 10 billion, but whatever, that’s about right. There are, what, 100,000 seconds in a day? And so if you take 10 billion and divide by 100,000, that gets you down to 100,000 messages a second, if everyone sends one message a day? Is that right?

Rob Wiblin: This would mean that, on average, each human was sending a message every 1,000 seconds, which seems kind of right — maybe, if anything, conservative. But a million to 10 million, maybe.

Sam Bankman-Fried: Yeah. So let’s say that’s a ballpark, very roughly. If you wanted to do this, you would need to have a network that could process about a million transactions per second (TPS). Ethereum right now can process about 10 TPS — not 10 million, but 10. So it’s not going on Ethereum 1. Eth2, it’s an interesting thing. Eth2 can process — well, it doesn’t exist, but it’s meant to process, I don’t know, 1,000 TPS per shard. So what’s per shard mean?

Sam Bankman-Fried: Well, you’ve got all of these copies of this blockchain running in parallel, and each one has a few thousand transactions per second. And with an expensive 30-minute-long process or something, you can try and synchronize these with each other. So, what’s that mean? It means you definitely can’t have social media on a shard of Ethereum — or on a shard of Eth2 even. Could you do it across many shards? Maybe. But if I’m trying to Facebook message you, and you’re like, “I’m using Ethereum shard 78,” and I’m like, “Aw, I’m using shard 173,” then we’re like, “Yeah, guess we can’t talk then.”

Rob Wiblin: You have to end the friendship.

Sam Bankman-Fried: Right. That’s right. That’s probably not the best outcome. So I think it’s tough. Now, how about newer blockchains? You have Solana, which is at like 50,000 TPS right now. That’s getting closer. It’s not enough, but that’s 1/20 of the target or something. So it could maybe handle it until 5% of the world was using this for their social media. You could handle all the United States social media or something, but not all global social media. That’s actually not crazy. That is some actual scale.

Sam Bankman-Fried: And this is dependent on that happening, but the anticipation is that it will likely scale over the next five years to be substantially bigger than it is today, and be able to handle probably something like a million TPS, but with higher error bars. I basically think that the fastest blockchains will probably, but not certainly, get to the point over the next five years where they are able to handle all the world’s social media on-chain. But it’s not a blowout that they will be able to.

Rob Wiblin: It’s borderline.

Sam Bankman-Fried: Yeah. It’s borderline. So I think that is a factor here: are we going to be able to fit it all on-chain? And I think so. I think it’d be pretty bad luck if we never got there. But it’s going to take the fastest chain we can make, effectively, to be able to.
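The back-of-envelope numbers in this exchange can be checked with a quick calculation. The population and throughput figures below are the rough ones quoted in the conversation; the messages-per-person rate is an assumption chosen to land in the one-to-ten-million-per-second range Rob suggests.

```python
# Sanity check of the throughput arithmetic in the conversation above.
# All figures are rough quotes from the discussion, not measurements.

world_population = 8e9               # the conversation rounds this to "10 billion"
seconds_per_day = 24 * 60 * 60       # 86,400, i.e. roughly the 100,000 used above
messages_per_person_per_day = 100    # assumed rate; one per 1,000 waking seconds or so

global_msgs_per_second = (
    world_population * messages_per_person_per_day / seconds_per_day
)
print(f"~{global_msgs_per_second:,.0f} messages/second")  # roughly 9.3 million

# Chain throughputs quoted in the conversation, in transactions per second:
eth1_tps = 10
solana_tps = 50_000
target_tps = 1_000_000  # the ~1M TPS needed for all global social media

# Share of the target load each could carry today:
print(f"Solana covers ~{solana_tps / target_tps:.0%} of the target")  # ~5%
```

At one message per person per day you get Sam's ~100,000 per second; at ~100 messages per day you get Rob's ~10 million, which is why a chain at 50,000 TPS sits around 5% of the requirement.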

Rob Wiblin: OK. Another naive concern I might have is that now all of my encrypted personal messages, all of my posts, are encrypted but public. So they’re all on this public blockchain that, in theory, is stored forever. If either person’s private keys are ever leaked somehow, then people can just read all of their messages and they can’t withdraw the information that they’ve put out publicly. To some extent, I might almost trust Facebook to store my messages more securely, because at least people can’t access their database directly.

Sam Bankman-Fried: Yeah. That is a worry. There are some things you can do to make that less bad, but they do have some costs. Rather than just directly encrypting it — so if someone else leaks their private key, it’s now public — you could do a thing where what you’re actually encrypting is a pointer to it, which is then stored on some other service or something.

Rob Wiblin: That at some point gets destroyed, I suppose.

Sam Bankman-Fried: Yeah. Somewhere it’s not held permanently. That’s a thing you could imagine doing, but it’s not a perfect answer. And I do think that concerns like that are real.
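The pointer-indirection idea Sam sketches can be illustrated with a toy example. The cipher below is a one-time-pad XOR used purely for illustration (a real system would use an authenticated encryption scheme), and the off-chain store is just a dictionary standing in for a deletable storage service; all names are hypothetical.

```python
import secrets

# Toy sketch of "encrypt a pointer, not the message": the permanent chain
# only ever holds an encrypted pointer, while the message body lives in a
# deletable off-chain store.

def xor_cipher(key: bytes, data: bytes) -> bytes:
    # One-time-pad XOR; it is its own inverse. Illustration only.
    return bytes(k ^ d for k, d in zip(key, data))

off_chain_store = {}   # stands in for a deletable off-chain storage service
chain = []             # stands in for the append-only public blockchain

def send_message(key: bytes, message: bytes) -> None:
    # Store the message body off-chain under a random pointer...
    pointer = secrets.token_bytes(16)
    off_chain_store[pointer] = message
    # ...and put only the encrypted pointer on the permanent chain.
    chain.append(xor_cipher(key, pointer))

def read_message(key: bytes, chain_index: int) -> bytes:
    pointer = xor_cipher(key, chain[chain_index])
    return off_chain_store[pointer]

key = secrets.token_bytes(16)
send_message(key, b"hello")
assert read_message(key, 0) == b"hello"

# If the off-chain copy is later deleted, a leaked key no longer exposes
# the message: the chain only ever held the pointer, not the content.
del off_chain_store[xor_cipher(key, chain[0])]
```

This captures the tradeoff from the conversation: deletability comes back, but at the cost of depending on some service that is not permanently on-chain.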

Rob Wiblin: Yeah. That’s something that would have to be dealt with. But I suppose you’re just generally optimistic that lots of these technical problems that people like me are going to be worried about will ultimately be solved in the fullness of time, because there’s so many minds working on it.

Sam Bankman-Fried: Yeah. I would not say I’m completely confident they will be, but I am optimistic they will be.

Rob Wiblin: So we’ve talked a bunch about markets and communication tech there. Are there any good examples of businesses producing real non-crypto-related, non-market-related goods and services today that wouldn’t have been possible — or would’ve been a lot less productive — if Bitcoin and all of the follow-on inventions had never existed?

Sam Bankman-Fried: I assume you’re not going to count remittances as well.

Rob Wiblin: So, I didn’t challenge remittances because I think remittances is legit. Although you might just end up seeing Western Union survive, but greatly reduce its profits.

Sam Bankman-Fried: Right.

Rob Wiblin: So they might reduce their rents. But that seems legit. I guess I’m curious, is someone making cars using this in some useful way, or someone giving massages or whatever?

Sam Bankman-Fried: Right. I think there are things you could do. I think there basically aren’t current examples outside of finance. I think there could be, and social media is one. I think others, like medical records: being able to have access to your own medical records, and being able to easily give another hospital access to your medical records, and things like that.

Rob Wiblin: Yeah.

Sam Bankman-Fried: No one’s actually really doing that right now, but in theory, that could be valuable. So yeah, I think the basic answer is not a lot today, but in theory.

Rob Wiblin: In theory, yeah. I suppose there’s going to be people in the audience kind of shouting at me in their heads, saying, “But what about normal markets? You don’t hold the stock market to this level.” I guess at some point, the hope would be that companies are getting funding through these systems. That they’ve found much easier ways to raise capital, say, to start a business. Then we really could talk about businesses that it would’ve been too prohibitive to get them off the ground, without these kinds of financial innovations effectively.

Sam Bankman-Fried: I think that’s right, and that is definitely part of the hope.

Was the way Sam made his money good or bad for the world? [01:48:21]

Rob Wiblin: Yeah. A listener wrote in — and I’m sure you get this question from time to time — they said: “There’s a lot of people who hate crypto and think it’s bad for society on the whole, for environmental reasons, among others. I’m curious as to whether Sam thinks the way he earned his money was positive or bad for society.”

Sam Bankman-Fried: I think it’s probably positive. For the reasons I’ve given, I think that it had positive impact on the world: financial inclusion, equitable access, better and more efficient markets. I don’t want to oversell that and try and claim something that I don’t actually think is true. And I don’t necessarily want to say that this is the best thing that you could imagine.

Rob Wiblin: That in terms of direct impact, that it was the best thing you possibly could have done.

Sam Bankman-Fried: Right. But I do think it was positive.

Rob Wiblin: I suppose maybe the main concern that people have is the effect on climate change — that it’s using tons of energy. Do you have any reaction to that?

Sam Bankman-Fried: Yeah. It is a concern. I think there are ways to deal with it. So first of all, this is exclusively a concern with proof-of-work blockchains. Proof-of-stake blockchains have effectively no climate impact.

Rob Wiblin: And they’re kind of a growing share — they’re likely to take over ultimately, right?

Sam Bankman-Fried: Yeah. So as of now, 80% of deposits and withdrawals on FTX use proof-of-stake blockchains. Even when people are trying to transfer a Bitcoin, they will often put that Bitcoin on a proof-of-stake chain and transfer it using that. Because even if you don’t care about the environment at all, just the economic cost of the energy usage is significant for proof-of-work blockchains. So in addition to climate reasons, for that reason alone, there have to be better ways to do it.

Sam Bankman-Fried: I think we’re definitely moving towards a world where the vast majority of transactions happen on proof-of-stake blockchains. That’s the biggest thing. We also buy carbon impact offsets for all of the proof-of-work transfers that happen. And we also invest in climate change R&D. But I think the biggest thing is just transitioning to mostly proof-of-stake. And certainly, as it scales, that all of that scaling happens through proof-of-stake networks.

Rob Wiblin: I suppose another response would be that you didn’t make Bitcoin; you’re a platform that people trade it on, and Bitcoin would still exist even if FTX didn’t exist. So to some extent, you are substitutable in that regard.

Sam Bankman-Fried: Yeah.

Rob Wiblin: Some listeners of the show, actually quite a lot, would like to have a positive impact on the world by working to develop decentralized finance and other blockchain-related innovations. How promising do you think that space is at this point?

Sam Bankman-Fried: It depends on what standard you’re holding it to, right? I think it’s fairly promising if you’re going to go do something finance-y — it has probably more positive impact than most things you can do in finance. And I think that probably extends to a lot of jobs. If the question is, “You can either work directly on AI x-risk or on blockchain tech: which is better?” I don’t know; I would say AI x-risk. So I think it’s a question of whether you’re asking, is this the most pressing problem for the world right now? Or, is this a net positive thing to do, with impact commensurate with or above its direct economic effects or something?

Rob Wiblin: Are there any socially positive or socially beneficial applications of blockchain tech that you’d want to highlight potentially for listeners?

Sam Bankman-Fried: I think remittances, equitable access to finance, and social media are probably the biggest things that I would point to right now.

Rob Wiblin: Yeah. I’m sure you get this question daily, but if you had to buy and hold one cryptoasset for 10 years, what would it be? Or if there’s multiple, let us know if there’s more than one.

Sam Bankman-Fried: I definitely want to be careful about giving financial advice here. I will give maybe a little bit of a non-answer, but I will say that technology-wise, I definitely think that Solana is the layer-1 blockchain I’ve been most impressed by. I think the team is quite strong, and that they’ve had the right vision for how to scale a blockchain massively. I think they’ve done a good job executing on it. Obviously, it’s up in price a lot over the last year and a half, and you can make your own judgments about what the fair value is for it. But in terms of the tech, it’s been the one I’ve been most impressed by.

Rob Wiblin: OK. Yeah. For listeners: diversify, diversify, diversify.

Sam’s personal story [01:52:34]

Rob Wiblin: Let’s back up quite a bit and talk about your personal story, because I think it’s super interesting. I mean, all billionaires are a little bit quirky and have their own interesting origin story, but I think yours is particularly interesting. Where did your interest in effective altruism and related ideas about doing good originally come from?

Sam Bankman-Fried: So originally-originally, my parents have been sort of interested in utilitarianism since way before I was born. And when I was growing up, I explored it a bit online and in theory was utilitarian, but without any particular follow-through. Somewhat intentionally, I don’t know; it seemed like follow-through would be hard and scary, and easier to sort of think without justification that maybe there’s nothing you could do.

Rob Wiblin: It’s interesting how, from one point of view, utilitarianism as a philosophy is incredibly pragmatic and practical, and yet it’s so hard to see what exactly it implies on a day-to-day basis. It seems like the philosophy was almost completely disconnected from people taking action until very recently, for some reason.

Sam Bankman-Fried: Yeah, I agree. It does seem like it took a long time for it to start to have more impact than just being a position people considered, a political position or a policy position or something.

Rob Wiblin: Yeah.

Sam Bankman-Fried: So I got to college, not doing much, and met a friend of mine who was in a bit of a similar position. And I think the thing that first became concrete to us was animal welfare and factory farms. That was a little bit of the proof that the answer isn’t that there’s nothing you can do. The answer isn’t that you can’t impact the world, nothing matters. I don’t know. It was sort of like, that’s a pretty massive-scale problem.

Rob Wiblin: Do you remember who persuaded you of that?

Sam Bankman-Fried: It was something that’d been bouncing around in the back of my mind for a while. I think Peter Singer was one of the people who first sort of inserted that in my mind, but I’d been sort of ignoring it. And then met my friend, Adam, at college, and we talked about it and we were both sort of like, “Yeah, I can’t really justify eating meat either.”

Rob Wiblin: Yeah.

Sam Bankman-Fried: And over the course of freshman year, he slowly went vegetarian. I tried to as well and made no progress. I don’t know. I would just try and decide what I wanted to eat, and think for a bit, and then get a cheeseburger. This is most of what I did for freshman year.

Rob Wiblin: Interesting. It didn’t immediately bite in terms of your behavior.

Sam Bankman-Fried: No. I tried to, but I just kind of failed. I sort of wanted to, but I didn’t know how to create that change in myself. And the willpower cost was pretty large.

Rob Wiblin: That slightly surprises me, because at this point, you’re just so committed to working really hard to change things. And yet it seems like that wasn’t an immediate thing — that characteristic took a while to germinate.

Sam Bankman-Fried: It did. I was probably at like 20% vegetarian or something by the end of the school year. And I was eating tofu one night, and a friend said, “Hey Sam, are you vegetarian now?” I said, “Yes.” And it was not true. I mean, I’d had a burger for lunch. But I said yes; it just seemed like the answer to give. And I haven’t eaten meat since then.

Rob Wiblin: Interesting.

Sam Bankman-Fried: Yeah. For me, slowly cutting it out just made absolutely no progress, and going cold turkey just worked immediately. It was like I had to reframe myself in my own mind as someone who didn’t eat meat. The problem I ran into is that otherwise, every single fucking meal, I would have a decision to make.

Rob Wiblin: It’s a decision.

Sam Bankman-Fried: And it was just brutal.

Rob Wiblin: And it’s terrible. Yeah. Completely.

Sam Bankman-Fried: So I did that, and I started getting more into factory farm animal welfare and exploring online. Went from blog to blog, and ended up on Felicifia, which is a utilitarian forum, now defunct.

Rob Wiblin: Yeah. So this would’ve been around 2010 or 2011 or something like that. Wow.

Sam Bankman-Fried: Yeah. It’s probably 2011.

Rob Wiblin: I don’t know whether web forums like that still exist, but this was a place where you could sign up and create an account, and then you would create threads and chat about topics that people would raise. This was a forum about utilitarianism, and it attracted pretty quirky philosophically interested people who were bouncing around lots of ideas that ultimately ended up getting picked up by effective altruism and longtermism.

Sam Bankman-Fried: Yeah. I think that’s right. It was a pretty cool place, actually. I was only there briefly; I came not that long before its demise. But yeah, that’s what started to introduce me to people who eventually were part of the EA community. And one day, neither of us can remember how, Will MacAskill somehow emailed me and said, “Hey, I’m giving a talk in Cambridge, Massachusetts. Do you want to get lunch beforehand?” And I did. And that was my first real formal introduction to EA. And that was sort of the turning point of where I actually became part of the community.

Rob Wiblin: Just backing up slightly, you said you were raised by two law professors who, I guess in their academic life, advocate a kind of utilitarian philosophical approach to law — and I guess to life as well, maybe.

Sam Bankman-Fried: Yeah.

Rob Wiblin: But I suppose, is it because they’re at the intellectual level, where they’re thinking about things philosophically, that maybe you got a bit stuck with the idea that the utilitarian thing is to hold utilitarian opinions, but not necessarily to actually take action on them?

Sam Bankman-Fried: Yeah. I think that it sort of had some impact on what they did, and over the course of their careers also, they’ve started working on more and more actionable things. They’ve both done a bunch of things outside of their sort of standard job remit at this point. And by this point, I think actually they are working on incredibly impactful things. But I think it was also not how almost anyone was thinking about the issues 30 years ago.

Rob Wiblin: Yeah. So often, people rebel against the attitudes and beliefs of their parents, but it sounds like you and your brother have ended up agreeing with your parents mostly. Do you have any idea of why that is? How you ended up having quite distinctive but similar views?

Sam Bankman-Fried: I don’t know. I didn’t feel a lot of need to rebel particularly. They seemed pretty correct to me. There’s some specific subtopics where I’ve disagreed a bit with them, but I don’t know, it was a good upbringing. There wasn’t a whole lot of anger or angst about it to cause that.

Rob Wiblin: OK. So basically, you just like them. You’re a family that gets along. So it’s easier for their views to get transmitted down the generations.

Sam Bankman-Fried: Yeah.

Rob Wiblin: Do you think there’s any biological aspect to the views you have? Is it just cultural, or do you think there could be more to it, like in terms of temperament perhaps, that you’re born with?

Sam Bankman-Fried: I think there’s probably more to it. Thinking quantitatively about things is probably a piece of this. Thinking on the more cognitive and rational side and less emotional side about impact on the world is probably part of this.

Rob Wiblin: It seems like emotionally, you’re relatively level.

Sam Bankman-Fried: Yeah.

Rob Wiblin: And possibly the other people in your family are. Maybe when you just feel kind of the same regardless of what the topic of conversation is or how your day is going, it’s easier to intellectualize things like this, and to take intellectually consistent philosophical views, rather than respond to how you’re feeling or what your intuitive reaction is to things.

Sam Bankman-Fried: Yeah. I think I tend to have relatively low emotional swings, and some of that is probably just genetic and environmental. Some of that I think is also a little bit trained — I sort of think it’s the right thing to do in analyzing things, trying to be dispassionate. And so I try and take my own emotional perspective out of it, and think more about just, “How does this fit in the world?” I try not to see things in the light of me personally, as much as I can.

Rob Wiblin: Yeah. Makes a lot of sense, that that leads to this kind of universal perspective, moral philosophy attitude.

Rob Wiblin: OK, so it was in the second or third year of undergrad that you met Will MacAskill. Somehow you two ended up connecting, and you started chatting about how you might have impact. Is that right?

Sam Bankman-Fried: Yeah. So we started chatting at lunch, and we still don’t remember how we got introduced. Basically, he sort of pitched EA to me. It seemed obviously right, and I started diving into the community and rethinking what I wanted to do with my life. I talked with him and talked with Ben Todd about what to do with my career. I sort of hadn’t had any ideas — I was pretty lost — and they pitched a bunch of things, including earning to give. And that seemed like a pretty compelling opportunity for me, given that I was a physics major at MIT who didn’t know what I wanted to do with my life. I had some friends who’d interned on Wall Street, and so I applied.

Rob Wiblin: It sounds like one of the biggest impacts that Will and this sort of intellectual scene had was just getting you to seriously ask the question, “What do my moral views imply about what I ought to do with my career?” Because it seems like it wasn’t a question that you’d been seriously dwelling on very much before.

Sam Bankman-Fried: I think that’s basically right.

Rob Wiblin: It’s super fascinating that it’s such an obvious idea to be like, “Well, I’m going to spend all this time in my career during my life. I have particular moral values. What do my moral values say would be the best thing to do?”

Sam Bankman-Fried: Right.

Rob Wiblin: Especially being open-minded about that and really thinking it through, it just was so uncommon — even for someone as incredibly smart as yourself. Basically no one was really thinking this way until kind of at least the 2000s.

Sam Bankman-Fried: I think that’s basically right. And I agree, it’s sort of weird and confusing.

Rob Wiblin: Yeah. Cool. OK, so you’re chatting with Will and Ben about various different things. Why did earning to give stand out for you?

Sam Bankman-Fried: Basically, part of this was a sort of differential skillset type thing. Like the physics major from MIT who doesn’t know what they want to do was the classic profile for someone to go on Wall Street, which was one of the highest potential earning careers. So that was a decent piece of this. And outside of that, it seemed like something I could try quickly: just try interning that summer and see what happened, and see if I was a good fit. And then I did things like, I talked to The Humane League, and was like, “Would you rather have me as an employee or my donations?” They were like, “Definitely your donations.”

Rob Wiblin: Right. In no way insulting.

Sam Bankman-Fried: And so yeah, I ended up doing that.

Rob Wiblin: OK, interesting. So you were kind of weighing up like, “Here’s the impact that I would have if I became a staff member of these organizations. Alternatively, potentially I could make enough money to fund five other people in my place.” It was that sort of reasoning that led you towards, “My comparative advantage in this system probably is the money.”

Sam Bankman-Fried: Yeah.

Rob Wiblin: In those early days, did you have any kind of reservations? Or were there downsides mentally to starting to take this attitude towards your career? Like maybe it started to feel like a burden in some ways, or you felt like, “Now I don’t feel like I have the choice to do whatever I want. I feel pressure.” Even just like internally imposed pressure to do the thing that’s most good?

Sam Bankman-Fried: There’s definitely a part of me that felt that, but it didn’t feel like the most important part of me. Maybe I’d analogize it to how you don’t eat meat, right? When you see a steak, are you like, “Man, it sucks that I can’t eat that”? I actually like steak. I would enjoy eating it, but I don’t. And sometimes I’m less happy maybe than if I did, but also it doesn’t feel like a sacrifice each time or something like that. It’s sort of like, that’s who I am. I think this felt a little bit like that. It felt like the right thing to do, and it’s helped me not dwell on alternatives.

Rob Wiblin: Yeah. Around that time, how did your ideas develop about what problems in the world were potentially most pressing to work on, where you ultimately might want to give your money? Was that a decision you were deferring to later?

Sam Bankman-Fried: Partially deferring. I think it sort of developed a little bit in the background and a little bit just with more exposure to EA. And I spent a while grappling with my thoughts on AI. I think it took me like a year or two to get comfortable with it as potentially the most important cause area. And even since then, I’ve had a lot of skepticism about a lot of the specific approaches to it. But I do think it’s worth splitting that out, and saying that it’s possible to be skeptical of every single approach anyone has ever tried to mitigating AI x-risk, and still think that it is the most important thing in the world if one were to find a good approach to it — which is closer to where I am now. I think it’s really important to have that second part, so you don’t get stuck and be like, “Well, I haven’t liked any before, thus this cause area doesn’t matter” or something.

Rob Wiblin: Yeah. So in 2012, 2013, I imagine you were meeting more people who were involved in effective altruism, and reading the kinds of things that were coming out of the Future of Humanity Institute and the other research sources around that time. How helpful did you find all of that?

Sam Bankman-Fried: Yeah. Not that helpful is the honest answer. I mean, it wasn’t bad. I think the high-level stuff I found helpful for getting me to think about the right things. But also, I felt that a lot of the more specific work was not adding that much on that.

Rob Wiblin: Yeah. I suppose your answer doesn’t super surprise me, but some people might think, “Oh, meeting all of these people made me more motivated. It made me feel more excited about the ideas, because I now had friends who have the same views.”

Sam Bankman-Fried: That is definitely true.

Rob Wiblin: OK, so there’s that aspect?

Sam Bankman-Fried: Yeah.

Rob Wiblin: Then maybe reading more that the people were writing about this stuff could have made you feel more intellectually convinced, or more like you’re not getting tricked by some stupid ideas. Or it could have had the reverse effect, potentially.

Sam Bankman-Fried: A little bit of both.

Rob Wiblin: Yeah, yeah, yeah. It seems like you formed a reasonable plan about what you wanted to do relatively quickly. So you weren’t like, “Oh no, I just really need to go away and learn all this stuff in order because I have a lot of existential angst about my direction.”

Sam Bankman-Fried: Yeah. There’s sort of parallels to people saying like, “You say that you’re a consequentialist, but what if being a consequentialist pisses people off and that has bad consequences? So that’s part of the consequence and that should go in your fucking calculation.” And similarly here, it’s sort of like, wasting time trying to decide what to do is part of what you’ve decided to do. That should go in your calculation of what your right career path is. Choosing to stall for four years and learn and then go down career path X is a different career path than just going straight down career path X. And you should compare those.

Rob Wiblin: Yeah.

Sam Bankman-Fried: And the straight down career path X might be a better decision than the waffling one. I think it often is. So that felt pretty natural to me that like, what am I really gaining by stalling here?

Rob Wiblin: Yeah. Back in those days, you had a blog where you were regularly writing about these issues, and I guess also sports analytics and election polls as well. Are there any posts for things that you wrote from that time that you remember fondly? I was writing around that time, and I cringe at a lot of the things that I wrote back then.

Sam Bankman-Fried: Oh, some of them are certainly cringey. I don’t know, I had fun. I enjoyed doing a lot of the stuff. And on the sports side, still, there’s a small part of me that really wants to be a baseball general manager and like, you know, fuck with the team. I mean, I feel pretty compelled that no pitcher should ever pitch more than two or three innings in a game ever. And — well, this might go away soon with rule changes — no pitcher should ever hit: we should always pinch hit.

Rob Wiblin: We can consider that as a new 80K cause area, maybe.

Sam Bankman-Fried: That’s right. I think so. Putting that aside for a second, I think that there were a few things that I got from that that have stuck with me. One of them is multiplicative causes: if you think the ultimate good is the product of a bunch of things, the math changes versus if you think it’s the sum of a bunch of things. If it’s a product of a bunch of things, first of all, it often encourages diversification. And second, it means that often a lot of the expected value is in the upside tails — you get sort of a heavy-tailed, roughly lognormal distribution.
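Sam’s point about multiplicative causes can be illustrated with a small simulation. All numbers below are hypothetical, chosen only to show the shape of the effect, and are not from the conversation: when total impact is a product of several independent factors, far more of the expected value sits in the top few percent of outcomes than when it is a sum of the same factors.

```python
import random

random.seed(0)

def draw_factors(n=5):
    # Five independent factors, each varying between 0.5x and 2x (hypothetical).
    return [random.uniform(0.5, 2.0) for _ in range(n)]

N = 100_000
sums, products = [], []
for _ in range(N):
    factors = draw_factors()
    sums.append(sum(factors))
    prod = 1.0
    for f in factors:
        prod *= f
    products.append(prod)

def top_1pct_share(xs):
    # Fraction of total value contributed by the best 1% of outcomes.
    xs = sorted(xs)
    return sum(xs[int(len(xs) * 0.99):]) / sum(xs)

# The product's upside tail carries a much larger share of the expected value.
print(f"top-1% share (sum of factors):     {top_1pct_share(sums):.3f}")
print(f"top-1% share (product of factors): {top_1pct_share(products):.3f}")
```

The additive total is roughly symmetric, so its top 1% of outcomes contributes only slightly more than 1% of the total value; the multiplicative total is right-skewed (approximately lognormal as the number of factors grows), which is the sense in which “a lot of the expected value is in the upside tails.”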

Sam Bankman-Fried: So that is something I kind of do think, and think that people underestimate that fact and don’t shoot high enough because of that. And then there’s just some things I enjoyed, like writing about how I think people overestimate the importance of old things and the goodness of old things. I think you should judge them by current standards, not historical standards.

Rob Wiblin: It’s a cause close to my heart as well.

Sam Bankman-Fried: Oh yeah.

Rob Wiblin: There’s a lot of path dependence in what people think is the very best.

Sam Bankman-Fried: Oh yeah.

Rob Wiblin: Yeah. The best time to have written a play if you want people to think it was great is the 15th century, apparently.

Sam Bankman-Fried: Apparently. I mean, every play written then is great. Who knew?

Rob Wiblin: Incredible. When you graduated, you went to work at a proprietary trading firm called Jane Street. Which quite a lot of listeners might have heard of over the years, because quite a lot of people involved in the effective altruism community, and people trying to earn to give in general, have gone to work there. The case in favor is relatively straightforward: people say it’s a really enjoyable place to work, you build up lots of really good skills, it’s enjoyable for the kinds of people who listen to this show. Also, it pays very well, so you can do a lot of earning to give there. But if you can recall, what was the best argument against taking that path?

Sam Bankman-Fried: I think the best argument against was just that there are other things I could do with my life instead, and maybe I should look at those. I don’t think at the time I felt there were any compelling negatives of Jane Street, so much as maybe other things are good too.

Rob Wiblin: What other things did you seriously consider then, that might have been competitive in terms of impact?

Sam Bankman-Fried: At the time — and I’ve obviously thought about it more since then — the biggest things I was thinking of, one was journalism. I don’t think I had an extremely concrete, “this is what I would do in journalism” type thing, although I blogged a little bit. But I think it was that journalists seem to have a really outsized impact on society — they’re people who millions hang on the words of. It has this nice property from the angle of being under-monetized, I think. If you look at the main impact of journalism, I think its impact on the world is way bigger than its actual compensation is, because you have all these people who really care what appears in print, but don’t actually pay much for it. Anyway, it seemed like maybe an outsized impact on society type of thing. So that was one direction.

Sam Bankman-Fried: I was thinking a little bit about potentially politics. Again, I don’t think I had an extremely concrete sense of what that would mean, but it seemed potentially high impact. I thought a lot of our politicians seem to have big impact on the world, and again, under-monetized — maybe means that it’s inefficient in the direction of more impact.

Sam Bankman-Fried: And maybe doing my own trading thing; I had no idea what that would actually look like at the time. Maybe going to work for an EA organization; I didn’t have concrete thoughts on which one or what I’d do there.

Rob Wiblin: What was the main factor that pushed you in favor of earning to give over those other ones?

Sam Bankman-Fried: I think it was partially that I interned at Jane Street and had a really good time there and seemed like a good fit, so that seemed like a positive update for that. The numbers of what one could potentially make there seemed substantial, so it seemed like this is a good fit; I could make a lot of money. Talked with people about what that would look like and in general, people seemed to think, “Oh yeah, that seems pretty good if you can do that.” And nothing else seemed extremely concrete and compelling — it was all sort of, I don’t know, “maybe there are other good things” type stuff.

Rob Wiblin: Yeah. Thinking back, I suppose a factor that would’ve loomed pretty large in my mind would be, what’s your comparative advantage relative to people that you are coordinating with? Like out of all the people we had, you probably weren’t in the top percent of writing ability across the EA community, because we just had some extraordinary writers at that time. But you probably were in the top 10% in terms of your trading ability, or earning capacity. Maybe that was an important factor.

Sam Bankman-Fried: I think that’s basically right. Yeah.

Rob Wiblin: OK. In 2017, you did a bit of a reevaluation of your career and you actually even did an internship at an effective altruist org — you tried out being a fundraiser at the Centre for Effective Altruism for a month or two. What were the key considerations that you were weighing up at that point?

Sam Bankman-Fried: This is looking at 2017, when I left Jane Street. Again, I don’t want to portray this as being more confident than it was, because I wasn’t super confident — this is all me just trying to make the best decisions I could, given incomplete information. But basically I just got out a piece of paper and forced myself for the first time in three and a half years — basically the first time since I joined Jane Street — to think quantitatively and moderately carefully about what I could do with my life. I just got out a piece of paper and wrote down what are the 10 things that seem most compelling to me right now, and evaluate the expected value of each of them, just ballpark it.

Rob Wiblin: I often find that that’s quite hard, or I feel like a lot of people might try to do that. They try to put down numbers and stuff, and they just kind of get stuck. So I’m not always sure how useful that is as a suggestion to give to people, but you found that it worked for you.

Sam Bankman-Fried: I did. And I agree that it’s pretty difficult. You can get a lot of different answers from doing that, so I don’t want to oversell it as a technique. I don’t think it would’ve distinguished between two about-equally-good opportunities and told me which one of them was slightly better or whatever. I think what it did do, which was pretty helpful, was force me to even just come up with order-of-magnitude estimates. Sort of just like, “Well, at least I can do something here. And that’s better than doing nothing.”

Rob Wiblin: I guess it might cause you to notice if something is just completely dominated, where it’s otherwise possible that if you were just being really vague about things, you might not even notice that.

Sam Bankman-Fried: I think that’s basically right. That was part of the impetus behind it. It was like, maybe it’s going to end up looking messy and indeterminate. OK, sure. And maybe it’s really hard to get a good estimate, but let’s at least see if that is the case.

Rob Wiblin: Yeah.

Sam Bankman-Fried: I think things like that can be super helpful. As an example, if you just write down a sort of Fermi estimate for how important is AI x-risk, I think you end up saying, “Oh wow, there are some big numbers here.” It doesn’t say conclusively that it is definitely the best thing or anything like that, but what it does is it says, “OK, this seems worth investigating more. I’ve got some giant factors here and I don’t feel extremely convinced that those should be dismissed.” How I was thinking about this was much more from the perspective of, basically, let’s at least just see if the answer is what the BOTEC does say, so to speak.

Rob Wiblin: “BOTEC” is a “back-of-the-envelope calculation” — an acronym for that.

Sam Bankman-Fried: Yes.

Rob Wiblin: Were there any things that you discarded at that stage?

Sam Bankman-Fried: I’m going to be honest, I didn’t get that far into the BOTEC. The reason that I didn’t get that far into the BOTEC was that I spent maybe two minutes starting to write things down, sort of, “Let me list out options.” Listed out three of them. By the time I’d listed out three or so options, I started to get a sense of where it was probably going to go. I’d sort of started from the top down of, “What are the most compelling things here? Which of these seem highest EV?”

Sam Bankman-Fried: The thing that I saw, which was really interesting, was first of all, really high uncertainty over everything. I really did not feel confident in most of this. Second of all, though, was there were a lot of things that I thought were kind of compelling to try doing with my life. They all were getting, I would say, comparable-ish, order-of-magnitude estimates for expected value, and it really was not clear how to order them. It was not clear where Jane Street fell within them either. I remained very excited about Jane Street, and it scored pretty highly on this — but so did like five other things.

Sam Bankman-Fried: I started thinking about this, and the sense I got pretty quickly was that I could think about this for a while, and I suspect where I’m going to end up is, “There’s actually a lot of plausible things here, and I don’t know which is best.” It seemed pretty implausible to me that I would end up thinking, “Oh yeah. Obviously X is the best here. It’s not even close.” That didn’t seem like it was going to happen.

Rob Wiblin: What did you do, given that?

Sam Bankman-Fried: Well, OK. There were a lot of things that seemed kind of compelling. The thing I was doing was one of them. And naively, that might make you think that the right thing to do is just keep doing that.

Rob Wiblin: You might think, “It’s kind of fine. There’s nothing obviously better, so I’ll just carry on.”

Sam Bankman-Fried: Exactly. That was one interpretation. It was not my interpretation. My interpretation of it was huge uncertainty: “I’m not going to be able to figure out which of these is best without trying.” You look at politics as an option, and who the fuck knows how that’s going to end up. It very well could be that it turns out being completely worthless, or it could be massive impact. I was pretty skeptical of my ability to do better than I’d done so far on the really rough estimate. What that made me think basically was, “The only way for me to really know what the right thing to do is to try. The only real way for me to resolve this is to dive in, start doing some of these things, and see which is highest EV in practice.”

Rob Wiblin: So I guess the value of information is super high if you’re very uncertain about a bunch of different options, but also you think you might be able to resolve the uncertainty about which one is best by actually giving it a go.

Sam Bankman-Fried: That’s right. Value of information is super high. Why does that mean not Jane Street though? Because that was one of the high ones. In the end what compelled me was, I’ve done a lot of research on Jane Street. It’s actually one of the lowest uncertainty ones. At that point I’d sort of determined it was about X, and X was about the average of the other top five things. I could stay there and just keep going on the trajectory, and I did feel pretty excited about that, but I wasn’t getting any more information from that.

Sam Bankman-Fried: I had already explored that path and it seemed pretty good. It seemed unlikely it was going to turn out to be literally the best of those options. Probably one was going to turn out to be better, although I didn’t know which one, and I probably could get a lot of information about those others. So it was high value of information because of the high uncertainty. And in particular, the path that I was on was the one for which I had low uncertainty. That was, in the end, life altering.

Rob Wiblin: So you were learning the least from that.

Sam Bankman-Fried: Exactly. That’s what compelled me in the end that I should try out something else.

Rob Wiblin: So you tried out a bunch of different things. One of the things that you started trying out around that time, not that long after that, was trying to run this cryptocurrency trading firm.

Sam Bankman-Fried: Yep.

Rob Wiblin: My initial reaction to that, when I heard about it on the grapevine, was kind of mixed. Because I was thinking, “Don’t we already have billions of dollars that Open Philanthropy is finding quite hard to give away as fast as it would like? Maybe rather than making more money that we’re struggling to disburse, Sam should work on the key bottleneck here” — which would mean becoming a grantmaker: trying to figure out how to take the money that’s already available and disburse it efficiently, rather than working on the stage before that and getting even more money to give away. Obviously, I feel like a bit of an idiot to have thought that — in retrospect, that opinion doesn’t look so good. What do you think I was getting wrong then, if indeed, ex ante, I was getting something wrong?

Sam Bankman-Fried: Yeah. It’s interesting. I’m not totally sure that getting something wrong is the right interpretation also. I see why it was compelling. I don’t know what the right thing was necessarily, and I don’t think it’d be crazy to have thought that this was not the thing you were most excited about of all these things — there’s a lot of exciting things here; maybe this doesn’t seem the most exciting.

Sam Bankman-Fried: One thing was that it had really high value of information. This was one of these things where in a month I was going to be able to significantly investigate this, and figure out how good it was going to be. The reason is, you just try trading and see if that made money — and if it didn’t, then that’s that. Whereas politics, that might be a 15-year roadmap before I had real information, or at least anything close to complete information about how it was going.

Rob Wiblin: I see.

Sam Bankman-Fried: That was one of the things: it was just something where I could literally just take a month and figure out, is this going to be compelling? The second thing is that the numbers just seemed really big. This is one of these things where you could say the same thing about Jane Street — about is money not what we need — and I think it’s a reasonable argument. The reason I was compelled by it was, it actually seemed like the upside was maybe a lot higher.

Sam Bankman-Fried: Here’s the ballpark calculation I went through. Lots and lots of ways to do this, but basically speaking, first of all, just looking at arbitrage for a second — just looking at cases where one thing was trading at a different price on two different exchanges. You have Bitcoin: at the time, late 2017, it was trading a few billion dollars a day of volume. On average, the spread between different exchanges was a few percent. Now, it was really messy and the data looked probably fake: some of the numbers are just too big. You see 20% arbitrages and are like, “All right, that can’t be real.” But, I don’t know, who knows what the real number is.

Sam Bankman-Fried: Let’s say we take this few percent, the median divergence between exchanges or something like that. Let’s say for some reason that it’s kind of real, or at least it’s somewhat real — it’s not totally fake. What would that then imply? You’ve got this few percent divergence; you’ve got a few billion dollars a day volume. Let’s say we were to trade 5% of crypto volume, trying to do arbitrage — so that means maybe $100 million a day of volume or something — and we were to make a percent on those trades. That gets you $1 million a day of P&L. That would double the amount of money in EA or something, at the time. That would’ve been huge if you could get that big, and obviously it doesn’t prove you can get that amount. Again, the data looked like it might be fake.

Rob Wiblin: But it seemed worth finding out. Worth taking a month to find out.

Sam Bankman-Fried: Exactly. That’s right. It seems like it’s worth taking a month here to figure out whether you could potentially double the amount of money in the movement. That was effectively what it seemed like.
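The ballpark calculation Sam describes is simple enough to write out. The inputs are the rough figures quoted above for late 2017, which Sam himself flags as possibly unreliable; “a few billion” is taken as $2 billion here purely for concreteness.

```python
# Rough inputs from the conversation (late 2017; data quality questionable).
daily_crypto_volume = 2e9   # "a few billion dollars a day" of Bitcoin volume
target_market_share = 0.05  # trade ~5% of that volume doing arbitrage
capture_per_dollar  = 0.01  # make ~1% on each dollar traded

daily_volume_traded = daily_crypto_volume * target_market_share
daily_pnl = daily_volume_traded * capture_per_dollar

print(f"volume traded: ${daily_volume_traded:,.0f} per day")
print(f"P&L:           ${daily_pnl:,.0f} per day")
```

With these inputs the arbitrage volume comes out around $100 million a day and the P&L around $1 million a day, matching the figures in the conversation. The point of the BOTEC is that even order-of-magnitude inputs were enough to show the opportunity could plausibly double the money then available to EA.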

Rob Wiblin: This is a good example where I think the concept of key bottleneck is something worth keeping in mind, but it can be swamped by simply you having too much impact on something that’s not the key bottleneck. You can imagine, if you become a grantmaker, you might spend your day trying to figure out which five out of 20 different grant opportunities should we fund, and that would be great.

Rob Wiblin: But if you could make a sufficiently large amount of money earning to give, then you could just fund all 20 of them, and then you have done whatever good you could do with the five that you could have funded otherwise, plus whatever you get from the remainder. It can’t be the case that working on the thing you identify as the most likely bottleneck at any point in time is the only valuable option. It has to be possible to dominate it.

Sam Bankman-Fried: Yeah. I think that’s basically right. It’s one of these things where, all right, maybe this key bottleneck adjustment is a factor of three or something on how good things are. But some things are just more than a factor of three better than what the state of the art was thought to be.

Rob Wiblin: Yeah. I suppose at the time I was also a bit skeptical, because I was willing to accept these arbitrage opportunities do exist at any given point in time, but I would’ve thought surely other people will come in, surely this is going to end up being quite competitive. So sure, you’ll make $1 million a day initially, but other people are going to cotton on to this and it’s not going to last. But it seems like that’s not quite true. There have been ongoing opportunities to make substantial amounts of money in this kind of trading.

Sam Bankman-Fried: Yeah. I had some of that intuition too, like, “Look, even if these numbers are real, they won’t last” — seems like a good bet for numbers that seem too big. Part of this was like, “I don’t know. Maybe this is a thing which is great for a few months. Then we make $30 million and then it’s gone” — and that still seemed worth it, for a few months.

Sam Bankman-Fried: But part of it was also — and this was drawing on some intuitions that had sort of built up over time — the sense that the world is not very efficient, and sometimes it’s sort of hilariously inefficient. That there’s a limit to how skeptical you should be of your ability to out-compete the world, even if it seems like the numbers are too big. Even if it seems like it’s implying that you’re going to make too much doing something, there’s a chance that’s real and there’s a chance that you can actually sustain it.

Sam Bankman-Fried: Being first is worth a fair bit. You get there before the other players, you build something up, become an institution in the space. I definitely didn’t feel like obviously this was going to be a long-term thing at the time, but I thought there was a chance that we would be able to snowball intuitions and context in the space, and scale up quickly.

Sam Bankman-Fried: And it had this nice property of being a pretty isolated ecosystem in terms of market structure. If these were things that were primarily trading on CME, it would be a lot less compelling to go down this path — because as soon as real institutions get involved, they have massively better connectivity than we do, and we’re just totally outclassed. Because this was this weird isolated ecosystem, it was going to be a lot easier for us to build up infrastructure, and the infrastructure other people had was going to be a lot less applicable. So that also was a reason that I felt more compelled than I otherwise would have.

Rob Wiblin: Yeah. That makes sense. It’s a big question how efficient the world is in general. Because that kind of high-level worldview factor ends up influencing a lot, like how skeptical you are about opportunities that you see to do something incredible, like, say, “We’re just going to arbitrage and make $1 million a day or more.”

Rob Wiblin: I suppose, because my training is in economics, I tend to come with this preconception that things are quite efficient, because people will have taken most of the opportunities. But gradually that view has degraded over time as I’ve seen more and more people — who I know who are very talented, but not the best in the world at everything — just go and accomplish incredible stuff. I’ve seen them go from the stage where they’re like, “I’m not sure whether this is a legitimate opportunity” to the stage where they have kicked ass on a global level. It makes you think that maybe there just are a lot of opportunities out there.

Sam Bankman-Fried: It’s weird, because this is one of the most fundamental properties of economics, and one of the most deeply held. In particular, this is what you understand if you’re an economist, that you don’t if you’re not: if you think you see an opportunity, you’re wrong. It really is a core principle of it. I feel a lot more complicated about it now. I definitely feel like it’s a lot less ironclad than I used to, and I definitely feel a lot more like, “No, everything’s kind of shitty. You can try and out-compete anything.” But I don’t feel like that on a completely absolute scale. I don’t just feel like I’m skeptical of all claims that you can’t out-compete it.

Sam Bankman-Fried: I guess the way I feel is something more like, you should assume that the real efficient-market hypothesis is something more like: if you don’t try harder and do better than other people at a thing, then you’re probably not going to make money at it, or make more money than other people would be making from it, or something. It seems like a little bit of a modified version of the efficient-market hypothesis.

Rob Wiblin: Yeah. I think it’s not actually in tension with economics properly understood, because the actual efficient-market hypothesis would be that you exist: that this stuff does end up gradually becoming more efficient, because someone like you becomes super rich solving all of these arbitrage opportunities. But someone actually has to do it. The thing is, the world is just changing and opportunities are coming and going sufficiently quickly. The number of people who are really willing to dive in and take a risk and do something new like this is sufficiently small that your odds, if you try, are just not that low.

Sam Bankman-Fried: Yeah. I think that’s basically right.

What people get wrong about Sam’s success [02:29:49]

Rob Wiblin: What do people commonly get wrong about why you ended up having so much success in this area?

Sam Bankman-Fried: I think for a lot of people, they just don’t have a model for how it happened. It’s just sort of this weird property of the world; it’s a little bit inexplicable. I don’t know, it happens sometimes: you look at someone and they have incredible success, and you’re like, “Huh. That person is really successful.” It’s sort of like when people think about why was Elon Musk so successful, or why is Jeff Bezos so successful? Most people don’t really have an answer for that, because they don’t even see it so much as a question they’re asking. It just is this weird property of the world, that they were.

Sam Bankman-Fried: But my felt sense — from having been through a lot of it — is that, to the extent there are multiplicative factors in what’s going on (and I do think there are), your ultimate “how well you do” is a product of a lot of different things. One thing that implies is that, if it’s a product of four different things, then in order to get anywhere near the peak, you need to do well at sort of all of them. You need to be pretty good at all of them. It’s a high bar.

Rob Wiblin: Yeah.

Sam Bankman-Fried: You can’t skip leg day, so to speak.

Rob Wiblin: What does that mean?

Sam Bankman-Fried: You can’t be like, “I’m going to be really good at some set of things and just ignore the others” — you just lose that multiplicative aspect of it. Obviously, some things are additive, and you can sort of ignore those.

Sam Bankman-Fried: So we had to be good in a number of different realms. We had to be really ambitious. That was an important part of it. It would have been so, so, so easy for us to fail to accomplish what we did if we’d just decided our goal was a lot lower. Or, in a lot of ways, just gotten lazy when we started doing well and been like, “Ah, we’ve done well. No point trying anymore.”

Sam Bankman-Fried: But also, just a lot of strategic decisions, where it’s like, “Are we willing to take any risk in our trading?” If the answer is no, it’s going to really limit the amount of trading we can do, but it is a safer thing to do. That’s an example of a question that we had to face and make decisions about. Another part of this was just aiming high — or not so much aiming high as aiming to maximize expected value, is really what I’d say.

Rob Wiblin: If I remember, it seemed like in those early days, you were often doing things that created some risk of going bust, but offered the potential of making manyfold more money. That was kind of your modus operandi.

Sam Bankman-Fried: Yeah. I think the way I saw it was like, “Let’s maximize EV: whatever is the highest net expected value thing is what we should do.” As opposed to some super sublinear utility function, which is like, make sure that you continue on a moderately good path above all else, and then anything beyond that is gravy.

Sam Bankman-Fried: I do think those are probably the right choices, but they were scary. I think even more so than some chance of going bust, what they sort of entailed was that we had to have a lot of faith in ourselves almost — that they really would have had a significant chance of going bust if we didn’t play our cards exactly right. There were a lot of things that were balanced on a knife’s edge. Any amount of sloppiness would have been pretty bad. I also think it was a little bit of a thing of, could we play this really well?
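The distinction Sam is drawing can be made concrete with the hypothetical gamble from the episode’s introduction: a guaranteed $1 million versus a 10% chance of $15 million. The log utility function and the $10,000 baseline wealth below are illustrative assumptions, not anything from the conversation:

```python
import math

# Each lottery is a list of (probability, payoff) pairs.
safe = [(1.0, 1_000_000)]               # 100% chance of $1M
risky = [(0.1, 15_000_000), (0.9, 0)]   # 10% chance of $15M, else nothing

def expected_value(lottery):
    return sum(p * payoff for p, payoff in lottery)

def expected_log_utility(lottery, wealth=10_000):
    # A sublinear (log) utility over existing wealth plus the payoff.
    return sum(p * math.log(wealth + payoff) for p, payoff in lottery)

# An EV maximizer takes the gamble ($1.5M expected beats $1M guaranteed);
# a log-utility maximizer plays it safe, since the 90% chance of ending
# up with nothing hurts far more than the huge upside helps.
```

For personal consumption the sublinear picture is the sensible one; the point is that a donor whose impact is roughly linear in money should be looking at the expected-value column instead.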

Rob Wiblin: Just to back up and talk about the multiplicative model of entrepreneurship or productivity that you were talking about, this is the idea that your output is determined by multiplying together a whole bunch of different factors — like how good you are at all these different sub-skills of the thing that you’re trying to do. Which produces quite different results than what you get if you’re just adding together your skill in a bunch of different areas.

Sam Bankman-Fried: Yeah.

Rob Wiblin: Basically it means that you could be sabotaged by being extremely weak in any one area: if any of the things you’re multiplying together is zero or close to zero, then the whole project produces no output.
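As a toy illustration of the two models (the skill scores below are made-up numbers purely for illustration):

```python
from math import prod

# Four sub-skills, each scored 0-1.
specialist = [1.0, 1.0, 1.0, 0.05]    # world-class at three, very weak at one
all_rounder = [0.8, 0.8, 0.8, 0.8]    # pretty good across the board

def additive_output(skills):
    return sum(skills)

def multiplicative_output(skills):
    return prod(skills)

# Added up, the two profiles look similar (3.05 vs 3.2). Multiplied,
# the specialist's weak link is fatal: 0.05 vs roughly 0.41.
```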

Sam Bankman-Fried: Yep.

Rob Wiblin: Do you want to elaborate on it a little bit more?

Sam Bankman-Fried: Yeah. I think it’s an important and a weird point. It’s not an absolute point. I don’t want to claim that in all cases, this is the right way to think about things or anything like that. What I’d say instead is something like, you should try and understand in which ways something is multiplicative — in which ways it is the case that, were that factor set really low, you’d be basically fucked. As opposed to, that’s just another factor among many.

Sam Bankman-Fried: What are some of those? One example of this, which I learned early on, is management. If you’re trying to scale something up big, and you’re very good at the object-level task but bad at managing people, and no one on the leadership team is good at managing people, it just becomes a mess. It almost doesn’t matter how good you are at the original thing — you’re not going to become great as a company. It’s really hard to substitute for that. It’s amazing how quickly things can go south, if organizational shit is not in a good state.

Sam Bankman-Fried: That was one example of a case where I originally didn’t particularly think of it as multiplicative, but I do think it was. And I learned that lesson eventually, that you can’t forget about that. I think there are a lot of other things like that that came up.

Rob Wiblin: Yeah. It’s a good example of the multiplicative effect. I suppose the multiplicative model is just kind of a model that can be helpful and is partially true and partially not true.

Sam Bankman-Fried: Yeah.

Rob Wiblin: But people have pointed out that founders falling out, or the original team growing a project coming to hate one another, is one of the main ways that a project fails. It’s a great example of how it kind of doesn’t matter how good a prototype they build or how good their accounting or ops systems are — if the people working on the project just end up despising one another, then it’s all for naught, basically.

Sam Bankman-Fried: Yeah. I think that’s basically right.

Rob Wiblin: I suppose there’s a few other things like that. And similarly, if they get on really well, but they’re terrible at designing a product, such that they’re never going to actually appeal to customers, then the whole thing is for naught again.

Sam Bankman-Fried: Yeah.

Rob Wiblin: It suggests that you kind of want an all-rounder or an all-rounder company or an all-rounder CEO. Or at least that an all-rounder is better than someone who’s exceptional in one area and really weak in another. Do you think that’s a reasonable conclusion to draw?

Sam Bankman-Fried: Yeah, with some caveats. I think it’s mostly right, but you have to be careful if you think about it that way. Again, I do think this is a reasonable way to think about it, in many senses, but you have to be careful that you don’t overdo it. And in particular, so OK, you go for the all-rounder approach. You don’t want to be left with a generic pile of mush, right?

Sam Bankman-Fried: Part of this is again saying, in order to reach an extremely good outcome, you actually need a lot of things going very well. So some of this is sort of like, if you’re not in that case, you just are not going to end up in the extremely good outcome. That’s sort of how it is. It’s sort of sad, but true. I think part of this is as much saying that as anything else.

Rob Wiblin: Yeah. I guess a modified version is that hopefully, the whole reason you’ve chosen to go into entrepreneurship on project X is that you’re amazing at some aspect of that thing, because you had discretion over what you were going to go into. So, why not choose something where at least you’re extremely knowledgeable about the product or whatever. And then, having gotten a really high value for that, on the rest of the stuff you want to do well enough that it doesn’t sabotage the project.

Sam Bankman-Fried: Yeah. I think something like that. There are ways that you can try and cover for some of your flaws. There are things you can do to make it such that they matter less than they otherwise would. You can be a little bit strategic about that.

Sam Bankman-Fried: Now, it’s always sad when you’re in covering-your-ass mode, so to speak. That’s not where you would ideally want to be coming from. But here’s one example that I do think can be helpful: you can choose an area where you are the first mover by a lot — like a consumer-facing business where your depth of product knowledge is not very good and you can only build an OK product, but you’re good at corporate strategy and shit — and that can potentially work.

Sam Bankman-Fried: Because you might end up in a position where just the brand value of having been first is worth so much, that even if your product isn’t the best eventually, if it’s the best in an open area where there are no competitors, that might be enough to build up a pretty big head start. Obviously it’s better and worth a ton if you can also be great at product there, but that is an avenue you can try and play.

Dealing with trouble within organisations [02:38:37]

Rob Wiblin: Yeah. So relatively early on with all the crypto trading stuff, there was a stage where a bunch of people on your team became kind of disillusioned and decided to leave the project. How did you deal with that, personally? Because I guess that’s a thing that many projects go through, both ones that ultimately fail and ones that carry on and succeed. But I think it’s often very difficult, from a mental health and motivation point of view, to get beyond that.

Sam Bankman-Fried: Yeah. It is tough. It was tough for me, certainly, to deal with that. It was not a particularly fun time, I’ll say. I think that part of it was, you have to make a decision about whether you’re going to soldier on or just give it up. If you are going to soldier on, if that’s your decision, then great: do it. You’ve made your choice; there’s no point in second-guessing it.

Rob Wiblin: Dwelling on the past?

Sam Bankman-Fried: Exactly. If that’s what you’re going to do, it’s what you’re going to do. Lean into it. That’s one piece of it. At that point, the only way out is through. Find the best paths you can.

Sam Bankman-Fried: I think a big piece of this also was being in a position where I felt that there was a compelling way through — where I felt deeply that there was huge upside there, and that we could potentially get there. Not that that’s proof that we would get there, but that we could. And the fact that we could meant that there was still something great worth striving for there.

Sam Bankman-Fried: Coming to a point where I felt like I understood what had gone wrong and what to do differently was really important as well. I think without that, there’s sort of this thing looming over you, of, “Everything will go to shit at some point, and I have no idea why.” Which is not a good place to be in, always looking over your shoulder for, “I saw a bad thing happen once. Maybe it will just keep happening.”

Rob Wiblin: So it’s much more reassuring to have a model or a theory for what happened and what you can learn from it. That makes it a lot more motivating to try to continue and do things differently.

Sam Bankman-Fried: Yeah. Yeah.

Rob Wiblin: Do you think people underestimate how possible it is for a company to end up being a huge success, despite feeling like a chaotic mess internally?

Sam Bankman-Fried: I think it depends on what a chaotic mess means, exactly. It seems like there’s one version of this where it feels like a mess because there’s an enormous amount going on and not all of it works.

Sam Bankman-Fried: So here’s a quote though. I think it was Elon — I don’t know, I forget who — where basically he said, you can think of everything internally at a company as a vector in a vector space. So they’ve got some magnitude, and they’ve got some direction. If they’re just pointing in completely random directions, you won’t go anywhere as a company: you’re just pulling yourself apart. If they’re all kind of net pointing in one direction, that’s when you move quickly as a company.

Sam Bankman-Fried: I think part of this is, if things are a mess internally, does that just mean there’s a ton of random vectors, but they’re all working, pointing hopefully in the right direction? And if not, they’re doing their side thing — that might not work, and that’s OK; it’s OK if some initiatives don’t work. Or is it that they’re fighting with each other? And that they’re pushing in opposite directions on the same project and can’t get along? It’s causing tension and decoherence.

Sam Bankman-Fried: That second thing is real nasty, but the first thing is totally fine. It’s totally fine to have 16 initiatives internally and just be like, “Yeah, our best guess is half of these will fail and the other half will be great.” That can work. But if it feels like there are 16 initiatives and your net score is going to be the number that succeed minus the number that fail, then you’re fucked.
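A rough sketch of the vector picture (the 2-D setup and the 16-team count are illustrative, and as Sam says, the quote’s attribution is uncertain):

```python
import math
import random

def net_progress(efforts):
    # Net company progress: the magnitude of the sum of effort vectors.
    x = sum(vx for vx, _ in efforts)
    y = sum(vy for _, vy in efforts)
    return math.hypot(x, y)

# 16 unit-effort teams, all pulling in the same direction:
aligned = [(1.0, 0.0)] * 16

# 16 unit-effort teams pulling in random directions:
random.seed(0)
scattered = [(math.cos(a), math.sin(a))
             for a in (random.uniform(0, 2 * math.pi) for _ in range(16))]

# Aligned efforts add to a net progress of 16.0; scattered ones
# largely cancel, typically leaving a much smaller net.
```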

Rob Wiblin: Yeah. I guess I was thinking the mess might be, you’re working on some project, and you’re working on some aspect of it, but then there’s another part of the project that’s not getting completed, say. Like the operation side of the delivery is not functioning, so you feel frustrated. It’s not that people are sabotaging the thing, so much as you need A, B, and C to all occur in order for something useful to happen — and A and B are happening, but not C.

Sam Bankman-Fried: Yeah. That makes sense. And that can also be a big problem. I think the big thing there is there’s sort of more and less resilient versions of that. You basically can imagine a version of that where you have A, B, and C, and you just never do one of the important ones. Because of that, you never actually net get anywhere as a company. And that can be really bad.

Sam Bankman-Fried: You can imagine another version of this though, where you have a lot going on internally. And it is quite messy and they’re reliant on each other and they’re not all perfectly timed with each other, but they’re all making progress. And yeah, sometimes someone finishes one part a little bit before the other part would be finished, and it’s a little awkward. Whatever, you’ll live.

Sam Bankman-Fried: In some sense, a big difference between those, I think of as: let’s say that A, B, and C are all necessary. Let’s say you’re in that world. And A and B get done, and C is just not making progress at all. What happens then? Do you just fail as a company? Or is there something that comes in and says, “All right, we’ve done A and B. C is really important. It’s now been a month. Nothing happened there. Let’s figure out what’s going on there and make sure C happens.” And get a new team on it, get more people on it. Diagnose what’s going wrong. Remind the people that it’s [important].

Sam Bankman-Fried: I think that gets to another thing that I’ve ended up feeling is really, really important for running a company, and I think Holden [Karnofsky] was one of the people who sort of helped me realize this. If you’re running a company, and you assign Bob the task of turning the widget, and the widget doesn’t get turned, it’s very tempting for your takeaway to be like, “Fuck Bob. Bob failed.”

Rob Wiblin: And by blaming Bob, I’ve solved the problem.

Sam Bankman-Fried: Exactly. Right. Let’s put aside blaming Bob for a second. Maybe the blame isn’t helpful. Maybe it’s not. Probably it’s not, but let’s even ignore that part. It’s missing the bigger picture, which is that the widget still hasn’t been turned. The important thing is, it’s my fault if the widget ultimately doesn’t get turned. Nothing else changes that. I can do whatever sort of mental gymnastics I want, but in the end I have to make sure the widget gets turned. And my strategy of assigning it to Bob was maybe just the wrong strategy. And instead I should have assigned it to Bill, or Jill, or I don’t know.

Rob Wiblin: Two people.

Sam Bankman-Fried: Or reminded Bob, or hired somebody, or done it myself.

Rob Wiblin: Yeah. Yeah. Or motivate Bob differently.

Sam Bankman-Fried: Exactly. Who knows exactly what I should have done, but somehow, apparently I was not doing the right thing.

Rob Wiblin: Yeah. It’s a very constructive attitude. I’d love to do an episode at some point on how civil aviation became so safe, because it’s interesting from a risk management point of view. As far as I understand it, one important aspect is that whenever they investigate a plane crash or an accident or anything like that, it’s never acceptable for the bottom line to be “the pilot made a mistake.”

Sam Bankman-Fried: Yep.

Rob Wiblin: Because the pilot is just a component of the plane that breaks like any other component sometimes. And you have to build the entire system around pilot failure, around human error. So if the pilot made a mistake and it caused a bad outcome, then it’s the system that’s broken, not the pilot. So you just view people like a piece of machinery, at least in this particular context. Not in a cruel way.

Sam Bankman-Fried: Yeah. I completely agree.

Sam’s day-to-day life [02:46:10]

Rob Wiblin: So I’m not going to ask so much about the recent FTX era, because it’s something you’ve spoken about on a bunch of other interviews that we’ll link to in the blog post associated with this episode. So let’s push on from that backstory to talk a bit more about what things look like for you at the moment. What does a typical day in your life look like?

Sam Bankman-Fried: I mean, it’s really all over the place. Just to give some recent days as examples, what did I do two days ago? So two days ago, I was in Munich at the Munich Security Conference, meeting with random heads of state and people in security roles in governments to talk about crypto policy.

Rob Wiblin: Why?

Sam Bankman-Fried: It seems, I don’t know, I could help them think about crypto policy. Maybe they could help us figure out where we should be. We could help shape policy in a constructive direction. I don’t know.

Rob Wiblin: Cool.

Sam Bankman-Fried: Yeah. It sort of seems like worth doing. So that was one day, but it’s not like that’s the answer, that I meet with heads of state on crypto policy. That’s usually not what I’m doing.

Sam Bankman-Fried: So what are other answers then? What do I have today? I have some interviews. I listened to Putin’s speech to see what was going to happen in Ukraine. I have a call about a potential partnership. I have an all-hands meeting. And then I have a bunch of “check in with this project” type stuff. You know, see what’s going on with this project. Does it need help? Are there things I should be doing that would be productive there?

Sam Bankman-Fried: So that’s a big piece of what I’ll do. We have these initiatives. Payments is one example: we’re trying to add more fiat on-ramps and off-ramps to FTX. Let’s talk to the people working on that. Be like, “Hey, how are things going? Are there blockers? Are there frustrating things? Do you need help on something? Talk to me about the progress recently.” If there isn’t much, where is it prioritized? Is it prioritized correctly? Do sort of generic project manager stuff. So that’s another thread.

Sam Bankman-Fried: I’m in DC every month or so to talk with lawmakers and regulators about crypto policy basically. And that’s sort of a big piece of this as well. It’s becoming an increasingly important piece over time. So that’s another thing.

Sam Bankman-Fried: And you know, it sort of goes on and on, in some sense with just random thing after random thing. And rather than thinking of it as like, “This is what I do,” I sort of, at this point, almost just think of it as, “You know, I do a bunch of crap. A bunch of random things is my job.” It’s whatever is important today. And that might not be the same thing as what was important yesterday. And that’s totally fine.

Rob Wiblin: Is that problematic as a CEO? If it’s just kind of random different things all the time, I wonder whether that creates a kind of uncertainty within the organization about who’s responsible for exactly what?

Sam Bankman-Fried: It’s a good question. And part of this is what’s determining what I do each day. A lot of it is like, “What do I need to be responsible for?” A lot of this is where do I see that there might be a problem right now, or that people might need help or a push? Or that there’s just an incredibly pivotal thing going on somewhere — I will often try and jump in there and pitch in.

Sam Bankman-Fried: And so part of this is at least an attempt to help with the problem of having too many stakeholders — or no stakeholders; it’s sort of similar — and make sure that we’re on top of the most important pieces. But it’s not like everyone knows, “Oh, today is Tuesday, and so today Sam is going to be thinking about X.” And that is definitely a messy piece of this.

Rob Wiblin: Yeah. One kind of school of thought I’ve heard about what happens to CEOs as their companies get bigger, is that ultimately deciding on personnel is a key leverage point, where you can actually get a lot out of a small amount of time. So deciding who are going to be the leaders of the different parts of the organization, and so hiring and promotions become very important. And I guess motivating the people in the layer of the org structure below you is super important. Is that kind of right? That those two things are what you perceive as core responsibilities?

Sam Bankman-Fried: Yeah. I think what I’d say is something like, having good people is a big part of it, but having the right structures for them is a big part of it too. We’ve seen firsthand that you can have great people in the wrong situation, and they’re net negative, because it’s very easy to be net negative. We’ve seen companies that hire 5,000 great people and are completely dysfunctional, and it’s not because it’s all their CIO’s fault or something like that. It’s very hard to figure out what went wrong exactly, but somehow the set of good people is worth a lot less than the sum of its parts.

Sam Bankman-Fried: So I think some of this is understanding what is going wrong there sometimes. What can we do differently? And I think a lot of that is about how people feel internally. How are they arranged? Incentivization, which is something you brought up, is really important. Are they incentivized to do a great job? You know, it sort of feels a little trite, but that is legitimately a thing that is often wrong: when push comes to shove, they don’t make any more money themselves if they do a great job than if they don’t. So are they going to do an OK job? I don’t know.

Moving to the Bahamas [02:51:23]

Rob Wiblin: Yeah. Interesting. So I think last year, or not that long ago, you moved to The Bahamas.

Sam Bankman-Fried: Yep.

Rob Wiblin: A lot of people would think you moved to The Bahamas to get away from taxes, but you’re a US citizen, and the US just taxes its citizens no matter where they live. So it’s not such a great tax dodge for you, at least not for Sam Bankman-Fried personally, and I imagine for most of your staff it doesn’t really help.

Rob Wiblin: But you moved there, I think primarily because it has clearer regulation of cryptoassets and exchanges, so you had a much clearer legal framework in which FTX could operate, basically. What’s an unexpected positive or negative about that move?

Sam Bankman-Fried: Yeah. That’s basically right. The big thing for us was we wanted to be in a jurisdiction that had a license for a crypto exchange: a place where we could have a license for our business and be regulated. And the number of jurisdictions that have that is shockingly small. It’s like, two. I mean, it’s more —

Rob Wiblin: What’s the other one?

Sam Bankman-Fried: Gibraltar has one, Cyprus has one, Singapore, Japan. But many of these are only partial licenses: there’s a license, but it only covers part of our business, not the whole business. It doesn’t cover, for instance, derivatives — which is two-thirds of our volume. And so very few of them actually license most of what we do. And that was one of the big things: how can we get a license for most of what we’re doing, or ideally all of it? And this was one of the only places that had that.

Sam Bankman-Fried: The other thing is, I think the biggest surprising positive of it — which maybe I should have thought through ahead of time, but I sort of didn’t — was people like visiting us here. It’s like, if you’re going to go on an expedition to visit someone, it’s a pretty nice place to go.

Rob Wiblin: You could do worse than The Bahamas.

Sam Bankman-Fried: Yeah. And because of that, we actually have a lot of counterparties who come and visit us here, and it’s really great. It’s a great way to get to know them. It’s a great way for them to get to know us. And overall, I’m just super happy that we have that relationship with people, and I think that’s happened because The Bahamas is a place that people want to be. That’s something that I did not sufficiently anticipate, but it has been really nice.

Rob Wiblin: Yeah. On that, FTX has been offering a fellowship for people who have an interest in effective altruism. I suppose one way of thinking of it would be a fellowship for people who plausibly in the future could receive some kind of grant from the FTX Foundation, or work at the FTX Foundation at some point. So you’ve been offering the fellowship to encourage people to come over and spend a little while in The Bahamas and work remotely, and I guess trying to build some momentum behind having a hub of people who have an interest in these topics over there.

Rob Wiblin: One worry I have about that is, are there enough people with that interest in the world to support another hub? We’ve got a group in California and a group in the UK, and then a few other groups around the world, on the East Coast, in Europe and Australia — but can we sustain The Bahamas as well? What do you think of that?

Sam Bankman-Fried: Yeah. It’s a reasonable question and I wasn’t originally sure. I’m still not sure what the answer is. I think part of our thought was like, “Let’s start talking about having something like this and see if anyone comes.” And if there’s no interest in it, then that answers the question: it turns out there was not enough interest for another hub. We have enough hubs.

Sam Bankman-Fried: We’ve had what was, at least to me, a somewhat surprising amount of interest, and it made me feel like there are a lot of people looking for something else in a hub. I certainly think that’s a piece of it, and I think it’s different for different people why they’d want to come. But the fact that people want to come, I take as a real sign. So that’s one piece of it, which I think is pretty relevant.

Sam Bankman-Fried: Then beyond that, it is great as an opportunity for us to get to know people in the community, for people to get to know us, and for people who are potentially looking at getting funded by us or working with us, as you said. So those are some of the other angles.

Rob Wiblin: Yeah. I don’t want to be too harsh. Hopefully there’ll be such growing interest in all of these topics, in all of these areas that you want to give away grants into, that even if we can’t sustain The Bahamas now, just give it a year or two, and we will be able to. You were saying that people are looking for something different. I suppose maybe what The Bahamas offers is that many of these people won’t live there, so it’s a place to go and experiment with something different, meet a whole wide range of people from all over the place. It’s a chance to get away from ordinary life, I suppose.

Sam Bankman-Fried: Yeah. I think there is something like that. And especially in the winter, a lot of people really don’t like being in the cold for the winter, and so I think that’s an opportunity too for them.

The importance of being authentic [02:55:59]

Rob Wiblin: Yeah. Preparing for this interview, I got a chance to listen to a lot of your other interviews on podcasts and read through a lot of your tweets, Sam. And I have to say, your style remains to really let it all hang out there, and just be the same authentic Sam Bankman-Fried that you’ve been for the last 10 years — and, as far as I can tell, to kind of not give a damn what people are going to think of the CEO of a company saying all of this stuff. Did you ever seriously consider doing it differently?

Sam Bankman-Fried: Yeah, it’s a good question. And I think, look, there’s a time and a place to be really careful about what you say, and I don’t want to argue too strongly against that, or make it seem like that’s an absurd thing to think. But I think what I’d say is people will often decide that, just by default, they should be fake — that by default, they should be someone that they’re not — and I think that just rarely turns out well. You can look at politics as an example of this: you can see candidates who are really impressive people, but are generally thought to have been semi-intentionally fake while campaigning, because it felt like what one does, or something like that. I think people just don’t usually end up appreciating that.

Rob Wiblin: Yeah, people can see through it, or they can smell inauthenticity, I suppose.

Sam Bankman-Fried: Exactly. You can tell when someone is really not being authentic, and I think it just doesn’t end up doing you any favors.

Rob Wiblin: As you’re saying, there are occasions on which being inauthentic is the right thing to do. If you’re giving a eulogy at a funeral and you didn’t like the guy, then maybe you just say the right words. Were you just your normal self in front of Congress, for example?

Sam Bankman-Fried: That’s an example where I’m really careful about what I say. I think I approach it from the perspective of, only say things that are true and that I believe and that I think are important. To give a trivial example of this, I swear a lot normally — I don’t know, it’s a part of my lexicon. When I was speaking in front of Congress, I did not swear. Again, that’s a trivial example, but so what? That’s how things are, and I think that’s one example of a way in which I was not just my normal self in front of Congress, but in general —

Rob Wiblin: But on a deeper level. Yeah.

Sam Bankman-Fried: Yeah. On a deeper level, if your plan is to be someone you’re not, it’s just not exactly going to cohere long term. I think it’s sort of a recipe for ending up tying yourself in knots and not ending up where you wanted to be.

Rob Wiblin: Well, people come to then expect this false personality from you, and then it’s like now you’ve got to keep up this ruse forever, and it’s hard to sustain.

Sam Bankman-Fried: Yeah. I think that’s exactly right. It’s like, you argue for a certain policy that you don’t think is the right policy, because you think it’s the thing one does there. But now you’ve got to support that policy forever. Eventually, you’re going to be backed into a corner.

Rob Wiblin: So you are still being authentic. Do you worry that, because you’re now a big deal, it’s going to be harder for you to get the truth out of people, or to hear gossip about the latest research or what projects are worth funding and what projects are not? Is that an issue?

Sam Bankman-Fried: Yeah, absolutely. It’s something that I absolutely do see sometimes. One example is I’m always worried that people will not be straightforward with me about the negatives about the things that they think I’m doing wrong, or that they think we’re doing wrong as a company, because they don’t want to —

Rob Wiblin: Antagonize you.

Sam Bankman-Fried: Yeah. Exactly. I do think there are a lot of things like that, that I’m becoming increasingly at least a bit nervous about. And I don’t know the answer to that. I don’t have a, “And this is how you solve that problem.”

Rob Wiblin: Yeah. I think probably the only way that you can really solve it is to just have a very long track record of demonstrated good responses to negative feedback, and to using information that people give you very responsibly.

Sam Bankman-Fried: I think that’s probably right.

Rob Wiblin: And then gradually people will come to trust you. Yeah.

Sam Bankman-Fried: Yeah. That sounds right to me.

Rob Wiblin: Yeah. It seems like it’s an issue that almost all grantmakers have, or rich people have. People start treating them differently in a way that slightly foils their plans.

Sam Bankman-Fried: Yep.

Rob Wiblin: I suppose it’s a good problem to have in the big picture.

Ways the effective altruism community could be better [03:00:29]

Rob Wiblin: Let’s talk a little bit about effective altruism, as we head towards the end of the interview. As people will have picked up, you’ve been participating in and following how the effective altruism community has been developing for over a decade. I’m curious to get your ideas on how it could be better. Maybe first, to put a slice of positive bread on a compliment sandwich, what do you find most useful about effective altruism as an intellectual and professional scene?

Sam Bankman-Fried: An enormous amount. Maybe the place to start is that I think effective altruism is great, and it has been my guiding principle forever, and so that’s just the first thing to say. The community has also just done an enormous number of really good things, and I think that sometimes gets a little bit forgotten.

Sam Bankman-Fried: I think that I make the same mistake as a manager: I’ll see someone do 10 things; eight of them will be great, two will be kind of mediocre. I’ll look at the eight great things and be like, “Great, no need to comment, all going well.” And the two things that they fuck up, I’ll be like, “Ah, here’s some constructive criticism on those.” From my perspective, I’m like, “This is great, they’re 80% good.” And from their perspective, they’re like, “Aw shit. Sam hates me. He’s giving me 100% negative feedback.”

Sam Bankman-Fried: So I think, as a community, we sometimes do that. And it makes sense that we do that, but it’s a little bit too bad. We’ll basically do a lot of things well as a community, but not everything, and we’ll focus on the things we did poorly, because those are the things we can improve. Again, that makes sense that that’s what we do, but ideally we should also be recognizing what we do well.

Rob Wiblin: Yeah. It’s worth just taking a moment to think about how any community — or indeed any person you know — how they could be so much worse. Think about all of the ways that your relationship with them could be terrible.

Sam Bankman-Fried: I know, right?

Rob Wiblin: All of the vices that they could have that they don’t. It’s chastening.

Sam Bankman-Fried: They could just be total shit. They could be horrific. Think about how bad they’re not.

Rob Wiblin: OK, so with that out of the way, what’s a mistake you think at least some nontrivial fraction of people involved in effective altruism are making?

Sam Bankman-Fried: Yes. So again, I want to be careful about this. I do think, by and large, we’re doing quite well. But what are things that could be better? I guess one thing is just shooting high and being ambitious. I think this is something that in general, the community is good at. I don’t want to frame this as like, this is a weird weakness of the community. But I think that so much of the value is in the tail cases — in the cases where things go better than just well — and that really does incentivize you to shoot extremely high.

Sam Bankman-Fried: I think that as a community, we’re good but not perfect at that, and that we’ll also often go for strategies that are not the highest-upside strategies, and are instead safer strategies. Again, these are often things that we’re even explicitly trying to protect against, but that we don’t do a perfect job of. One example is the framing of making sure that you have some positive impact in your life, rather than maximizing your expected impact.

Rob Wiblin: What do you mean?

Sam Bankman-Fried: So, if your goal is to maximize the expected value of the impact that you have, then I think it implies interesting things about how you should behave. And in particular, the expected value of how much impact you have, I think, is going to be a function sort of weighted towards upside tail cases. That’s what I think my prior would be. And if your impact is weighted towards upside tail cases, then what’s that probability distribution of impact probably look like? I think the odds are, it has decent weight on zero. Maybe majority weight.

Rob Wiblin: Majority weight on zero. Yeah.

Sam Bankman-Fried: Yeah. Probably, right?

Rob Wiblin: Or at least very close to zero. Yeah.

Sam Bankman-Fried: So I think there are really compelling reasons to think that the “optimal strategy” to follow is one that probably fails — but if it doesn’t fail, it’s great. But as a community, what that would imply is this weird thing where you almost celebrate cases where someone completely craps out — where things end up nowhere close to what they could have been — because that’s what the majority of well-played strategies should end with. I don’t think that we recognize that enough as a community, and I think there are lots of specific instances as well where we don’t incentivize that.
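The distribution Sam is describing, mostly weight on zero with a heavy upside tail, can be sketched with a toy simulation. All of the numbers below are purely illustrative, not anything from the conversation:

```python
import random

random.seed(0)

def safe_strategy():
    # Guaranteed modest impact, in arbitrary units.
    return 1.0

def tail_strategy():
    # Fails (roughly zero impact) 90% of the time,
    # but pays off 30 units in the other 10%.
    return 30.0 if random.random() < 0.10 else 0.0

N = 100_000
safe_ev = sum(safe_strategy() for _ in range(N)) / N
tail_ev = sum(tail_strategy() for _ in range(N)) / N

print(f"safe strategy EV: {safe_ev:.2f}")  # 1.00
print(f"tail strategy EV: {tail_ev:.2f}")  # roughly 3.0, despite failing ~90% of the time
```

Under these made-up numbers, the "optimal" strategy is the one that usually ends with nothing, which is exactly the point: most well-played runs of the high-expected-value strategy look like failures.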

Sam Bankman-Fried: There are all these cases where I think we don’t give enough attention to the high-upside impact you could have. Forget about the common paths, and forget about even the probability of success for a sec. Just think about what massive success would look like, and what would maximize your odds of getting there — and then evaluate that path, because I think it’s a pretty plausible one.

Sam Bankman-Fried: I think that often does imply, I don’t know, should you be trying to become a US senator? That’s a question that you could ask. I think the answer’s like, “Well, maybe.” Actually, if you do the math, it seems plausible. But if you do follow that, probably you won’t be one.

Rob Wiblin: Almost certainly not. Yeah.

Sam Bankman-Fried: Right. But that’s not a path that we talk about very much. I think people often sort of round the odds of that to zero or something in their minds. And I think it’s like, not zero. And on the flip side, there’s too much emphasis, traditionally, on making a bit of money, without having thought hard about whether that’s what you should be doing or not. I think that’s maybe another side of this.

Sam Bankman-Fried: Then the last thing is thinking about grantmaking. This is definitely a philosophical difference that we have as a grantmaking organization. And I don’t know that we’re right on it, but I think it’s at least interesting how we think about it. Let’s say we evaluate a grant for 48 seconds. After 48 seconds, we have some probability distribution of how good it’s going to be, and it’s quite good in expected value terms. But we don’t understand it that well; there are a lot of fundamental questions that we don’t know the answer to that would shift our view on this.

Sam Bankman-Fried: Then we think about it for 33 more seconds, and we’re like, “What might this probability distribution look like after 12 more hours of thinking?” And in 98% of those cases, we would still decide to fund it, but it might look materially different. We might have material concerns if we thought about it more, but we think they probably won’t be big enough that we would decide not to fund it.

Rob Wiblin: Save your time.

Sam Bankman-Fried: Right. You can spend that time, do that, or you could just say, “Great, you get the grant, because we already know where this is going to end up.” But you say that knowing that there are things you don’t know and could know that might give you reservations, that might turn out to make it a mistake. But from an expected value of impact perspective —

Rob Wiblin: It’s best just to go ahead.

Sam Bankman-Fried: Yeah, exactly. I think that’s another example of this: being completely comfortable doing something that in retrospect looks a little embarrassing. People will go, “Oh geez, you guys funded that. That was obviously dumb.” And I’m like, “Yeah, you know, I don’t know.” That’s OK.
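The fund-now-versus-keep-vetting tradeoff Sam describes amounts to comparing two expected costs. Here is a minimal sketch of that comparison, with entirely hypothetical numbers:

```python
# Hypothetical numbers for the quick-grantmaking tradeoff described above.
p_fund_anyway = 0.98    # chance that deeper vetting would still say "fund it"
loss_if_mistake = 1.0   # value lost on the grants that vetting would have caught
cost_of_vetting = 0.05  # impact forgone by spending reviewer hours per grant

expected_cost_fund_now = (1 - p_fund_anyway) * loss_if_mistake  # 0.02
expected_cost_vet_more = cost_of_vetting                        # 0.05

decision = "fund now" if expected_cost_fund_now < expected_cost_vet_more else "vet more"
print(decision)  # "fund now"
```

With these assumed values, funding immediately wins even though 2% of grants will look embarrassing in retrospect; the conclusion flips if reviewer time is cheap or mistakes are very costly.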

Rob Wiblin: Yeah. We’re slightly returning to themes that we were talking about earlier. Over the last year, I remember two conversations I had about this issue of risk and certainty in planning careers for impact.

Rob Wiblin: In one of them, I only realized very late in the conversation that the person had the impression that an important principle of applying effective altruist ideas would be trying to maximize the probability of having a reasonably large impact — which really isn’t the principle at all. What you want to do is maximize the expected value, which might imply an extremely high probability of having no impact at all, at least in areas where most of the impact comes in the cases where things go extremely or unexpectedly well — where you’re at the 99th or 100th percentile of outcomes.

Rob Wiblin: There was another case where I think someone was getting very stuck, because they were considering lots of different plans, and in all of them, they thought that there was a 20% chance that what they would do would be harmful. I was just like, “I just think in the area you’re working, you’re never going to get below that. That is actually pretty good.” As long as it doesn’t have a high chance of having a catastrophically bad outcome, a 20% chance of having a kind of bad outcome is actually just as good as it gets, because you’re working in a very uncertain area. And you just then have to evaluate, well, is there enough positive on the other side of things to outweigh that?

Sam Bankman-Fried: Completely agree.

Rob Wiblin: Yeah. It’s so easy to get stuck in that case, where you are just unwilling to do anything that might turn out to be negative.

Sam Bankman-Fried: Exactly. And a lot of my response in those cases is like, “Look, I hear your concerns. I want you to tell me — in writing, right now — whether you think it is positive or negative expected value to take this action. And if you write down positive, then let’s do it. If you write down negative, then let’s talk about where that calculation’s coming from.” And maybe it will be right, but let’s at least remove the scenario where everyone agrees it’s a positive EV move, but people are concerned about some…

Rob Wiblin: Yeah. An audience member wrote in this question for you: “What are Sam’s major disagreements with the EA canon?” I guess inasmuch as we have a canon. Maybe we should say “conventional wisdom” rather than canon.

Sam Bankman-Fried: Right. It’s a good question, and many of these are disagreements with either a little bit of a caricatured version of EA canon, or with EA canon circa 2018 or something, in ways that I think have gotten less caricatured over time. I want to flag that maybe this is a little bit of a strawman that I’m disagreeing with here.

Sam Bankman-Fried: One thing is, I feel like in EA canon, the fraction of our resources that should be spent on AI x-risk has been an extremely volatile number, historically. I feel like that number has gone from zero to 90% to 70% or something like that, but some people also think it should be 20%. But my sense is that, at least it swung a little bit too far in that direction, at some point.

Rob Wiblin: Too low?

Sam Bankman-Fried: Too high.

Rob Wiblin: Too high, OK.

Sam Bankman-Fried: And to be clear, I think it’s plausibly the most important cause. I think it’s one of the most important causes, and we should be thinking hard about it and funding anything that looks really good from that perspective. But I do think that at some point it sounded like we should do that to the exclusion of anything else, and like it’s dumb to work on anything but that.

Sam Bankman-Fried: Maybe that’ll turn out to be right, and I’m going to look dumb for having not thought that. I don’t know, I just think there’s a chance of that. I also think that I’m just not extremely compelled by some of the opportunities we’ve seen in this space. It’s one of these things where I think it’s incredibly important. I think it’s not incredibly tractable. I think it’s not incredibly intractable either. I think it’s somewhere in the middle on that. But I also think it’s not 50 times as important as anything else. I think there are other things within a factor of 50 of importance, which means the tractability angle can potentially compensate.

Rob Wiblin: Could be relevant, yeah. So if you were talking to someone who thought that it was more than 50 times as important, what might you say to try to convince them that it’s not?

Sam Bankman-Fried: I think I’d say, “What are the odds of AI x-risk in the next century? Name that number.” And first of all, if that number is less than 30%, then I feel pretty good saying, “OK, how about biorisk? How about nuclear risk? Here are other causes I think have more than a 50 basis point chance of x-risk.” That should, I think, make it pretty compelling that some things are at least within that.

Sam Bankman-Fried: If they say numbers above 50%, I mean, I even think arguments that there are things above 2% are pretty plausible outside of AI x-risk, but I’m also skeptical of a 90% claim about AI x-risk this century.

Rob Wiblin: Or you might worry about that judgment, or you might then think you need to be more humble about your understanding of how things are going to play out?

Sam Bankman-Fried: Exactly. What if we just get distracted from AI, or it just takes a really long time to get there? I don’t know, it’s not hard to get above 10% uncertainty in something.

Rob Wiblin: Or another disaster preempts it.

Sam Bankman-Fried: Exactly. Right. Some other x-risk happens before it. That’s another angle. Then the third thing I’d say is that even if you think that ultimately AI is all that matters, I think there are other things that flow through to it, but don’t look like it. Politics is one example.

Rob Wiblin: It creates both risk factors and security factors that could affect how AI plays out.

Sam Bankman-Fried: Exactly. Right. And I think they will have more than a 2% impact on how AI plays out. So even if you think AI is ultimately all that matters, I think things that dictate how society behaves in general are probably more than 2% of that picture as well. So again, I’m not trying to argue that AI is not important. I’m saying I think there are other things that at least are plausibly worth thinking about in addition to it.

Rob Wiblin: Or that funding something that’s great in another area might be better than funding something that’s mediocre, or less than mediocre, in AI.

Sam Bankman-Fried: Yeah. Again, I’m not saying every area has this, right? I’m not saying guide dogs are —

Rob Wiblin: It’s the best guide dog facility, yeah.

Sam Bankman-Fried: Right. But if you take the best —

Rob Wiblin: The next few things on the list.

Sam Bankman-Fried: Yeah, exactly.

Rob Wiblin: Yeah. I suppose an AI advocate would say that the thing that distinguishes AI is that it’s far more likely to cause complete extinction, because most of the other existential risks merely cause massive depopulation in reality, and then you can recover. Then they would also say that the thing that distinguishes AI is that if we can get AI right, then it will preempt all these other things, because we could use the AI to make the world safe. We’re running out of time, so we probably don’t have time to respond to that completely, but what would you make of that?

Sam Bankman-Fried: Yeah. So that first thing I think is an interesting point. I think that things that aren’t complete extinction matter — like, I don’t have the belief that as long as there’s 13 people left, we can rebuild. Or even like a million people.

Rob Wiblin: Not confidently, yeah.

Sam Bankman-Fried: Yeah. I think society might be fucked forever. I think that’s a real worry, but I can’t prove that’s true and I respect people who are skeptical of that view.

Sam Bankman-Fried: The other thing I’d say is, yeah, it’s true that AI could address other things, but other things could also make AI irrelevant if they happen first. So I think it’s one of these things where if your AI timelines are two years, then this does change the calculus quite a bit, because probably nothing else world-changing will happen in two years. But if your AI timelines are like 30 years, then I think 30 years is a long time, and that’s enough time for nuclear war to break out or something first.

Rob Wiblin: Yeah. It matters whether AI is being born into a world that’s a complete shitshow.

Sam Bankman-Fried: Yeah.

Rob Wiblin: An audience member wrote in: “Now that EA is less capital constrained — thanks in a large part to you, among a few others — what are the barriers to making megaprojects and other very good things happen and how can we get past them?”

Sam Bankman-Fried: So the first thing is that I think that, while it is less money constrained, I don’t think it’s not money constrained. I’m towards one end of the spectrum: I think there’s billions here we could spend well. But putting that aside for a second, another big thing is that we need founders of megaprojects: people who in particular will say, “I’m going to do this. I can be really good at it. I’m just announcing this is going to happen. I’m going to do it, and obviously I need funders and a team and shit like that, but I’m going to be the Schelling point for this. I’m going to be the one who makes sure that this project happens, no matter what, and will take the reins of it.” I think that that’s an incredibly important thing that we don’t have enough of.

Rob Wiblin: Yeah. Are there any other kind of skills or areas of expertise that you’d be really psyched for the effective altruism community to attract? Or I guess, alternatively, develop more of?

Sam Bankman-Fried: I guess like entrepreneurial skills, running an organization. But I think it is more just like —

Rob Wiblin: You don’t sound fully sold on it.

Sam Bankman-Fried: Yeah. I think trying something ambitious is really my answer. Try to build something ambitious — whether it’s a research model, a company, an organization, or something else. I think it’s a really good and useful experience.

Sam’s plans for the next 10 years [03:16:38]

Rob Wiblin: Yeah. Interesting. All right. We’ve covered a lot. We’ve reached the ends of all the questions that I thought we could plausibly cover in all this time that you’ve given us. Maybe just a final, very simple question for you: what is your long-term plan personally?

Sam Bankman-Fried: I mean, I don’t know. For the foreseeable future, continuing to build out FTX and starting to build out the FTX Foundation, and hopefully do some cool things with it.

Rob Wiblin: Yeah. What about in five or 10 or 15 years? Could you see yourself having a second career? Or a third career, I guess. Fourth, maybe. I don’t know.

Sam Bankman-Fried: It’s hard for me to know for sure. So much has changed in the last three years for me. It’s hard for me to project out that far, but I definitely think that, at the very least, we’re going to want to get involved in a pretty hands-on way in a number of other projects.

Rob Wiblin: Could you see yourself actually taking a grant yourself and deciding you’re done making money, and you want to take some money and lead on one of these amazing projects yourself? I guess you’ve got a proven track record, so it could even make sense.

Sam Bankman-Fried: Yeah, it wouldn’t shock me. And I certainly would not be surprised if I put some effort on the side into helping one get off the ground. Even in the shorter term, that would not at all shock me.

Rob Wiblin: Yeah. I’d be excited to see that happen. Maybe you can be like Elon Musk, just jumping from project to project. That would be an exciting vision from my point of view.

Rob Wiblin: All right. My guest today has been Sam Bankman-Fried. Thanks so much for coming on The 80,000 Hours Podcast, Sam.

Sam Bankman-Fried: Of course. Thanks for having me.

Rob’s outro [03:18:06]

Rob Wiblin: You can follow Sam on Twitter at SBF_FTX.

And if you’d like to learn more about Sam’s philanthropic plans, what he hopes to fund, and how you could apply for funding for yourself or something you’re working on, you can learn more at ftxfuturefund.org.

If you search for FTX Foundation you’ll get to a different thing and be confused.

But the FTX Future Fund website is super informative, so I recommend taking a look.

While we’re here, this podcast is obviously my favourite thing that 80,000 Hours does, but we also have a number of other really useful services that can help you in other ways.

First off, the 80,000 Hours team produces all sorts of written research to help you understand the world better and see how you can have more impact in your career.

You can find all our written work at 80000hours.org, check out what’s new at 80000hours.org/latest, or alternatively sign up to get email updates about our latest articles every few weeks at 80000hours.org/newsletter.

Second, our job board currently has 995 available vacancies and study opportunities, across all the various problem areas we discuss on this show, and including some for undergraduates as well as people who are already well into their careers.

The board has 249 jobs related to engineering or software engineering, 127 jobs related to building effective altruism, 290 roles for people with more than five years of work experience, and 319 roles related to global health and development.

Oh and there are also far more remote roles than in the past — 363 currently — which may make it easier to find relevant options if you’re not in a major US or UK city.

You can check out those roles and filter them down to find options that are right for you at 80000hours.org/jobs.

Third and last, there’s our advising team, who are speaking one-on-one with more people than ever about how they can have more impact with their work. The service is free of course, and you can find out what our advisors can and can’t do for you and apply to speak with us at 80000hours.org/speak.

All right, The 80,000 Hours Podcast is produced and edited by Keiran Harris.

Audio mastering and technical editing by Ben Cordell.

Full transcripts and an extensive collection of links to learn more are available on our site and put together by Katy Moore.

Thanks for joining, talk to you again soon.

About the show

The 80,000 Hours Podcast features unusually in-depth conversations about the world’s most pressing problems and how you can use your career to solve them. We invite guests pursuing a wide range of career paths — from academics and activists to entrepreneurs and policymakers — to analyse the case for and against working on different issues and which approaches are best for solving them.

The 80,000 Hours Podcast is produced and edited by Keiran Harris. Get in touch with feedback or guest suggestions by emailing [email protected]

What should I listen to first?

We've carefully selected ten episodes we think it could make sense to listen to first, on a separate podcast feed:

Check out 'Effective Altruism: An Introduction'

Subscribe by searching for 80,000 Hours wherever you get podcasts, or click one of the buttons below:

If you're new, see the podcast homepage for ideas on where to start, or browse our full episode archive.