Emergency pod: Elon tries to crash OpenAI’s party (with Rose Chan Loui)
By Robert Wiblin · Published February 12th, 2025
On this page:
- Introduction
- 1 Transcript
- 1.1 Cold open [00:00:00]
- 1.2 Elon's throws a $97.4b bomb [00:01:18]
- 1.3 What was craziest in OpenAI's plan to break free of the nonprofit [00:02:24]
- 1.4 Can OpenAI suddenly change its charitable purpose like that? [00:05:19]
- 1.5 Diving into Elon's big announcement [00:15:16]
- 1.6 Ways OpenAI could try to reject the offer [00:27:21]
- 1.7 Sam Altman slips up [00:35:26]
- 1.8 Will this actually stop things? [00:38:03]
- 1.9 Why does OpenAI even want to change its charitable mission? [00:42:46]
- 1.10 Most likely outcomes and what Rose thinks should happen [00:51:17]
On Monday, February 10, Elon Musk made the OpenAI nonprofit foundation an offer they want to refuse, but might have trouble doing so: $97.4 billion for its stake in the for-profit company, plus the freedom to stick with its current charitable mission.
For a normal company takeover bid, this would already be spicy. But OpenAI’s unique structure — a nonprofit foundation controlling a for-profit corporation — turns the gambit into an audacious attack on the plan OpenAI announced in December to free itself from nonprofit oversight.
As today’s guest Rose Chan Loui — founding executive director of UCLA Law’s Lowell Milken Center for Philanthropy and Nonprofits — explains, OpenAI’s nonprofit board now faces a challenging choice.
The nonprofit has a legal duty to pursue its charitable mission of ensuring that AI benefits all of humanity to the best of its ability. And if Musk’s bid would better accomplish that mission than the for-profit’s proposal — that the nonprofit give up control of the company and change its charitable purpose to the vague and barely related “pursue charitable initiatives in sectors such as health care, education, and science” — then it’s not clear the California or Delaware Attorneys General will, or should, approve the deal.
OpenAI CEO Sam Altman quickly tweeted “no thank you” — but that was probably a legal slipup, as he’s not meant to be involved in such a decision, which has to be made by the nonprofit board ‘at arm’s length’ from the for-profit company Sam himself runs.
The board could raise any number of objections: maybe Musk doesn’t have the money, or the purchase would be blocked on antitrust grounds, seeing as Musk owns another AI company (xAI), or Musk might insist on incompetent board appointments that would interfere with the nonprofit foundation pursuing any goal.
But as Rose and Rob lay out, it’s not clear any of those things is actually true.
In this emergency podcast recorded soon after Elon’s offer, Rose and Rob also cover:
- Why OpenAI wants to change its charitable purpose and whether that’s legally permissible
- On what basis the attorneys general will decide OpenAI’s fate
- The challenges in valuing the nonprofit’s “priceless” position of control
- Whether Musk’s offer will force OpenAI to up their own bid, and whether they could raise the money
- If other tech giants might now jump in with competing offers
- How politics could influence the attorneys general reviewing the deal
- What Rose thinks should actually happen to protect the public interest
Video editing: Simon Monsour
Audio engineering: Ben Cordell, Milo McGuire, Simon Monsour, and Dominic Armstrong
Transcriptions: Katy Moore
Transcript
Cold open [00:00:00]
Rose Chan Loui: But the basic law on this is that you can change purpose for one of four reasons. One is that the purpose has become illegal. Second, that it’s become impossible to fulfil. Third, that it is impracticable. And fourth, that it is wasteful — and by wasteful, it means that you have more assets than you need to fulfil your purpose.
Rob Wiblin: Do any of those apply?
Rose Chan Loui: I think the only one maybe is an argument that it’s impracticable for some reason. I think that’s still a very tough hurdle. I mean, I’d be curious what you think. Is it now impracticable? I guess they would argue something like, “We need the money. And so it’s impracticable because our investors won’t make additional investments without kicking out the nonprofit from its control position.”
But is that their purpose? They seem to me to be conflating their purpose of benefiting humanity with their purpose of developing AGI for the purpose of developing AGI.
Elon throws a $97.4b bomb [00:01:18]
Rob Wiblin: We’re here today with Rose Chan Loui for an emergency podcast. I think “emergency podcast” is a little bit of a contradiction in terms: I feel like if something’s truly an emergency, probably you’re going to announce it some way other than a podcast. But within the world of podcasting, this is an emergency one.
Rose Chan Loui is the founding executive director for the Lowell Milken Center for Philanthropy and Nonprofits at UCLA Law. We last spoke in November for episode #209: Rose Chan Loui on OpenAI’s gambit to ditch its nonprofit for good. And we’ve had some updates since then, and indeed some big updates last night. So we wanted to get on the line and get some legal interpretation of them. How are you doing, Rose?
Rose Chan Loui: I’m doing great. Good afternoon.
Rob Wiblin: Have you had a chance to digest the big announcement from Elon?
Rose Chan Loui: Well, the bits that we got, yes, I think — just the basic terms of his offer price of $97.4 billion as well as his statement that he wants the nonprofit to revert to its original purpose.
What was craziest in OpenAI’s plan to break free of the nonprofit [00:02:24]
Rob Wiblin: Yeah, we’ll get to Elon’s bombshell in just a second. I actually wanted to back up. There was some other crazy news that we got, I think around Christmas, which I guess Christmas is whenever you post the announcements that you’re most proud of as an organisation.
But OpenAI announced their plan: they gave some explanation of how they’re planning to do this conversion, or how they’re going to streamline their operations so that the for-profit is free of its nonprofit controller.
I guess for the purpose of this conversation, we’re going to assume that people have some reasonable idea of what we’re talking about, that probably they’ve listened to the interview that we did in November or otherwise they’re up to speed.
What was the update that we got from OpenAI around Christmas?
Rose Chan Loui: Yeah, I think from the nonprofit law perspective, the important part was that this nonprofit would become essentially a grantmaking foundation, so it would support such charitable initiatives as health, education, and science. And it would get stock in the new OpenAI — and the new OpenAI would become what’s called a public benefit corporation under Delaware law.
So you’re right, Rob. I feel like we’ve known this for a while, but we did not know this in November. So just for some level setting, a public benefit corporation in Delaware is very different from a public charity. It’s purely aspirational. So it will try to develop artificial general intelligence for the benefit of humanity, but that’s not the same as a legal commitment that you make as a public charity. So I think that’s the biggest difference.
Rob Wiblin: So that’s what the company would become?
Rose Chan Loui: That’s what the company would become, correct.
Rob Wiblin: And the nonprofit, the current foundation: it would own some equity in the company, but it wouldn’t have any control over it.
Rose Chan Loui: Well, it will get stock, but it will not have control anymore. And so a couple of questions. We don’t know exactly how much stock and what kind of say they’ll have, whether they’ll get a board seat, perhaps. All big questions.
The other question to me is: if it’s a grantmaking foundation but they’re getting stock, can they sell the stock so they can make grants? And they did not give us a number, although I think we were all operating off some articles that said the nonprofit would get something like $30–40 billion.
And that, I think, brings us up to the latest bombshell.
Can OpenAI suddenly change its charitable purpose like that? [00:05:19]
Rob Wiblin: Well, I just wanted to pause for a minute, because I thought the announcement was crazy in what it said the new foundation would be. In our conversation we had been assuming that the new foundation would be focused in some way on AI: AI safety, AI governance, promotion of AI, anything. And then, what was the wording? To pursue charitable initiatives in sectors such as healthcare, education, and science — and I can’t think of anything more vague than that. But no mention of the original purpose.
Rose Chan Loui: Right. Yeah, we should maybe review what they state is its purpose. It’s stated in their certificate of incorporation, which is still the current one, that it is to ensure that artificial general intelligence benefits all of humanity, including by conducting research and/or funding artificial intelligence research. So that’s a very specific purpose.
Rob Wiblin: It’s a pretty specific purpose. And it doesn’t seem like it’s the same purpose as funding charitable initiatives in sectors such as healthcare, education, and science. Or I mean maybe you’d be packing a lot into the word science to make it the same.
Rose Chan Loui: Right. So very different. And so I think the question, before even we get to the latest development, our question with that announcement from a legal perspective is do they meet the requirements for changing purpose? So is it OK if I go into the law there?
Rob Wiblin: Please, absolutely. Because, you know, you would think if you donate money to a charity, it can’t just go from saying, well, we were a healthcare charity; now we’re a sports charity. We were a sports charity; now we focus on repairing roads. That’s not really how this is meant to work.
Rose Chan Loui: Certainly not with the charitable assets you already have today. Right. It’s different to go forward and say, “We’re going to stop doing this.”
But the basic law on this is that you can change purpose for one of four reasons. One is that the purpose has become illegal. Second, that it’s become impossible to fulfil. Third, that it is impracticable. And fourth, that it is wasteful. And by wasteful, it means that you have more assets than you need to fulfil your purpose, so you can redirect it to other charitable purposes.
Rob Wiblin: Do any of those apply?
Rose Chan Loui: I think the only one maybe is an argument that it’s impracticable for some reason. I think that’s still a very tough hurdle. I mean, I’d be curious what you think. Is it now impracticable?
I guess they would argue something like, “We need the money. And so it’s impracticable because our investors won’t make additional investments without kicking out the nonprofit from its control position.”
But is that their purpose? They seem to me to be conflating their purpose of benefiting humanity with their purpose of developing AGI for the purpose of developing AGI. In other words, there’s nowhere in the certificate that it says, “We are an AGI development company” — because that’s not enough to become a tax-exempt nonprofit organisation. I mean, everybody else is doing that and trying to win the race to AGI.
So they seem to me to be conflating those two goals and saying, “We’re fulfilling our charitable purpose by being the first to get to AGI, so the fact that we need to keep raising money justifies this change of structure.”
Rob Wiblin: So I didn’t totally follow that. Probably my fault. So the original thing is all about benefiting humanity by developing AGI safely. And you’re saying they have to argue that’s impractical now. So they haven’t said on what grounds they’re giving up their original purpose, so we’re left to speculate as to what argument they might make.
Rose Chan Loui: I don’t even know if they’re recognising that that’s a hurdle that they need to be thinking about.
Rob Wiblin: Yeah. And you’re like, the best you can come up with is for them to say it’s not possible for them to continue doing what they’re doing to try to positively shape the development of AGI, because with them as the nonprofit controller, the company won’t be able to attract enough investment, and so it would be rendered irrelevant. And so now there’s this kind of internal contradiction in how they’re trying to go about their mission, and so they should do something else.
But then it’s a big jump to say they should then go and run healthcare stuff. Wouldn’t you then say, well, they should make the minimal possible change to the purpose and go and do AI-focused stuff, like try to accomplish the mission of making AGI go well? Which is what we kind of assumed they would say they were going to do.
Rose Chan Loui: Right. Perfect. So, yes: step two is, if you can show it’s impracticable, then you should be writing a new purpose that approximates the old one as much as possible. And I totally agree with you: the foundation idea, while it can do a lot of good, is not approximating the original purpose. There is that little science part.
Rob Wiblin: I guess everything is science if you’re willing to be loose enough.
Rose Chan Loui: Right, right.
Rob Wiblin: I mean, you were also saying, even if they could say it’s not practical for OpenAI, the company, to win the race to AGI with this nonprofit controller, and that’s why the nonprofit has to give up control of it, you’d say, well, maybe you shouldn’t have set it up that way then. If this wasn’t the way to win this difficult commercial race, that doesn’t mean you can get out. Saying that you screwed up doesn’t necessarily mean you can get out of all of your commitments that way.
Rose Chan Loui: Right. So now it sounds like, oh, wait, we actually realised we could make a lot of money here, so now we want to own it. That’s a very big pivot. So where is the public’s interest in that move?
Rob Wiblin: I can also believe that: I’m sure it’s an additional challenge to win the race to AGI with a nonprofit controlling you. But the profit cap was still 100x. That’s still a substantial amount of profit that investors can make putting money into OpenAI. And I think they’re managing to raise significant capital, if not maybe all of the capital that they want. They may be raising tens of billions, and they want hundreds of billions. It’s not obvious that it’s impossible for them to pursue their original mission. Maybe they face some challenges, as we all do in life, but I wouldn’t just say it’s so impracticable that they definitely have to give up.
Rose Chan Loui: It also depends how you interpret that purpose or that mission. Because, like Helen Toner said at one point, it could be that fulfilling the mission means not making all that money.
But I think in terms of bringing people up to speed, the other thing that came out after their post-Christmas post is that they want to remove the cap, because investors want the cap removed. So now we’re not even talking about 100 times investment — we’re talking no cap going forward. So that’s a big thing too, because they had boasted a lot about that and how that will protect the nonprofit.
Rob Wiblin: Yeah. Another crazy thing I want to point out: yesterday Sam Altman wrote on his blog about all of the benefits that can come from AGI and his vision for the future. I haven’t read the full post, but I’ve read some quotes from it, and one of them was, “Anyone in 2035 should be able to marshal the intellectual capacity equivalent to everyone in 2025. Everyone should have access to unlimited genius to direct however they can imagine.”
I just wonder if that is your vision for the future, why do we really need a nonprofit to pursue charitable initiatives in areas such as healthcare, education, and science if everyone can just open their phone and have access to literally the intellectual capacity of the entirety of current human civilisation in their pocket? What is the education part of this for? What are these people going to be doing? I guess teaching them how to use their AI assistants? I don’t know. It’s just such a clash of worldviews here.
Rose Chan Loui: Yeah, I will confess that I don’t know what to imagine. I don’t know the capacity of what we’re developing. Which I think brings up why it’s so important to have a nonprofit in the position OpenAI’s is in. I think we’ve talked about this before: its position within the company is in a way priceless. And just becoming a watchdog, that could be the best way to approximate it. But it’s definitely not the same.
Rob Wiblin: It’s a step back from being at the real forefront of the action. And you’re going from controlling things to merely trying to influence things, which is more challenging.
Rose Chan Loui: And also I was just going to mention, I think it was interesting that his first tweet was, “I say no.” You know, basically “I, Sam Altman say no.” And immediately I thought, well, it’s not up to you. Remember, it’s the nonprofit board. And then his communications after that did mention the board, and said “our board will not say yes to this.”
Diving into Elon’s big announcement [00:15:16]
Rob Wiblin: So let’s come back to that. So the big news last night was Elon Musk saying, well, you’re suggesting that you want to sell or compensate the nonprofit foundation for its stake in the business to the tune of something like $40 billion. Well, I’ll buy that stake. Me and my coinvestors will offer to buy the stake for $97.4 billion. That’s what we think is a fair valuation.
Can you explain the implications that this has, if any?
Rose Chan Loui: Well, I think because they have opened the market, in terms of saying that they’re going to do this restructure and they’re going to compensate the nonprofit for its interest in OpenAI, they do have to consider unsolicited bids. They don’t have to accept this one, but they’re going to have to go through some analysis of why they don’t want to accept his offer.
So money is definitely one thing, but they also, I think, can consider the effect on charitable purpose, and then whether his offer is credible. And then finally, what’s interesting, if we read his or his lawyers’ statements accurately, is that he’s assuming, or he’s seeing, a world where the nonprofit continues, because he says it will revert back to its original purpose.
Which is something that the judge brought up actually in the hearing. He said, “Why can’t this nonprofit continue? You guys just keep raising money here.” And that’s when they start talking about, “Well, because we need to raise capital and the investors won’t do it without the nonprofit going away from its control position.”
But I think they need to consider what’s best for the nonprofit and its purpose. And so throwing that in makes it interesting. I think the most obvious effect is that it sets kind of a floor for valuation purposes: they will first have to show why that’s not fair market value, because originally I think they talked in the $30–40 billion range. At least those are the numbers we were hearing. So I think they have to look at the amount and say, well, we don’t agree because of X, Y, Z.
The other interesting thing is that the valuation is potentially even higher than the $156 billion from when we talked in November, because SoftBank wants to invest, and their valuation for that deal, if it happens, is $260 billion.
Rob Wiblin: For the whole company.
Rose Chan Loui: For the whole company, yeah. And $97.4 billion might be good against a $156 billion valuation, but I don’t know if it’s high enough against a $260 billion valuation. So it definitely needs to be looked at. But they do need to look at different factors, the first probably being Elon’s credibility. He’s made all kinds of offers. But he does have investors who he says have the money.
Rob Wiblin: So the underlying legal issue is that the nonprofit foundation has to show that the path that they’re taking is in the best interests of the mission of the nonprofit foundation. And they have to argue that changing their mission and getting $40 billion is better than maybe getting more like $100 billion and keeping the same mission that they currently have. And it’s better at accomplishing the original mission to do the first thing than the latter.
Now, remind us, who do they have to demonstrate this to? Because you were talking about court hearings, I thought it was the California Attorney General and the Delaware Attorney General — I guess OpenAI is trying to convince both of them to sign off ahead of time, saying, yeah, we think this is fair and reasonable.
Rose Chan Loui: That’s correct. Both attorneys general have publicly stated that they are looking at the transaction.
So this might get a little technical, but basically the Delaware Attorney General has authority because OpenAI was established in the state of Delaware. So the attorney general there has authority over purpose, that it’s following the stated purpose on its certificate of incorporation, and that they are governing themselves properly. So they will look at whether the nonprofit board is following their fiduciary duties and whether the process for valuation and such is fair to the nonprofit.
And then California’s authority comes from the fact that they are operating in the state of California. And when you establish operations in the state of California, as soon as you have assets, you have to register with the Attorney General’s Department of Charities. And so they have authority over your charitable assets and whether you are using them for the stated purpose again, and however you’ve been marketing yourself here in the state of California. And they’ve clearly, as you know, all over their website, even now say, “our purpose is to benefit humanity.”
I guess that’s the argument, really, Rob. They probably interpret it a little differently than we do, because I think they do think that AGI will benefit humanity. So that’s probably the best way for me to understand what they’re saying. They’re basically saying: if we win the race to AGI, we will, as a matter of fact, benefit humanity. And some people will disagree, because they think that there need to be guardrails on the way to AGI.
Rob Wiblin: Yeah. So that’s the way that they could make the argument is to say, well, all of this stuff about safety or having it go well, or being more measured in how we approached it, or designing it in such a way that it benefits humanity rather than harms it, all of that, let’s set that aside. That doesn’t matter. All that matters is getting AGI as quickly as possible. And the best way to do that is for the nonprofit to get out of the way and for us to just take hundreds of billions of dollars of investment, build the big compute centre to do it. Yeah, interesting.
Rose Chan Loui: I think that’s it. Yeah. I think that’s what they’d have to argue, and they would say it’s impracticable because we can’t achieve that goal if the nonprofit is in the way. I think that’s the best case for them, in terms of this restructure.
Rob Wiblin: There’s a degree of irony here, where they’re feverishly trying to do all this stuff so quickly because they think if they don’t do it, someone else will do it very soon afterwards. So they’re not even required for this to happen, because it’s inevitable: if it’s not them, it’ll be Anthropic or it’ll be Google.
Rose Chan Loui: So I was just going to add that I also keep remembering that on their website, they do say that if someone else is doing it better, they will stop and support the other company.
Rob Wiblin: Yeah. Someone I was reading recently said, well, really, this stop-and-merge or stop-and-cooperate clause was meant for a situation where, I can’t remember the exact conditions, but it’s something like: if AGI is imminent (which they definitely say they think it is, or at least that’s what Altman says, and I guess what all the CEOs of these companies say), and someone else is doing it well, and we don’t want to create too much of a race dynamic, then we will stop racing with them and we will have a merger and try to do things in a more cooperative and less hasty way.
Rose Chan Loui: Right. Work together or something. Yeah. So I don’t know where that plays in here, because it does seem like the focus is on getting to AGI first.
Rob Wiblin: OK, so the attorneys general, they might write a letter to OpenAI saying, well, we saw there was this person who’s willing to pay a whole lot more money and keep the original charitable objectives. So it seems like it’s now less clear that the plan you’re putting forward actually is the best thing for the foundation to do. So maybe you need to change your plan or respond to this some way or another.
And you were saying they could offer various defences. They might say it’s not a serious offer. Maybe like Elon’s making up that he has the money and he wouldn’t actually pony up if we said yes. I guess they could say maybe there’s antitrust issues because Elon also is involved in his own AI company. There’s like, Grok, is that it? Or like xAI, I think, is the company.
Rose Chan Loui: Yeah. Actually, I don’t actually know what Grok is, but I know xAI is the competing artificial intelligence company.
Rob Wiblin: I think Grok is the name of the AI model. It’s like Claude.
Rose Chan Loui: OK, so that’s what they’ve developed. Yeah. So there are antitrust issues raised by Elon Musk in his litigation, and those are based on the relationship between Microsoft and OpenAI. I mean, I think there should still be antitrust issues here, unless they think xAI is not that big a company yet.
Rob Wiblin: Yeah, I think it’s not quite as big a player as the others, but I guess the argument would be, well, we couldn’t have Elon buy such a large stake in OpenAI because then he would own two of the biggest AI companies. And this would reduce the number of AI companies, like the major ones, from four to three, or maybe five to four. And so the Department of Justice would object.
Rose Chan Loui: I think they would face some antitrust issues. They just might be different from what they’re arguing in the litigation focused on Microsoft, because they’re really arguing that they’ve been working in cahoots for all this time and causing unfair competition. But, yeah, I would think that there would be issues there.
Rob Wiblin: I guess, realistically, under the current administration, it doesn’t seem like it’s going to be taking a very aggressive antitrust stance. I mean, they’re quite the opposite. And I guess Elon seems to half run the government, so I don’t know whether the Department of Justice —
Rose Chan Loui: Yeah, I don’t know which way the administration would want to fall in this case, because I think generally speaking, they’re not interested in antitrust issues in blocking mergers and such. But it is Elon. So who knows.
Rob Wiblin: If this was a TV show, you’d just be like, “Wait, they crammed all of these things into one character? He’s the guy doing the government stuff and he’s also doing the cars as well. And now he’s the one suing about that? The writers have really gone to town!”
Rose Chan Loui: Yeah. I mean, in a normal world, I think there are ethical issues here. That’s a whole other discussion on the law there. But it’s also unclear to me… He has a government position, I think. Or does he? I think we’d have to look, because it’s a position that was not even in existence, a whole department that was not in existence. I have no idea if he gets paid. Is he a private citizen or is he now a government employee?
Rob Wiblin: I think let’s set DOGE aside. That’s maybe beyond our remit for this evening.
Rose Chan Loui: Yeah, that’s a different rabbit hole.
Ways OpenAI could try to reject the offer [00:27:21]
Rob Wiblin: Yeah. I asked for questions from the audience on Twitter — Twitter, also owned by Elon Musk — and someone said, “How strong would the defence be? Yes, this is a higher amount of money, but it would violate our duty to the mission because we think this acquisition would be bad for the mission.” I guess they can’t argue that, because the original foundation would continue in this situation with its original mission. And I think Elon wouldn’t control the nonprofit, because a nonprofit can’t have an owner like that.
Rose Chan Loui: So it’s interesting because the way that these transfers happen in the nonprofit world, at least from nonprofit to nonprofit, is that the board changes.
So for example, let’s say you’re a nonprofit company. You’re on the board, you need capital badly, but you don’t feel you’re capable of raising the kind of capital you need to serve your nonprofit purpose. And so then you’d look for an affiliation with another nonprofit, say. And what would happen is the other nonprofit would say, “We will infuse this much cash, and we will essentially take over your nonprofit operations.” But then they negotiate over whether the old nonprofit and its board continues to be in control, or gives up its control in order to get that infusion of cash.
So everybody’s still focused on purpose. That’s a typical thing: you still want your purpose to continue, but you’re deciding that you’re not the best people to do that. And so in that case, you accept a deal where they’ll take it over and you give up control. And the way you give up control is you give up your board seats, or maybe you hold a couple of seats, but you’re in the minority.
So to analogise here, the way I could see — again, we don’t know the details yet — but what I could see happening is Elon saying, “Nonprofit, I will give you $100 billion, but I want X number of seats on this board.” So when he says, I want control, I want the nonprofit control, then he would want to be able to control more than 50% of the seats on that board.
And so the questions I’d be asking then is that the best thing for the nonprofit? Will there be terms about who he appoints to the nonprofit board? Right now, the nonprofit board is supposed to be made up of independent board members. They define independent as not having an equity interest in OpenAI, which to me is kind of limited because at least some of those people historically have had interests. Like Sam has had interest through partners who do business with OpenAI. So until now, he’s not had an interest in OpenAI itself.
So the board would have to ask Elon: what are you proposing here? How is this better for the nonprofit? But that’s how they would control it. That’s how he would take over, as part of the deal. At least that’s one way; he may have other ways. But that’s sort of the typical thing: you get to name the board.
Rob Wiblin: OK, so the argument with the attorneys general might be OpenAI Foundation will come back and say, well, we want to reject this because we think Elon is going to put these foolish cronies or something onto the board. And then maybe it will have the same stated mission, but it won’t actually pursue the mission, or it will pursue it incompetently because of these foolish appointments.
And I guess Elon could push back and say, well, no, I suggest person A, B, and C, all of these very credible, reasonable people. How can you object when these people are no worse than the people you currently have? And that could end up being the dispute back and forth.
Rose Chan Loui: Yeah. What if he suggests the top AI experts in the country, who have no economic interest at all, but really are interested in developing AGI safely?
Rob Wiblin: Geoffrey Hinton or something. He won a Nobel Prize. He’s one of the leaders in this field for decades. Why not? How can you object to that?
Rose Chan Loui: Right. Yes. So those are the kinds of things that haven’t been talked about yet. Or we don’t know what he’s thinking, but those are the kind of questions that the nonprofit board could ask in terms of deciding whether his offer could be in the best interest of the nonprofit and its original purpose.
Rob Wiblin: And its original purpose. Does this go before any judges? Are there any judges in the picture here? Because I’m wondering, someone had a question, can a judge dismiss the offer for being unserious or not credible? But I guess it sounds like it’s not judges, it’s the attorneys general. And yes, they could decide that it’s not credible if they just thought it was ridiculous, but maybe they’re going to have a hard time justifying that.
Rose Chan Loui: Well, I guess Elon could add this to his suit. He’s already suing on antitrust grounds. He’s suing as a donor, a previous donor, I think; I’m not sure. His original suit was as a previous board member, but I think legal standing was shaky on that ground. I think he’s suing on fraud grounds.
So he could possibly, if this is the newest thing and they reject it, he could perhaps sue that the board is not following its fiduciary duties and just add that on as a cause of action if the judge will let him add it on. Otherwise he has to bring another separate suit.
Rob Wiblin: And do you have any thoughts on how likely it is that the attorneys general would say, “We don’t think this is a serious offer. We don’t think this is a big deal. Carry on”?
Rose Chan Loui: I think it depends on what he produces to substantiate his offer. I mean, I think most people think it’s a game of chess for him, that this makes them come up to that $100 billion level so that if they succeed, it won’t be on the cheap.
Rob Wiblin: I see. And I guess the evidence he has to offer is where the money is coming from. So he would have to say, these people want to finance it. And if there’s signatures on the dotted line from all of these banks, then it’s a bit hard to say it’s not credible.
Rose Chan Loui: But Rob, I think he might throw the question back at OpenAI: “Where are you going to get the money?” Because remember, they’re madly raising money to do the AGI development. And so far in their blog post, they only mention stock as the compensation — stock in the new for-profit.
Rob Wiblin: And did he say whether he was offering cash?
Rose Chan Loui: He did not. He just said $97.4 billion is his valuation. So I think the attorneys general and the nonprofit board will have to compare who’s offering what. You know, not just dollars, but what?
Sam Altman slips up [00:35:26]
Rob Wiblin: Here’s another amazing bit that you hinted at earlier, but let’s dive into it. So Elon said this, and then Sam Altman came back on Twitter saying, like, “No way, we’re not going to accept this.” Actually, I think he said he’d buy Twitter for $9.74 billion. So I guess he’s trying to have fun with it.
But wait a minute. Wasn’t all of this meant to be decided by the nonprofit board, which has to be at arm’s length from Sam Altman, who has this very conflicted position given that he is CEO of the for-profit company? I thought that Sam Altman was not meant to be involved in decisions or not meant to be certainly deciding himself what the nonprofit foundation would or wouldn’t accept. So how can he possibly say that the offer wouldn’t be accepted? That’s not really up to him.
Rose Chan Loui: It’s not up to him. And so I think he slipped with the first tweet back. Couldn’t help it, said “I say no.” But I did notice that additional communications mentioned that “the board” is not going to say yes. But has he actually met with the board and discussed this with them?
So it’s not good from an appearance standpoint: it certainly does seem like he’s still running it, not the board. Even with the transaction (what are they calling it? Stargate), we wonder whether the board has actually considered whether that makes sense for the nonprofit. Does it make sense to enter into that? And again, are they following charitable purpose in promising to make a huge commitment to that effort? I don’t know, there might be an argument. But the question is: has that actually gone through the nonprofit board?
Rob Wiblin: Yeah, I haven’t been able to follow the Stargate thing. There’s just been too much going on recently. So is it that OpenAI, the for-profit, is going to invest money into this Stargate effort?
Rose Chan Loui: This new hundreds of billions of dollars effort. It’s to set up computing centres everywhere.
Rob Wiblin: And they’re contributing and so are other organisations? I guess I should just go look this up.
Rose Chan Loui: Yeah, yeah. That’s about the extent of my knowledge. I just know a lot of money is also needed for that, and a number of companies have shown interest in investing in that.
Will this actually stop things? [00:38:03]
Rob Wiblin: A listener on Twitter — Dwarkesh Patel, actually — said, “I’d be curious to get Rose’s actual probability that the for-profit conversion gets derailed, not just the explanation for why it should or shouldn’t.” Would you venture a probability that this throws a significant spanner in the works?
Rose Chan Loui: I think it’s going to be delayed. I think it’s going to be delayed because it’s going to take the attorneys general time to go through all of these documents. When we first started looking at this, we were thinking, no one’s going to look at this. It’s just going to sail through. But it’s gotten all that attention and more. I do think it will get delayed. I think something will probably happen in the end. Whether or not I’d be happy with it is a different question than whether or not it goes through.
And again, politics will play a part in this. You know, the attorneys in charge, those who are looking at the nonprofit angle, will make their recommendation to the AG, but the AG may decide on totally different grounds. They may just say, “Look, we want OpenAI here in California. We don’t want to hurt all the employees who have an interest in the for-profit, and we’re going to negotiate something where we…” I mean, they’ve said they want to take care of the nonprofit, but what does that mean in the end?
Rob Wiblin: So the reason this is relevant is that the attorney general is an elected position in both California and Delaware, is that right?
Rose Chan Loui: It is. So it is in California. I don’t know about Delaware, actually.
Rob Wiblin: Yeah. So even though their legal advisors might say, oh, the key issues are X, the attorney general, because it’s kind of a political position, could be leaned on by all kinds of actors. They might decide in OpenAI’s favour or against OpenAI on the basis of other reasons.
Rose Chan Loui: Other reasons. Or even if it’s not political, they might just say, look, they employ all these people, it’s good for the economy of California. They might weigh it against that.
Rob Wiblin: I’m just looking at when the attorney general is next elected. The next election is in November 2026. So it’s a little while away.
Rose Chan Loui: They are racing against time though. They have a two-year period to get this done, and I don’t know when it started running.
Rob Wiblin: You’re talking there about how OpenAI, the company took a whole lot of investment that they have to suddenly repay if they don’t complete this nonprofit conversion.
Rose Chan Loui: Yeah, $6 billion is at stake for them that they have to pay back.
Rob Wiblin: So that’s an amount that they probably could repay, at least if the business is going well. It’s not necessarily the end of the line for them.
Rose Chan Loui: Yeah. Yeah.
Rob Wiblin: But they would like to get it in under that two years, I’m sure. And I guess Elon, given his antipathy towards them, would probably love to drag it out past that two-year deadline.
Rose Chan Loui: Right, right. Yes. I mean, the litigation itself: what were they talking about? The judge started by asking all these tough questions of Elon’s attorney about what kind of timing he saw, about when it could go to trial, and things were definitely going to be dragging into 2026. So it could be very close.
Rob Wiblin: That’s really interesting. Oh, god. I’m just imagining the election for the attorney general. One candidate funded by Elon Musk and the other one funded by OpenAI, trying to compete to get that person in to approve it. You’d say it’s too crazy. But I don’t know about this timeline that we’re in.
Someone told me that the Delaware Attorney General was taking a bit of a harder line with them. I can’t remember the details here, but does that sound right?
Rose Chan Loui: Certainly taking a harder line than we would normally see from Delaware, because they’re generally very laissez-faire about corporate matters. And so it was surprising. They weighed in: they filed an amicus brief in the Elon Musk litigation against OpenAI, basically telling the court, we’ve got this, we are looking at this, we’re going to protect the public’s interests, we’re going to make them follow their stated purpose, etc. So they’re saying all the right things, at least from our perspective on charitable purpose and such. But again, in the end, how hardline will they be?
Why does OpenAI even want to change its charitable mission? [00:42:46]
Rob Wiblin: Yeah, you don’t know until it actually happens. So OpenAI Foundation has said we plan to change our mission to something much vaguer about healthcare, education, and science.
Why would they do that, when it seems like that would make things legally much more dicey and difficult for them, because they have to argue that they’re better accomplishing their original mission by pursuing a new, different goal? It slightly reminds me of the AI alignment problem in some ways, actually. Normally, if you’re an agent trying to accomplish a goal, you don’t want your goal to be changed, because then you’ll accomplish it worse. So isn’t this just creating new legal problems for them?
Rose Chan Loui: What occurs to me is that they’re following what was done with the hospital conversions in the ’90s. All these nonprofit hospitals and insurance companies like Blue Cross became for-profits — and in exchange for allowing those conversions, the attorneys general of the various states required that they set up foundations. So the charitable assets that were dedicated to medical or health purposes got put into these grantmaking foundations. A lot of our biggest foundations are from those conversions, and they continue to do good in a very different way.
And at least in California, I believe it required some specific legislation. And I don’t know what all these different certificates of incorporation actually said, because often certificates of incorporation will state a specific purpose but then they’ll say “and including any charitable purpose that is allowed by law” or something. So there’s this big catchall — which is not in theirs.
So it’s kind of interesting. A lot of tax-exempt nonprofits have that catchall general provision, so it’s easier to change purpose. But they did not include that language here, which is very interesting. And I think we’ve talked about this before, Rob. I’m purely guessing, but I think that when they added in the big for-profit investors, that’s when they changed their purpose to be this strict.
And from the emails we’ve seen that have started to come out, it looked like their concern was hiring the best and the brightest. Their strategy for that seemed to be dependent on the fact that they could boast that we’re nonprofit, we care a lot about doing this the right way. So join us.
Rob Wiblin: Yeah. So you think that was a big part of why they did the nonprofit thing in the first place?
Rose Chan Loui: Right. And now it’s like, oh, we don’t need that anymore. And now it’s making a lot of money, so we all want to share in this and we don’t want the nonprofit to hold us back.
Rob Wiblin: Yeah. So I guess originally they were like, we have to be a nonprofit to get the very best people, because they’re really motivated by benefiting humanity. And now it’s like, we need the cash from Dubai, and those investors don’t give a damn about the nonprofit, so we’ve got to get rid of it. Is there an update on how other actors feel about this? I feel like we haven’t had any statements from individual members of the board or any leaks like that. Things are pretty tight-lipped.
Rose Chan Loui: No. Well, there have been groups speaking out. Public Counsel has been writing to the attorneys general, certainly to California’s. There’s a group of foundations who have written to the attorney general in California: some of them, not all of them, were the product of these hospital or health insurance conversions, et cetera. They’re not specific about what they want. They state the law very well, I think, but I think they’re concerned that, if this goes through, the resulting foundation gets a fair shake: that they really do get fair market value for giving up their controlling interest in OpenAI.
Then the other group that has weighed in is a nonprofit watchdog-type organisation that has also written an amicus brief in the litigation.
Rob Wiblin: Which side are they on?
Rose Chan Loui: They’re on the side of stop this restructure. I mean, I think they’re very much intentionally not wanting to negotiate. They’re not concerned just about fair market value compensation. They’re concerned about removing the nonprofit from its controlling position.
Rob Wiblin: So this watchdog: they’re against the change, I guess, because it’s just setting a bad precedent for nonprofit law in general? We think now other people will just try taking their charity and turning it into a company?
Rose Chan Loui: Not really. Their argument is that we need this nonprofit all the more. This purpose has not become impracticable or unneeded in any way; if anything, we need it more. And, as we’ve talked about before, being inside this company is a priceless position to be in. So they’re just against the restructuring for that reason.
Rob Wiblin: That’s really interesting. So what’s the name of this nonprofit watchdog? It’s really nice, and really interesting, that they’re taking kind of the line that I was suggesting might be justified last time.
Rose Chan Loui: They’re really drawing a line in the sand, or whatever. It’s called Encode. Yes. Some very young AI people started it, and they’ve been involved in a lot of proposed AI legislation.
Rob Wiblin: Yeah, we’ll stick up a link to that.
Rose Chan Loui: They really care about the AI side of it, the development of AI in a safe way that benefits humanity.
Rob Wiblin: It sounded earlier like you might be saying the reason that they’re trying to change the language to something vaguer, like to pursue charitable initiatives in sectors such as healthcare, education, and science, was that they wanted to make this conversion pattern match to conversions that occurred earlier with healthcare companies that spun off nonprofits. And maybe there was some legislation that included healthcare in it, and they’re trying to sandwich this into something that fits with that legislation?
Rose Chan Loui: I think so. I think they’re thinking, here’s precedent. Because whenever anyone asks me, has this been done before? Our answer is yes, with the hospitals. And so I can imagine that’s what happened. They’re like, OK, this has been done. We’ll follow this pattern. So that should justify this and this should be allowed. That’s how I would imagine this came to be.
And they were allowed to change purpose. They specifically used to operate either insurance companies or hospitals. And then they were allowed to become a grantmaking foundation. So there is precedent there. I think the question is, the devil’s in the details: what were the purposes there? How were they written? And you know, this one’s a very specific purpose, like I keep saying. And the resulting foundations did have to focus on health.
Rob Wiblin: Yeah, that makes more sense if you’re going from a hospital to focusing on healthcare. Yeah, this feels like a bigger leap. So I guess it’s possible they could change the strategy if it doesn’t seem to be panning out for them.
Most likely outcomes and what Rose thinks should happen [00:51:17]
Rob Wiblin: If I had to sum up what it sounds like your expectations are: if Elon is able to show that the money probably is actually available and the offer is credible, it could well delay things and lead to more scrutiny from the attorneys general. Probably OpenAI won’t be forced to accept it, but rather maybe they will up their own offer and make various arguments about why Elon isn’t a good counterparty: that they think his board appointments wouldn’t be very good, maybe something about antitrust as well. They’ll throw back every legal argument they can about why they shouldn’t accept the offer. Maybe they’ll have to up the amount of money.
I guess that could be challenging for OpenAI because you’re saying they’re finding it hard to raise money. So maybe they won’t be able to raise the necessary $97 billion to compete with what Elon’s putting forward.
Rose Chan Loui: I guess the way I would sum it up is that I break it down into three things.
One is that, if this were to go through, the resulting nonprofit needs to receive fair market value, which we’ve discussed a lot before. And that’s not an easy thing, actually, because I don’t think it’s that easy to pin down a specific value. I mean, the valuations are definitely a starting point, but the loss of control, like we’ve talked about, is in some ways priceless. But as part of that compensation, they also need to ask for sufficient liquid resources so they can actually operate — because just holding on to stock, if it’s not liquid, is not going to do them any good.
Second, if I could make a wish with my Aladdin’s lamp, I would want to be sure that the new board — or the continuing board, if they don’t change it — is free of oversight by the new [for-profit]. Otherwise it’s just a typical corporate foundation, where oftentimes it gets used to further the marketing purposes of the for-profit. Not entirely, right, but a lot of the time it’s about who’s coming to ask for funds: “Oh, it’s one of our customers. So yes, we’ll donate to your favourite charity.”
And then I think it also means that board should not have more than 50% with economic interests in OpenAI or its partners.
Rob Wiblin: Because you’re saying it does business with so many other companies and you can have a financial interest in the success of OpenAI without necessarily owning OpenAI itself.
Rose Chan Loui: Right. I think it’s a big loophole right now.
And then third, kind of to our point that you can’t just dump your whole original purpose: I think the new activities should still include some kind of monitoring, evaluating, and maybe reporting on the operations of OpenAI — and not just OpenAI, but other companies — so it continues some kind of watchdog function, and so that it is trying to approximate the purpose. Again, I don’t think it’s the same at all. But it needs to be in there somewhere.
And to do all of that, to be actually an influencer, it needs to have enough capital to make the kinds of grants that will move the needle on AI development.
Rob Wiblin: Yeah, I think that’s a good place to end. But I do want to add one more thing: a crazy scenario that I saw someone spell out. You can imagine that maybe Elon doesn’t actually have the money, so his bid is kind of rejected. But then, if you’re Google DeepMind, why don’t you chime in and say, “Well, we’ll offer this amount,” with the precedent kind of set?
Rose Chan Loui: Yeah. And I keep thinking that, because there are some huge nonprofits out there that really do care about the safe development of AGI, one of them could swoop in and save the day for us. But if you find one, let me know.
Rob Wiblin: I think we might have to be looking at one of the world’s biggest companies to raise this kind of money. Maybe the Ikea foundation can chime in and decide to completely change their mission and just buy OpenAI. But, yeah, I think Google DeepMind might be the better bet.
Rose Chan Loui: I mean, I think to the listener’s point, I think we’ve talked about what we think should happen, but it’ll probably fall somewhere in the middle. I mean, I’m just hoping that the AGs will be able to negotiate well for this new nonprofit. And maybe make that purpose a little more specific.
Rob Wiblin: Just a little. Yeah, I think that was a fantastic list of policy recommendations. I hope you put that in a letter and send it to the attorneys general. I suspect this may not be the last time we talk about this on the show.
Rose Chan Loui: We’ll wait and see what’s next. Yeah, it’d be interesting if someone else comes in, because it’s really been Elon Musk so far, just keeping this in the conversation. Let’s see what else he comes up with.
Rob Wiblin: Yeah. Look forward to talking about the next twists and turns. Thanks so much for making time to do this emergency, impromptu, unprepared podcast.
Rose Chan Loui: I know. Well, I’m sure we’ll hear more, and we’ll see if he really sticks with this.
Rob Wiblin: Yeah. Enjoy the rest of your day.
Rose Chan Loui: Thank you, Rob.