Emergency pod: Did OpenAI give up, or is this just a new trap? (with Rose Chan Loui)
By Robert Wiblin · Published May 8th, 2025
On this page:
- Introduction
- Articles, books, and other media discussed in the show
- Transcript
  - Cold open [00:00:00]
  - Rose is back! [00:01:06]
  - The nonprofit will stay 'in control' [00:01:28]
  - Backlash to OpenAI's original plans [00:08:22]
  - The new proposal [00:16:33]
  - Giving up the super-profits [00:20:52]
  - Can the nonprofit maintain control of the company? [00:24:49]
  - Could for-profit investors sue if profits aren't prioritised? [00:33:01]
  - The 6 governance safeguards at risk with the restructure [00:34:33]
  - Will the nonprofit's giving just be corporate PR for the for-profit? [00:49:12]
  - Is this good, or not? [00:51:06]
  - Ways this could still go wrong – but reasons for optimism [00:54:19]
- Learn more
- Related episodes
When attorneys general intervene in corporate affairs, it usually means something has gone seriously wrong. In OpenAI’s case, it appears to have forced a dramatic reversal of the company’s plans to sideline its nonprofit foundation, announced in a blog post that made headlines worldwide.
The company’s sudden announcement that its nonprofit will “retain control” credits “constructive dialogue” with the attorneys general of California and Delaware — corporate-speak for what was likely a far more consequential confrontation behind closed doors. A confrontation perhaps driven by public pressure from Nobel Prize winners, past OpenAI staff, and community organisations.
But whether this change will help depends entirely on the details of implementation — details that remain worryingly vague in the company’s announcement.
Return guest Rose Chan Loui, nonprofit law expert at UCLA, sees potential in OpenAI’s new proposal, but emphasises that “control” must be carefully defined and enforced: “The words are great, but what’s going to back that up?” Without explicitly defining the nonprofit’s authority over safety decisions, the shift could be largely cosmetic.
Why have state officials taken such an interest? Host Rob Wiblin notes, “OpenAI was proposing that the AGs would no longer have any say over what this super momentous company might end up doing. … It was just crazy how they were suggesting that they would take all of the existing money and then pursue a completely different purpose.”
Now that they’re in the picture, the AGs have leverage to ensure the nonprofit maintains genuine control over issues of public safety as OpenAI develops increasingly powerful AI.
Rob and Rose explain three key areas where the AGs can make a huge difference to whether this plays out in the public’s best interest:
- Ensuring that the contractual agreements giving the nonprofit control over the new Delaware public benefit corporation are watertight, and don’t accidentally shut the AGs out of the picture.
- Insisting that a majority of board members are truly independent by prohibiting indirect as well as direct financial stakes in the business.
- Insisting that the board is empowered with the money, independent staffing, and access to information which they need to do their jobs.
This episode was originally recorded on May 6, 2025.
Video editing: Simon Monsour and Luke Monsour
Audio engineering: Ben Cordell, Milo McGuire, Simon Monsour, and Dominic Armstrong
Music: Ben Cordell
Transcriptions and web: Katy Moore
Articles, books, and other media discussed in the show
Unpacking the saga:
- Evolving OpenAI’s structure — the letter announcing the change from Sam Altman and Bret Taylor
- Not For Private Gain — the letter that laid out all that was at risk with the restructuring
- Rob’s tweet thread breaking down the Not For Private Gain letter for a lay audience, and another thread on why we can’t claim victory from the announcement just yet
- OpenAI reversed its restructuring plans. Critics aren’t cheering. — podcast episode from Politico TECH
- Four predictions about OpenAI’s plans to retain nonprofit control by Garrison Lovely (plus this tweet)
- OpenAI claims nonprofit will retain nominal control by Zvi Mowshowitz (former guest of the show)
- Can OpenAI abandon its non-profit “purpose”? — article by Rose
- California bill to block OpenAI’s for-profit move gutted by Titus Wu
Transcript
Table of Contents
- Cold open [00:00:00]
- Rose is back! [00:01:06]
- The nonprofit will stay ‘in control’ [00:01:28]
- Backlash to OpenAI’s original plans [00:08:22]
- The new proposal [00:16:33]
- Giving up the super-profits [00:20:52]
- Can the nonprofit maintain control of the company? [00:24:49]
- Could for-profit investors sue if profits aren’t prioritised? [00:33:01]
- The 6 governance safeguards at risk with the restructure [00:34:33]
- Will the nonprofit’s giving just be corporate PR for the for-profit? [00:49:12]
- Is this good, or not? [00:51:06]
- Ways this could still go wrong – but reasons for optimism [00:54:19]
Cold open [00:00:00]
Rob Wiblin: Currently, OpenAI the nonprofit is a general partner of OpenAI LLC, which means that it can directly instruct it to do things. Under this new arrangement, it just becomes one of multiple different shareholders in this [public benefit corporation]. That is very different and I think substantially reduces its ability to control what the company does.
Rose Chan Loui: I think elements of that can still be in this new deal, as long as what the nonprofit is focused on is safe development of AGI. As long as the structure on the economic side is easier for everyone to understand, they should be willing to sign on to a deal that says regardless of stock ownership, the nonprofit board and the nonprofit are in charge of decisions that impact how technology is developed.
It’s very possible that you can have less than a majority but still have the right to make certain decisions — because what we care about in terms of control is how they develop AI.
Rose is back! [00:01:06]
Rob Wiblin: Hey, Rose. Welcome back to the show.
Rose Chan Loui: Good morning, Rob.
Rob Wiblin: For those who don’t know, I’m here with Rose Chan Loui, a nonprofit law expert at UCLA. And I’m Rob Wiblin, host of the show.
This is our fourth conversation about the saga of OpenAI trying to go for-profit. But if OpenAI is to be believed, maybe it should be the last one of these. Though I suspect perhaps it won’t be.
The nonprofit will stay ‘in control’ [00:01:28]
Rob Wiblin: Can you tell us a little bit about the big announcement that OpenAI made last night?
Rose Chan Loui: Yes, the big announcement last night is that the nonprofit will remain in control of OpenAI. This was revealed in a blog post from Sam Altman and [Bret] Taylor. So now I think Rob and I are excited to take apart the blog post and figure out what really it means to be “in control.” I think we have a lot more to figure out, and to hear from them about the details of whether this really will work.
Rob Wiblin: Yeah. Let’s dive into the details a little bit. So it reads: “OpenAI was founded as a nonprofit, and is today overseen and controlled by that nonprofit. Going forward, it will continue to be overseen and controlled by that nonprofit.” The LLC is still going to become a Public Benefit Corporation, which adds a bunch of wrinkles that we’re going to talk about later, but “the nonprofit will control and also be a large shareholder of the Public Benefit Corporation.”
And it goes on to say, “Our mission remains the same, and the Public Benefit Corporation will have the same mission” — which it’s a little bit complicated whether that is entirely true, but that’s at least the headline that they were going for. It created a massive stir.
Later on we’re going to talk about the possibility that maybe, in fact, this is not nearly so much of an improvement as they are trying to make out. And in fact, maybe almost all of the things that people were worried were going to happen might still happen under this new arrangement — but now it’s just going to be harder to see that that is happening because they’ve chosen a more surreptitious way of doing it.
But if you kind of took this all at face value, is it a huge deal, and maybe a positive thing for people who’ve been worried like we are?
Rose Chan Loui: I still think it’s a big deal, because you and I have talked about how we think that the nonprofit’s control of the entity from within the entity is a really significant asset, if you want to call it that, for the purpose. The purpose of OpenAI from the beginning was to control the development of AGI from the inside. I do think that is a big thing to announce.
Now, I think what we need to take apart or think about here is what the governance structure is going to look like and how they’re going to implement it, because that will make all the difference. I mean, we’re a little bit back to where we were a year and a half ago when the nonprofit board tried to fire Sam Altman. I think for those of us who were believers in the nonprofit structure of OpenAI, that really exposed its vulnerabilities.
So I think there’s an opportunity here. Once they have said that it will be still a nonprofit structure, where the change is going to happen for sure is at the corporate level — with the change to being a Delaware public benefit corporation, and also the changes that they have announced to the capitalisation of the for-profit part of the business. So I think there’s things to know both at the nonprofit level and at the for-profit level.
I’ll let you ask questions so that I’m not just lecturing.
Rob Wiblin: Yeah, yeah. So for those who have not been following this constantly all the time like we have, the original plan was that basically the nonprofit, in broad strokes, was going to sell its interest, its stake, its control of OpenAI, the for-profit company; and the for-profit company would go off on its own and the nonprofit would not really be able to influence it very much going forward.
I guess the big news, and it is a massive change to the plans, is that they’re saying they’re not going to do that anymore. They’re going to mix things up a little bit, but the nonprofit is still going to have control, they say, of the for-profit. And so its charitable purpose of ensuring that AGI benefits all of humanity should still have some, maybe a lot, of influence over what the for-profit company can do.
I would love to know the inside story of how it is that this change came about. They say in the statement:
We made the decision for the nonprofit to retain control of OpenAI after hearing from civic leaders and engaging in constructive dialogue with the offices of the Attorney General of Delaware and the Attorney General of California. We thank both offices and we look forward to continuing these important conversations…
Is this polite code for the attorneys general and the courts were about to kind of hand their asses to them, so they had to give up and try something different?
Rose Chan Loui: Well, it’s hard to know, not being in the room. But you remember, Rob, that the nonprofit was going to be, in my words, relegated to becoming a typical corporate foundation. They were not talking about any kind of involvement in the development of a technology by the for-profit. It was going to be one of the best resourced foundations in the world, but they never mentioned the actual purpose of that nonprofit.
So I do think that there is something to celebrate here. By saying that it will stay within and control the for-profit, I think that’s good news for us. But we’re far from done in terms of examining what goes forward. And I think the AGs have said that in fact also, in different statements by both Delaware and California.
Rob Wiblin: Yeah, in my heart I guess I expected the attorneys general to allow it to go through in some form or other. But when you think about it, the attorneys general have a good personal reason to want to intervene here, which is that they currently have oversight over OpenAI the nonprofit. OpenAI was proposing that the attorneys general would no longer have any say over what this super momentous company might end up doing.
I mean, it was just crazy how they were suggesting that they would take all of the money and then pursue a completely different purpose. It was kind of stunning. It seemed a bit too much to me. I’m kind of surprised that they thought that they would get away with that. And maybe that’s the thing that pushed the attorneys general over the edge to say no.
Rose Chan Loui: Yes. I mean, it’s what we’ve been saying for a long time. But I was just surprised that they came out with it yesterday and got there. Who knows what they were told? But I was taken aback by the speed with which it happened, so I don’t know what really pushed them over the line.
But again, they’re not saying that much yet. So I think we need to talk about how control is going to be implemented and how to make sure it’s real control.
Backlash to OpenAI’s original plans [00:08:22]
Rob Wiblin: Yeah, we definitely will talk about that in a bunch of detail. But just before that, isn’t this about-face kind of an admission that what they were doing was sketchy and not acceptable, like we’ve been saying for the last six months? Maybe this is just a selfish thing: I kind of enjoy feeling vindicated, but…
Rose Chan Loui: I know, I know, right? All the work that we’ve been doing. I’m speculating, but I don’t know if, when they first embarked on this, they anticipated all of this pushback from all these different groups. I think they’re trying to find their way to something that will clean up the economic side of it and make it less complicated, but that is still acceptable in terms of the public’s interest in this restructure.
Rob Wiblin: Yeah. Let’s talk a little bit about a bunch of the backlash that they’ve been getting recently, because I think we last spoke in very early March when this issue was still a bit under the radar — and there wasn’t really any public effort, that I was aware of at least, to object to what was going on.
But since then, I guess there was one attempt to get legislation up in California that would have blocked this. I think that didn’t really go anywhere; there were issues with the legislation itself, maybe it wasn’t that well drafted, so that one didn’t work out.
But then there was this other big letter that did make a significant splash. It was on the domain Not For Private Gain. The main drafters were Page Hedley, Sunny Gandhi, Nathan Calvin, and Tyler Whitmer. But this letter to the attorneys general was also signed by dozens of other signatories, including three Nobel Prize winners. The folks I recognised on the long list of names were Geoffrey Hinton, Margaret Mitchell, Stuart Russell, Scott Aaronson, Joseph Stiglitz, Luigi Zingales, Oliver Hart, Lawrence Lessig, and Michael Dorff, the guy who literally wrote the book about public benefit corporations.
Rose Chan Loui: And my next-door neighbour!
Rob Wiblin: Oh, really?
Rose Chan Loui: At UCLA, yeah. It was a very well-written piece, and I thought very persuasive. And then also a good number of California foundations signed, though they have a little bit of a different view of what should happen. But I think it really caused the attorneys general to realise that there are a number of constituencies out there who are very concerned.
Rob Wiblin: Yeah, yeah. The letter, when I read it, it was both very funny and so brutal to just lay out all of the facts in a very clear way. And I’d thought a lot about this in preparing for these interviews, but I’d never really conceptualised it quite clearly enough, because it is so confusing.
People could go and read the letter at notforprivategain.org if they want to take a look. It is a little bit technical and might be hard to follow if you haven’t been tracking this issue, so I actually wrote up a tweet thread that tried to explain it in very plain language for people, which I think was my most popular tweet ever. I think more than 10 million people saw it.
Rose Chan Loui: Oh, that’s great. Congratulations.
Rob Wiblin: Definitely got a decent audience. And we’ll link to that for people who would like a quick and punchy summary. It was just such a body slam on this that I kind of expected it to make some trouble for OpenAI, and for the attorneys general to read it and their eyes to open wide. But I didn’t expect it perhaps to lead to an about-face on this topic within just a few weeks.
Rose Chan Loui: Right. I think it’s the alacrity that surprised me. How quickly. Of course we don’t know what internally is going on there that might have also accelerated that.
Rob Wiblin: I guess a totally different story would be maybe this was being proposed by OpenAI the business, and the nonprofit board members hadn’t really been so engaged yet; they hadn’t maybe fully understood what was going on or dedicated the time to it. And maybe when they gave it a closer look, just as it was going through the process, they were like, “Wait, you’re proposing to do what?” Because it’s not clear why the nonprofit board would be in favour of this. It was always a little bit surprising that they would go along with it.
Rose Chan Loui: Yeah, I agree. I think that there has been so much written that the nonprofit board really did have to look at what people were saying and ask some penetrating questions of the executives at OpenAI.
It also could be that the investors want some certainty. You know, it could be that getting rid of the nonprofit’s control over the safety development issues was how they thought about cleaning up the economic side of things.
But as I’ve written before, you can clean up the economic side of things and still have the nonprofit control the issues that are really most important. They can have outsized voting rights on that and still give what it looks like they’re trying to give the investors here, which is more typical straight stock ownership of their interest in the new Delaware public benefit corp. So I think they’re trying to figure out that balance.
Now, for us who are concerned about the nonprofit going forward, it will matter a lot what governance looks like, who’s going to be on that board. I don’t know if there will be a change right now because they do have relatively new board members, who it seems have been looking at this and presumably have had a hand in developing this new proposal. So will they stay? But then what does the succession plan look like? Will there be any outside monitoring of who’s on the board?
And then I think the biggest question for me is what was shown a year and a half ago: what kind of transparency will the board have into the operations of OpenAI? Will they have full and candid information about how the company is developing technology? Because that’s the vulnerability that was revealed early last year, or late the year before.
So I think there’s an opportunity to actually really make this work. I think that’s an optimistic way of looking at it. Now that they have said the nonprofit will be in control, then what we’re hoping for is to see details that will show us how the nonprofit will actually be in control. Like, the words are great, but what’s going to back that up?
Rob Wiblin: Yeah. It was interesting: that letter, Not For Private Gain, was suggesting stuff that no one had really proposed before, which was really fascinating at the end, where it was saying it’s actually not enough just to stop this restructure into being for-profit. All these events have shown that the nonprofit board isn’t being empowered to actually complete its mission.
It was saying that the attorneys general had a right and a responsibility to intervene and say that OpenAI the business has to start providing more information; it has to be given the actual resources to scale up in order to pursue its mission. We have to make sure that there are members on the board who are not actually undermining the actual charitable purpose of the organisation — and if they are, then they need to be removed and replaced with people who are happy to go along with that mission.
So it is just the case that it’s a very difficult job that they have, and they haven’t really, to date, been resourced and empowered to pursue that mission super well. And we’re hoping that as part of this restructure that they’re proposing, they actually will get their due.
The new proposal [00:16:33]
Rob Wiblin: We should maybe just explain a little bit about what is the proposal now. Do you want to do that, or should I have a go?
Rose Chan Loui: Oh, you can. I’ll just jump in.
Rob Wiblin: I’ll do my best. So the proposal now is that OpenAI the LLC, the business part of it, is going to convert into a public benefit corporation. The nonprofit foundation is going to go from being a general partner with direct control over the company to being one of multiple different shareholders in this public benefit corporation.
Now, they say it will have control, but it’s a bit vague exactly what that is, so we’re not sure. Will they have a majority of the votes, or will there be some special agreement about the relationship between the nonprofit and this new public benefit corporation?
Rose Chan Loui: Can I jump in here? Two things: it seems like from the blog post, they went to great effort not to say that it would be the largest or a majority shareholder, because if they really are thinking that, wouldn’t you say that right up front? Because that’s big news for us.
But they say that it will be a “large” stockholder, and so then it’s not as obvious how they’re going to have control. If they’re the majority stockholder, then… And remember, right now we don’t actually know how much was given away in stock, so we don’t know exactly how much. But regardless of how much stock they have, this is what I would hope would go forward, they have control over what the business operations do.
So I think it’s really important going forward that the nonprofit continue to have rights to control, at the least, the development of AI in a way that is safe and benefits all of humanity — which is what they committed to the public. I think an interesting thing is whether the charter will continue, because that fleshes out purpose a lot more. So that’s kind of one side of it: that they can have control regardless of how much stock they have.
I mean, the stock part matters in the sense that we would like the nonprofit to be able to continue to participate in profits in the future. And I think to some extent that’s their due. They started all of this. They originally put in contributions from donors, they put in the IP, they put in the original employee base. So that’s one side of it.
And then the other side of it, just for people not knowing what a Delaware public benefit corporation is, it is not at all the same as a California public benefit corporation. The similarity in names is very confusing. A California public benefit corporation is 100% a nonprofit charitable entity. A Delaware public benefit corporation is very much a private, profit-making corporation that is allowed to consider public good, a public mission in its operations — so they don’t have to put 100% of their endeavours into profit maximisation for the investors.
Rob Wiblin: OK, so it’s going to become a Delaware public benefit corporation, which, as you’re saying, it kind of has a dual mission of both to make profit and it’s allowed to have another consideration, another goal that it has in mind. But we said in previous episodes they don’t really have to pursue the nonprofit mission very effectively, or there’s not really any recourse. No one can complain. It just gives them the option to do it should they feel like it.
Rose Chan Loui: Yeah. So that’s really important. We’ve been trying to emphasise that. It’s a very, very different entity. But they are saying that the Delaware public benefit corporation will continue to support the mission of the nonprofit. So we’re wanting to know how.
Giving up the super-profits [00:20:52]
Rob Wiblin: I guess it’s only been 24 hours since this dropped, and people have been trying to make sense of it, and the legal issues are very complex and non-obvious. But some folks are worried that this may not be as good as it looks on the tin, and there’s various different ways that the nonprofit could still end up drawing the short straw here from this change.
Rose Chan Loui: The short end of the stick.
Rob Wiblin: Yeah, exactly. We could go through a couple of them.
One is the super-profits arrangement. They’ve said directly that they’re getting rid of the previous setup where, if OpenAI earned enormous returns — became worth $10 trillion, $100 trillion — the nonprofit would receive almost all of those profits in that extreme scenario. Instead, the nonprofit will just hold normal shares: it will receive its 10% or 20% or 30% or whatever it is, the same share whether the company is super successful or only mildly successful. That means getting rid of the profit cap, under which other investors can’t earn more than, say, 100-fold on their initial investment.
So they’re directly saying they’re getting rid of that. I don’t know that’s necessarily objectionable. I don’t know that it was necessary for the nonprofit’s mission for it to have this super-profits arrangement. You could imagine a bunch of arguments for why this actually isn’t so key — and in fact their attention should be elsewhere, and they should trade off those rights in exchange for other things that they care about more.
Rose Chan Loui: Exactly, exactly. I think that that is not a bad deal. But the nonprofit’s got to get enough out of it. And for me, “enough” means they’re in control of the charitable purpose — and that nonprofit charitable purpose of ensuring that AI benefits humanity still continues to be the paramount purpose. And that they’ll have the ability to slow down, to make profits second to that primary purpose.
Because they are giving up a lot.
Now, I’m with you: when I first read about the 100x cap, I thought, when’s the nonprofit ever going to get anything out of this? There’s hardly anything at the nonprofit level right now, and there haven’t been any distributions thus far. But people who know more than I do about the prospects for really amazing profits clearly care about the removal of that cap. So if it didn’t mean anything, I don’t think they would fight so hard for it to be removed.
Rob Wiblin: There is something very funny about the removal of the profit cap in the statement that they make. I think in Sam Altman’s letter, he says this 100x maximum profit cap made sense in the past when we thought that one company might control AGI alone, but now that it’s such a competitive market, it doesn’t make sense anymore.
But actually, if it didn’t matter at all, if no one thought that they were going to earn these incredible returns, then why are all of the investors pushing so extremely hard to get rid of the cap? Obviously, it does matter to someone: it matters to the investors, which is why getting rid of the profit cap is now a requirement for a lot of their investment. It’s a little bit hard to understand why they’re saying it doesn’t matter at all, but it’s very important that we get rid of it.
Rose Chan Loui: Right, right. I think it clearly matters. I’m sure reasonable minds would differ as to whether this tradeoff is worth it. But I think if we look at what the actual purpose is of the nonprofit, that’s supposed to be number one, not profits.
Can the nonprofit maintain control of the company? [00:24:49]
Rob Wiblin: So let’s talk about this issue of control of what OpenAI the company does with its technology, whether it’s taking unreasonable risks, whether it’s deploying the technology in a way that’s broadly beneficial.
Currently, OpenAI the nonprofit, as we were saying, is a general partner of OpenAI LLC — which means that it can directly instruct it to do things. It can change the CEO, it can tell the business to do stuff.
Under this new arrangement, it just becomes one of multiple different shareholders in this public benefit corporation, this PBC. That is very different, and I think substantially reduces its ability to control what the company does. Certainly if the details are set up wrongly, then it might really struggle to control it.
Rose Chan Loui: But I think elements of that can still be in this new deal. I’m reading into it that hopefully that should be acceptable to the investors, as long as what the nonprofit is focused on is safe development of AGI. And I would say if that’s not acceptable to the investors and to OpenAI, then that means they don’t intend to go about it in the way they’re claiming they’re going to go about it.
As long as the structure on the economic side is cleaner and easier for everyone to understand, is more typical stock ownership, they should be willing to sign on to a deal that says that regardless of everyone’s stock ownership, the nonprofit board and the nonprofit are in charge of decisions that impact how technology is developed.
Rob Wiblin: That’s really interesting. How would you do that?
Rose Chan Loui: They would have outsized voting, including perhaps the right to veto — to just say, “No, you can’t go in that direction.” It wouldn’t be tied to their stock ownership; it would just be something they would have to do contractually.
It’s like when they had the operating agreements and the limited partnership agreements, there were provisions in there that said we have these rights. Now, in that case, they were also the 100% owner at that time. They will not be the 100% owner anymore. But actually, I don’t think they’ve been the 100% owner, because I’m pretty sure the employees have stock ownership. It’s just Microsoft that has these weird profits interests instead of stock ownership.
So it’s very possible that you can have less than a majority but still have the right to make certain decisions. And that’s what they really need to think about: how to do that in such a way that all of us can still feel assured that they really mean it when they say that the nonprofit will control? Because what we care about in terms of control is how they develop AI.
Rob Wiblin: Right, OK. So I understood that it was kind of difficult for the nonprofit to control what the public benefit corporation would do for a couple of different reasons. One, because they’re no longer a general partner, just one of multiple different shareholders, their only option would be to call a meeting of shareholders and then try to get a majority to fire the CEO. No, that’s not right?
Rose Chan Loui: I mean, I don’t know if they want to give up everything, but they can have what I call super-rights or outsized voting rights and otherwise rights to make decisions on certain things. And it’ll be in that lawyering of what those certain things are going to be where the rubber meets the road.
I kind of generally say any important decisions, anything that the board feels… And this is where you come in with their needing to be sufficiently resourced: they should have their own experts who can advise them on what’s dangerous, or what’s actually beneficial for humankind. You know, taking the time to think about the implications of a product or platform they’re developing. I’m trying to get across that I think they should have a significant ownership — because you want them to be in the room — but ultimately you can write that they have voting rights greater than their stock ownership would indicate.
Rob Wiblin: I see. So there’s this issue that they won’t have a majority of the financial shares in the company, but they could have outsized voting rights so that their votes basically count for more. So despite being owed a minority of the profits, they could have a majority of the voting shares that would allow them to appoint the CEO or fire the CEO, say.
Rose Chan Loui: Right. You can have different types of shares, so theirs could hold some rights that other shares don’t have.
Rob Wiblin: Yeah. It sounds like you’re saying that there’s more that you can do as well. That above and beyond that, there could be an agreement between the nonprofit and the new public benefit corporation, saying that the nonprofit can veto things within some specific domain, that if the company wants to deploy a model of some particular type with some certain capabilities, that they might have to consult the nonprofit and get their approval in order to do it. That’s the kind of agreement that you actually could make that would be legally enforceable?
Rose Chan Loui: There’s different ways to implement it, because also, to the extent they want to be able to sell shares, they don’t want to give away that. So it might be better done through an agreement. Like they could have a type of share that would have that embedded within. Like I said, there’s different ways to structure how that’s done, but it should be able to be done.
In other words, my point is just that even though they'll have a minority interest — although hopefully a significant one, because I want them to participate in the future of OpenAI — there's more than one way to get control of the things that the nonprofit cares about.
Rob Wiblin: OK, so what are those options? There’s just having super-voting rights, so you have a big majority there. And then there’s also having all kinds of different agreements that you’ve made where you could allow and disallow particular things or be able to instruct them about some specific narrow topic that is core to the nonprofit’s purpose.
Rose Chan Loui: Right, right. Corporate lawyers can probably come up with all kinds of different ways, but it's definitely… I mean, that's what I'm looking to hear more about, because from what we're hearing, they're not going to be the majority. I mean, we always said that the valuation was very difficult here, with all these investors having priority rights on profits up to 100 times right now. And then they're going to most likely remove that.
So I guess that’s where I’m optimistic, because even though they’ve not said yet how, I think there are ways to give that nonprofit control of the most important issues. I don’t know if they would give them the right to choose the CEO. They could have the right to choose at least the initial public benefit corp. I mean, they’re in charge right now, so I think that would have to happen. But then I think it’s just got to be negotiated: what else does that nonprofit have control of going forward?
Rob Wiblin: Yeah, I guess after what’s happened over the last six months, I really feel like you want to make these agreements as legally tight as you possibly can, because it is foreseeable that people might wriggle out of them in future.
Could for-profit investors sue if profits aren’t prioritised? [00:33:01]
Rob Wiblin: A concern I’ve seen raised is that the public benefit corporation ends up with the nonprofit as a shareholder and a bunch of profit-motivated shareholders — and if the nonprofit leaned on the public benefit corporation to focus mostly just on its nonprofit mission, and to not make significant profits, that conceivably the for-profit investors could sue — saying that the public benefit corporation is no longer being a good public benefit corporation, because it’s not reasonably balancing the nonprofit mission with their interest in making profit.
I imagine this is a very unusual scenario. I’m not sure whether this has really happened before. Does this sound at all imaginable to you, or is this something that people are dreaming up because this is such an abnormal situation?
Rose Chan Loui: I go back to what you said: I think it has to be very clear up front what that Delaware public benefit corporation is subject to in terms of the nonprofit's control, and over what kinds of decisions.
I mean, the investors thus far have agreed to all kinds of things that limit their ability to make decisions on behalf of the for-profit operation, so I don’t think there’s anything limiting the Delaware public benefit corporation from doing that as well. If they say up front that’s part of their mission, then that should be.
But again, good lawyering is going to be really important here, and it's got to be very clear what rights the investors don't have.
Rob Wiblin: Yeah, interesting.
The 6 governance safeguards at risk with the restructure [00:34:33]
Rob Wiblin: The letter that we were referring to earlier listed six different governance safeguards that OpenAI had been promising over five or 10 years, which it argued were at risk with a for-profit restructure.
So there was: “Profit motives are subordinate to charitable purpose.” Is that going to be protected here? I guess it depends on how it’s set up. It depends on the special voting rights that the nonprofit might have, like what agreements it makes specifically. So the devil is just really in the details as to whether it will actually be possible for the nonprofit to insist that profits be subordinate.
Rose Chan Loui: That is really important. I think that’s the control we care about. Which list are you referring to, Rob?
Rob Wiblin: Table 1. There was a table somewhat early on that listed these six things and said they're all there today. But I guess some of them were definitely going to go, and others, it was just unknown.
So then number two was: “Leadership has a fiduciary duty to advance the charitable purpose, enforceable by the attorneys general.” The ability of the attorneys general to intervene here has changed a little bit, because if the nonprofit is not pursuing its mission, they have to lean on the nonprofit to then lean on the public benefit corporation.
So whether the attorneys general do still have some recourse here depends on, again, that relationship between the nonprofit and the public benefit corporation and whether it is set up right in the first place. If it’s set up poorly, then the attorneys general could say to the nonprofit, “We want you to do your job,” and the nonprofit could say, “Actually, we can’t anymore, because we messed it up and we’ve lost the ability to tell the PBC what to do.”
Rose Chan Loui: Yeah, that’s true. I think that’s right. The AGs have the best ability to ensure that this nonprofit purpose continues and is actually implemented now, before the restructuring goes through. They will still have control over the nonprofit, but they won’t have control over the operations. It’ll be through the nonprofit that they will have the ability to monitor.
So yeah, it is really important that it be set up right. We keep saying the same thing: it will all depend on how it's set up, so that the nonprofit actually can exercise control and oversight.
Rob Wiblin: Yeah, yeah. It is unfortunate that we are quite in the dark, but the statement was quite vague. It felt slightly deliberately vague. They really weren’t tying themselves to the mast with any particular commitments about exactly how this would go, which I guess is one reason that some people are nervous that maybe they’re going to try to slip in some stuff that is better for the investors or better for the company and weaker for the nonprofit. So people have to keep their attention on it.
Rose Chan Loui: What I don’t know, Rob, is whether it’s intentional or they just really don’t know yet. They’re trying to read everybody, all the tea leaves. I think from that perspective, it’s helpful, hopefully, that people are talking about it and getting ideas out there for how this can be made to work.
I mean, I don’t think anyone wants the company to die, right? So it’s just how they can continue to do good work and continue to fulfil their purpose. They have at least recommitted to the purpose. So now we’re all waiting to see how are you actually going to follow through. And we already know that there’s vulnerabilities, as we said. So let’s see if you can make it better. Because we already know that the nonprofit board was not feeling particularly empowered. But I think if we can empower the board, I’d be OK with some cleanup on the economic side.
Rob Wiblin: Yeah. It is very interesting. It's quite unusual for the attorneys general of a state to intervene in nonprofits for not pursuing their charitable mission effectively enough, or for doing stuff that is somewhat contrary to the charitable mission.
Rose Chan Loui: Not for California.
Rob Wiblin: Oh, really? Do they intervene quite actively? I thought that usually they’re very busy. Do they really have time to intervene very often?
Rose Chan Loui: Well, I think when they do intervene is when it’s large transactions, because you do have to give notice. And they intervened here before OpenAI even gave notice, from what I understand, because they are the ones who wrote the letter and said, “Please give us these documents.”
Rob Wiblin: I was just saying because this is such a big deal, and now it’s really on the radar of the attorneys general, they’re very well aware of what is going on. They really have an opportunity here — while it’s on the agenda, while they’re up to speed — to intervene and say, “This is how you have to set it up with the public benefit corporation, because we don’t want to be cut out of the loop because you wrote some shoddy contracts here.”
Rose Chan Loui: Yeah. What probably surprised us more is that Delaware intervened and still remains interested, because they’re the ones who are generally pretty hands off. But maybe that is an indicator of how it was really a big about-face on purpose to just say, “Sorry, we don’t want to be controlled by that nonprofit anymore.”
Rob Wiblin: Yeah, they flew a little bit too close to the sun.
Rose Chan Loui: Yes.
Rob Wiblin: Coming back to that list, there’s number three: “Investor profits are capped, with above-cap profits owned by the nonprofit.” I guess that is going to go, but we think that’s maybe not the end of the world, as long as the nonprofit is compensated.
Rose Chan Loui: Right. Even though it’s interesting, because they are going to get the residual interest, so potentially huge amounts. But they were also last in line.
Rob Wiblin: Yeah. It might be that they just want 20% no matter what happens. They don’t want to be stacking all their returns in the scenarios where OpenAI is most impactful.
Then there’s: “Majority independent board commitment.” This is that a majority of people on the OpenAI nonprofit board can’t own a part of the OpenAI business. I guess there has been a bit of a loophole there where you can have a financial interest in a company’s success without necessarily owning the company itself. For example, you could own a business that supplies things to that company. So it would be great if they doubled down on this and said, “…and are not also profiting significantly in other ways.”
Rose Chan Loui: I think this is one of the things to fix, a good opportunity to fix something here — because that definition of “independence” is very narrow. That’s not the definition of independence that is otherwise used in nonprofit law. You know, when you turn in a conflicts of interest form — that I’m sure some listeners have done before — it doesn’t depend solely on your equity in the corporation that you’re on the board of. In fact, it’s often because you have interest in a partner.
Rob Wiblin: It’s true. I guess your husband or wife could own the shares, and on this grounds you would be still independent.
Rose Chan Loui: Yeah. So they need to fix that. Let’s just say they need to fix that. It does need to be at least majority independent, but they need to redefine independence.
I think the other question again is who appoints? Should there be another outside entity that maybe gets some seats on the board? That's another possibility. That way you'd have some voices that would not be easily persuaded to go along with the PBC's plans for something it seems like they shouldn't be doing.
Rob Wiblin: Yeah, that’s an interesting idea. Crazy idea: could the attorney general ask to place someone on the board so they’re involved and they can track what’s going on?
Rose Chan Loui: That’s who I’m thinking of. I don’t know if they do it, but some outside… I think one of the ideas that came out of the proposed legislation that they were trying to make go through was an independent commission made up of University of California. So maybe some academic or some AI watchdog organisation that really doesn’t have any for-profit interest in another… They have to be pure themselves. It can’t be a competitor. It has to be someone who really does not have skin in the game.
Rob Wiblin: Yeah. OK, number five was: “AGI, when developed, belongs to the nonprofit for the benefit of humanity.” I guess that would again depend on the agreements that you had between the nonprofit and the new PBC, and whether they were watertight, and under what circumstances they could intervene and so on. So remains to be seen.
Rose Chan Loui: Right, right. That was a nice thing that they had reserved at least in the deal with Microsoft. I don’t know what the deal is with everybody else. I guess it would be owned by the nonprofit right now, but it seems like going forward, whatever the PBC develops is going to belong to all the shareholders.
Rob Wiblin: Yeah. And number six was an interesting one: “Stop-and-assist commitment from Charter.” OpenAI the company has this charter that’s meant to kind of be its guiding principles, its guiding values. I think that they legally can change it, but they’ve had it for a long time from early on, and it’s supposedly been an important set of values that they’re meant to stick to and not be changing just for personal benefit.
And I think we just don’t know. They haven’t really commented on this charter and important parts of it, like the stop and assist commitment. That I think stands on its own. And I guess the PBC could pick it up, or maybe it will drop it. Maybe they’ll try to get rid of it as part of the restructure, but it’s a slightly separate issue.
Rose Chan Loui: I have to think about that some more. I guess I’m feeling like even though they keep saying that in their website, I don’t know if they would actually ever implement it. That might be a tough one for the investors. I really am not sure about this one, what I predict for that particular point. What do you think in terms of how important that is? I mean, it was a wonderful commitment to make.
Rob Wiblin: Yeah. Just to clarify for people for whom it's escaped their mind: the stop-and-assist commitment was that if another company was very close to developing AGI — I can't remember, it was like within a year or two — then OpenAI said, "We will stop competing and we won't race with them. Instead we'll assist them in producing AI collaboratively."
I mean, it has felt a bit implausible that this would happen, and it might raise antitrust issues for that matter. And yes, certainly the investors could be very upset.
Rose Chan Loui: Yes, yes. I have to think some more about that one, whether or not that’s something we should fall on our swords for.
Rob Wiblin: I guess they just haven’t commented on it very much, but it feels like if you have a commitment like that, and then as you get close to the relevant time, you’re just like, “Oh, we got rid of that now, forget about it,” it feels like you have to say something more, and try to do something in place of this thing that you previously thought was super important, but now maybe you think it’s unrealistic. You can’t just promise stuff to people, then drop it.
Rose Chan Loui: What if it resided just in the nonprofit? In other words, because this is going to be the best resourced nonprofit in the world, they could give a grant to… You know, the people who really want to make sure that that nonprofit is independent, if it truly were independent, it could do that: it could financially support another company that is a competitor of OpenAI’s.
Rob Wiblin: The underlying goal here isn’t to produce a merger and ensure that there’s no competition. It’s to prevent a race and it’s to prevent premature deployment. It’s to prevent people feeling the pressure to deploy stuff that they themselves might feel uncomfortable with, and they might feel that it’s dangerous, but they’re like, “We have to do it in order to make money, because otherwise we’ll become irrelevant.”
And so you could have something that’s lesser than stop and assist, more like, “A core value of ours is that, as we approach incredibly powerful models, we will do everything that we can to coordinate with other companies to ensure that none of us feel pressure to deploy stuff that we do not feel entirely comfortable with.” That surely must be legal.
Rose Chan Loui: There have been efforts to do that, right? I think maybe led by Elon Musk, where he tried to pull all of them together to… And I think, again, talking to former OpenAI staff, one of the reasons they think it is so important for the nonprofit to remain within the company is that they saw OpenAI as a role model, and that that still can be an important role to play. We can't necessarily directly control what other companies do, but there's a lot of value, for people who care about how AI is developed, in OpenAI continuing to keep this role model position. So it would both control what's happening within the company and also act as a leader for caring about safe development of AI.
Will the nonprofit’s giving just be corporate PR for the for-profit? [00:49:12]
Rob Wiblin: It seems like the nonprofit is giving up some degree of control, and it’s giving up the super-profits, and it’s going to get something in exchange for that. I think it’s probably going to ask for or receive some amount of money that it can use for grantmaking. Unfortunately, I think the early indications suggest that those grants might be more like standard corporate philanthropy, which are designed to make the company look good and to appease different stakeholders that might be threatening the company and to buy out people who are complaining.
So that is a little bit of a shame, if that is the way that things go: that it receives billions of dollars and it basically just becomes a PR effort.
Rose Chan Loui: I agree. There’s definitely remnants of that in this recent blog post.
Rob Wiblin: It says: “[The nonprofit] will become a big shareholder in the PBC, in an amount supported by independent financial advisors, giving the nonprofit resources to support programs so AI can benefit many different communities.”
It sounds like what they might do is use that money to basically buy services from OpenAI the company, and then provide them to different groups that they think are disadvantaged or that they want to help to support science or whatever else. I guess that would be the slightly cynical, slightly pessimistic take on how they’re going to use the resources. And there’s worse things, but it doesn’t really feel like it was consistent with the original vision for what the nonprofit would be prioritising.
Rose Chan Loui: If they want to do that on top of their specific, most important purpose, then I think that’s OK. As long as that important purpose doesn’t go away, fine, give services away, give some grants out. But remember your actual purpose, which is to make sure that the work is done well and safely.
Is this good, or not? [00:51:06]
Rob Wiblin: I’ve been very critical of the previous plans and I guess I’m a little bit cynical and a little bit nervous about whether this is going to be as good as it sounds. But to give them their due, I wanted to steelman the changes a bit.
So unlike with the previous thing, the nonprofit board still retains significant control. It might hopefully hold the majority of the votes, or at least be a major shareholder in the PBC. So it's retaining either some or a lot of control.
And I guess as the number of actors contributing to OpenAI grows and the amount of resources that it wants to bring in expands to the hundreds of billions of dollars of investment, maybe it’s not so shocking that the nonprofit might have to loosen the controls on the company a little bit.
Whether that is consistent with their mission does depend on whether the board believes that OpenAI is safer and a better company for humanity to develop AGI than the competitors. Many people are sceptical of that, and I think given OpenAI’s shaky record, I’m not sure that they really are the better option than other competitors that might achieve AGI if they don’t. But if you think that the board does believe that OpenAI is the best bet, and I guess OpenAI has done some good stuff for safety in the past, then it might make sense for them to loosen the constraints somewhat in order to remain competitive. Not so crazy.
Rose Chan Loui: But again: not its purpose, right?
Rob Wiblin: Yeah. I guess we’ve got to be very clear that the purpose of OpenAI the nonprofit is not to develop AGI.
Rose Chan Loui: It will be a constant balancing act for that nonprofit board. If they do believe that they’re the best at it — which, again, some people will question — but if they do believe it, they’ll still constantly be balancing their desire to be first with the need to be careful. But that’s something that boards do.
Rob Wiblin: Yeah. They’re taking on a position of great power, and that is challenging. They’re going to have to do a difficult job, but they should rise to the occasion.
Rose Chan Loui: Yeah, it will come with a lot of responsibility. If they really do go forward with this plan, and they really do give some real control to the nonprofit, the nonprofit board members are sitting in really important, critical positions. I think then what we have to think about is: do we need outside monitoring or involvement in order to satisfy the regulators?
Rob Wiblin: Yeah. I would feel a lot more confident about all of this if the attorneys general maintained their interest and were actively tracking what was going on and ensured that the nonprofit was pursuing its mission to the greatest extent possible, rather than being intimidated or being deprived of resources and so on.
Rose Chan Loui: I think you’re very right. You raised a good point earlier about the nonprofit needing its own resources to help its nonprofit board members fulfil their fiduciary duty to the purpose. It’s to the purpose that they have a duty, not the company.
Ways this could still go wrong – but reasons for optimism [00:54:19]
Rob Wiblin: Right, exactly. So other concerns that people have raised: there's the possibility of the public benefit corporation being sued by for-profit investors. But that doesn't happen that much. And would they really succeed in court? I don't know. It sounds like it's a worry worth raising, but maybe not such a massive issue.
Then there’s the issue that the nonprofit board may or may not be able to tell the public benefit corporation what to do, like it currently can to OpenAI. But it’s unusual for owners or general partners to intervene in specific business decisions of a company that they control. In general, the way that they exercise control is to remove the CEO if they don’t like them and to threaten to remove the CEO if they don’t do what they want. So maybe the fact that they can’t intervene in specific decisions, they were never going to do that anyway. So that’s not a huge loss to give up that option if the agreement doesn’t include it.
Rose Chan Loui: You mean the CEO part?
Rob Wiblin: Well, I’m saying if the nonprofit can no longer direct OpenAI to do specific things with its technology, then maybe they in practice weren’t doing that anyway. So to lose that ability over the public benefit corporation and instead only be able to remove the CEO, perhaps that’s not such a big sacrifice.
Rose Chan Loui: But isn’t that one of the things we would hope that they would ask for? Outsized control over those kinds of decisions?
Rob Wiblin: I see. Right.
Rose Chan Loui: I guess what you’re saying is the alternative would be to have the ability, even without being the majority stockholder, to remove the CEO. I would think a more workable way might be that they actually do have oversight and then the ability to stop something or to slow down something that’s going on at the operational level, that you would normally not have the ability to do just as a shareholder.
Rob Wiblin: Yeah, that’s really interesting. I guess I came into the interview not realising that that was such a viable option.
It sounds like in order to satisfy the investors, to give them something, the nonprofit could say, “We’re willing to narrow our scope. We’re not going to be intervening in product decisions and normal product launches, the kind of stuff that actually brings in tonnes of revenue right now, because that’s not our concern. Our concern is with the cutting-edge science and whether that could go wrong, and things that are tremendously powerful, much more powerful than the products that most people and businesses are going to be wanting to buy. That’s where we have the concern. And we only want to be saying, ‘We don’t feel comfortable with launching this right now. We want to delay it by some number of years while we study what’s going on better.'”
And fingers crossed the investors would be willing to tolerate that, because it shouldn’t be such a large loss of actual business potential.
Rose Chan Loui: Not if OpenAI at the same time is saying that they're going to comply with that purpose. Because it seems like asking for that, if you're already giving them the typical shareholder interest, but with just this carveout that this nonprofit exists to protect the public from bad AI, whatever that is… I mean, I think what's going to be hard and challenging is writing this out. We all probably have different ideas, and then it's going to involve lawyers working with AI experts, because I couldn't do it without a lot of help.
But what I think the nonprofit would be saying is, “Look, we’re cleaning up all the economic side of things, we are giving you very typical ownership rights — but we need to hold on to this purpose, and it needs to still be primary over immediate profit.” I think that needs to be part of it somehow or else we’re back to, well, it’s just words.
Rob Wiblin: Yeah, yeah. So are the changes that they’re proposing — where it becomes a public benefit corporation and maybe they make all of these agreements — is that something that the attorney general has to sign off on and actively approve before it happens? Or I guess in practice, given that they’re taking such a big interest, they probably will ask for their permission to do it?
Rose Chan Loui: I think they’re going to want to see the whole thing, because kind of everything is tied together. Because again, that is a change. And remember, I think you went through this, the previous structure had all these bells and whistles to make sure that the nonprofit was in control. So that’s all being given up, but it needs to be replaced with something that will still protect mission while opening up on the economic side who owns the profits that OpenAI is going to make.
That’s why I think the nonprofit has some power here. They’re giving up a lot, but the investors will get something out of this: it won’t be such a weird structure.
Rob Wiblin: Yeah. The thing that I’m optimistic about here is that it’s not as if the nonprofit is against the investors making money or against the business thriving and being profitable. Everyone’s totally fine with that. Can’t they come up with some agreement that allows both sides to get 80% or 90% of what they most care about? I think that they should be able to, if they’re creative enough and they really try.
Rose Chan Loui: Yeah, I do too. And that’s why I’m saying that the investors should be willing to get this done, because the nonprofit’s primary goal is just to make sure that things are done safely and that they’re not skipping safety protocols that are important. Like you said, they don’t care about every little product that goes out there. It’s really the next big, huge development, and thinking through what the risks are of whatever it is. I’m talking in the abstract.
Rob Wiblin: All right. Well, it’s huge news. And overall, I feel like we definitely have a very good potential to be on a much better path now. They’ve recommitted to the mission. The devil is definitely going to be in the details, and they haven’t given us many details yet, so there’s definitely reason to worry and reason to keep a lot of eyes on this. But it’s great that the attorneys general are so engaged. There’s potential for an agreement that everyone is mostly happy with. So fingers crossed they work on it.
I guess one day, maybe down the line, we’ll find out what the internal discussions were about this a little bit more. But I expect that the board members are reading the things that people are writing, and taking a great interest in it. And I imagine, as I’ve said throughout this entire thing, that many of them are really sincerely trying to do their job. It’s challenging, but I hope they put in the hours.
Rose Chan Loui: Yeah. I think we have reason to be optimistic. Of course, that’s my personality. But I saw it as an encouraging announcement. Clearly our work is not done, but really, I think there’s a lot of opportunity to make this work.
Rob Wiblin: Yeah. OK, cool. Is there anything you want to say before we wrap up?
Rose Chan Loui: No. I think everyone should just keep their eye on it, and that there’s definitely more to come.
Rob Wiblin: I suspect we will be back. Whenever they explain the plan in a bunch more detail, we can evaluate whether it actually is protecting the nonprofit’s mission.
Rose Chan Loui: Correct. And just encouraging the attorneys general to do that as well, to keep the nonprofit’s interests and its purpose in mind.
Rob Wiblin: They’ve done a great job so far. All right, we’ll chat soon. Have a good day.
Rose Chan Loui: Thanks Rob! You too. Take care.
Related episodes
About the show
The 80,000 Hours Podcast features unusually in-depth conversations about the world's most pressing problems and how you can use your career to solve them. We invite guests pursuing a wide range of career paths — from academics and activists to entrepreneurs and policymakers — to analyse the case for and against working on different issues and which approaches are best for solving them.