Emergency pod: Don’t believe OpenAI’s “nonprofit” spin (with Tyler Whitmer)
By Robert Wiblin · Published May 15th, 2025
On this page:
- Introduction
- 1 Articles, books, and other media discussed in the show
- 2 Transcript
- 2.1 Cold open [00:00:00]
- 2.2 Who's Tyler Whitmer? [00:01:35]
- 2.3 The new plan may be no improvement [00:02:04]
- 2.4 The public hasn't even been allowed to know what they are owed [00:06:55]
- 2.5 Issues beyond control [00:11:02]
- 2.6 The new directors wouldn't have to pursue the current purpose [00:12:06]
- 2.7 The nonprofit might not even retain voting control [00:16:58]
- 2.8 The attorneys general could lose their enforcement oversight [00:22:11]
- 2.9 By default things go badly [00:29:09]
- 2.10 How to keep the mission in the restructure [00:32:25]
- 2.11 What will become of OpenAI's Charter? [00:37:11]
- 2.12 Ways to make things better, and not just avoid them getting worse [00:42:38]
- 2.13 How the AGs can avoid being disempowered [00:48:35]
- 2.14 Retaining the power to fire the CEO [00:54:49]
- 2.15 Will the current board get a financial stake in OpenAI? [00:57:40]
- 2.16 Could the AGs insist the current nonprofit agreement be made public? [00:59:15]
- 2.17 How OpenAI is valued should be transparent and scrutinised [01:01:00]
- 2.18 Investors aren't bad people, but they can't be trusted either [01:06:05]
- 3 Learn more
- 4 Related episodes
OpenAI’s recent announcement that its nonprofit would “retain control” of its for-profit business sounds reassuring. But this seemingly major concession, celebrated by so many, is in itself largely meaningless.
Litigator Tyler Whitmer is a coauthor of a newly published letter that describes this attempted sleight of hand and explains how regulators can stop it.
As Tyler explains, the plan both before and after this announcement has been to convert OpenAI into a Delaware public benefit corporation (PBC) — and this alone will dramatically weaken the nonprofit’s ability to direct the business in pursuit of its charitable purpose: ensuring AGI is safe and “benefits all of humanity.”
Right now, the nonprofit directly controls the business. But were OpenAI to become a PBC, the nonprofit, rather than having its “hand on the lever,” would merely contribute to the decision of who does.
Why does this matter? Today, if OpenAI’s commercial arm were about to release an unhinged AI model that might make money but be bad for humanity, the nonprofit could directly intervene to stop it. In the proposed new structure, it likely couldn’t do much at all.
But it’s even worse than that: even if the nonprofit could select the PBC’s directors, those directors would have fundamentally different legal obligations from those of the nonprofit. A PBC director must balance public benefit with the interests of profit-driven shareholders — by default, they cannot legally prioritise public interest over profits, even if they and the controlling shareholder that appointed them want to do so.
As Tyler points out, there isn’t a single reported case of a shareholder successfully suing to enforce a PBC’s public benefit mission in the 10+ years since the Delaware PBC statute was enacted.
This extra step from the nonprofit to the PBC would also mean that the attorneys general of California and Delaware — who today are empowered to ensure the nonprofit pursues its mission — would find themselves powerless to act. These are probably not side effects but rather a Trojan horse that for-profit investors are trying to slip past regulators.
Fortunately this can all be addressed — but it requires either the nonprofit board or the attorneys general of California and Delaware to promptly put their foot down and insist on watertight legal agreements that preserve OpenAI’s current governance safeguards and enforcement mechanisms.
As Tyler explains, the same arrangements that currently bind the OpenAI business have to be written into a new PBC’s certificate of incorporation — something that won’t happen by default and that powerful investors have every incentive to resist.
Without these protections, OpenAI’s newly proposed structure wouldn’t “fix” anything. It would be a ruse that preserves the appearance of nonprofit control while gutting its substance.
Listen to our conversation with Tyler Whitmer to understand what’s at stake, and what the AGs and board members must do to ensure OpenAI remains committed to developing artificial general intelligence that benefits humanity rather than just investors.
This episode was originally recorded on May 13, 2025.
Video editing: Simon Monsour and Luke Monsour
Audio engineering: Ben Cordell, Milo McGuire, Simon Monsour, and Dominic Armstrong
Music: Ben Cordell
Transcriptions and web: Katy Moore
Articles, books, and other media discussed in the show
Tyler’s work:
- Not for Private Gain: Updated letter out today, May 15 — a followup to the original April 2025 open letter to the attorneys general of California and Delaware
- Legal Advocates for Safe Science and Technology — Tyler’s nonprofit dedicated to ensuring that advances in science and technology benefit society without compromising on safety
OpenAI context:
- May 5 announcement: Evolving OpenAI’s structure
- December 27, 2024 announcement: Why OpenAI’s structure must evolve to advance our mission
- OpenAI’s current structure and Charter
- OpenAI’s for-profit overhaul is far from being a done deal — article in Bloomberg (paywalled)
- OpenAI claims nonprofit will retain nominal control by Zvi Mowshowitz
Our other episodes on this topic:
- Did OpenAI give up, or is this just a new trap? (with Rose Chan Loui) (May 2025)
- Judge plants a legal time bomb under OpenAI (with Rose Chan Loui) (March 2025)
- Elon tries to crash OpenAI’s party (with Rose Chan Loui) (February 2025)
- Rose Chan Loui on OpenAI’s gambit to ditch its nonprofit (November 2024)
Transcript
Cold open [00:00:00]
Tyler Whitmer: I think there’s memes out there in the press that this was a big shift when this announcement came through — and I think there’s ways this could turn out well, but I don’t think that having the view that this was a big shift is really the right way to be thinking about the situation right now.
I think keeping the pressure on is the right way to respond to this, rather than think, “Oh good, we won. The nonprofit’s in control. We can all sleep easy at night.” I think there’s still a lot of work to be done. You know, there’s a way that this moves forward where you’re on a slippery slope to the profits becoming the end instead of the means, and we want to make sure that they remain the means and not the end.
It’s why OpenAI set themselves up the way they did to begin with, right? They understood that this was a unique situation, and they put in a unique corporate structure, a very thoughtful way of trying to put these safeguards around it. So we’re just really concerned with trying to protect those now.
Rob Wiblin: If someone was just trying to steal a whole lot of stuff from you, and then after a lot of public pressure, they decide that they’re not going to try to steal that stuff from you anymore; they are going to propose a different plan — a different, very complex plan that you find it quite hard to understand — I think that it remains sensible to have a defensive posture here and not to assume that without any further scrutiny or any further discussion, that the better angels will win out and all of these great things will happen by default.
Tyler Whitmer: This is also an opportunity for them to do better, like find things that aren’t in the existing Charter that would be good for the world and then write them into the articles.
The nonprofit board members owe a special fiduciary duty to humanity, as the beneficiaries of OpenAI’s charitable purpose to ensure that AGI is safe and benefits all of humanity. That mission applies to what they do right now, in approving whatever restructuring goes through.
Who’s Tyler Whitmer? [00:01:35]
Rob Wiblin: Today I’m speaking with Tyler Whitmer. Tyler was a partner at a law firm called Quinn Emanuel for many years, and more recently has been the founder of Legal Advocates for Safe Science and Technology.
But we are speaking with him today because he’s a coauthor on a letter to the attorneys general of California and Delaware, which should be coming out approximately today, about OpenAI’s announcement that its nonprofit is supposedly going to retain control of the business — which the letter argues is very much not necessarily true by default. Welcome to the show, Tyler.
Tyler Whitmer: Thanks for having me, Rob. Appreciate it.
The new plan may be no improvement [00:02:04]
Rob Wiblin: So my goals today are to explain to everyone, as clearly as possible, why it is that OpenAI’s big announcement last week, that it supposedly wasn’t going for-profit anymore, why that was probably very misleading — and in your view, maybe it was actually no improvement on the plans at all.
Then we’re going to say what it is that the attorneys general of California and Delaware and the nonprofit board members should insist on OpenAI doing to ensure that the public interest gets defended.
To launch us in, your letter says, “The updated proposed restructuring is identical in most respects to the original proposal.” Why do these new plans not necessarily make much difference?
Tyler Whitmer: I think it could be useful to take a step back and look a little bit at the status quo on the issues that we focused on in the letter and we’re going to focus on today, which is that the nonprofit has very direct and immediate control over OpenAI’s for-profit. And they made an announcement towards the end of last year that they were going to change that in a bunch of different ways. And then on Monday, May 5, they made an announcement that the nonprofit was going to retain control of the for-profit entity.
And I think the real crux of that is: what does it mean to retain control? By default, the control that they might retain really doesn’t approximate the status quo at all. And that’s what we’re concerned with.
Rob Wiblin: I see. So currently, as I understand it, the nonprofit entity has basically total control over the business — which is an LLC; occasionally we’ll call it OpenAI LLC, but basically that is just the business part of it. It’s a subsidiary of the nonprofit. They just get to control it completely.
Why is it that on the new plans, the nonprofit will probably lose the ability to actually direct what the business does?
Tyler Whitmer: So both the original restructuring plan and the revision that they announced last week — the LLC that you just described, which is a limited liability company in Delaware — would convert into what’s called a Delaware public benefit corporation.
And the way that it’s structured right now, the nonprofit is the manager of the LLC and has direct and immediate control of everything that the LLC does. If the LLC converts into a public benefit corporation, then “control” could mean something like the nonprofit retains the ability to elect the board of directors of the public benefit corporation, but it would not have the same kind of direct and immediate control that it would have with the LLC setup.
Rob Wiblin: Yeah. And electing the board of directors, why isn’t that a sufficient level of control?
Tyler Whitmer: One way to think about it is how is this actually operationalised in areas that we care about? For example, if OpenAI LLC currently were planning to do something like release a model that appeared to be unsafe, the nonprofit would have the power now to just step in and stop that from happening very directly. And that’s a product of the immediate control it has by virtue of being the manager of the LLC.
In the public benefit corporation situation, using that same hypothetical situation, if there’s a model that’s about to be rolled out that’s unsafe, the nonprofit would not necessarily have the ability to step in and stop that from happening immediately. They would be able to elect a board that they would hope would do the thing that they want to do in that situation and keep that unsafe model from being rolled out, but they wouldn’t actually be able to step in and stop it immediately.
Rob Wiblin: I see. And from reading your letter — which people can go and read; it’s only a few pages, and it’s very entertaining, and I think a pretty straightforward thing to understand — they actually might not even be able to do that. It’s unclear, in fact, that they would have almost any influence potentially over what the business could do by default.
One thing that I didn’t understand until recently is that the reason that OpenAI the nonprofit has so much control over the business today is that they have this whole set of binding legal agreements that were instituted at the outset of this entire arrangement, which basically are quite nonstandard, and they give the nonprofit the ability to intervene in all of these ways and to have this kind of strong level of control.
And if the LLC were to convert into a Delaware public benefit corporation — which I guess has actually always been the plan, both before and today — potentially all of those contractual agreements that were very carefully arranged in 2019 to ensure that the nonprofit mission would take priority over making money, that would all basically be swept away by default. So you would be reverting to a situation where the nonprofit board would have a more typical level of control that a shareholder might have within a company, which is radically less than the kinds of direct managerial control that they have over it today.
The public hasn’t even been allowed to know what they are owed [00:06:55]
Rob Wiblin: And one crazy thing is, this agreement between the nonprofit and OpenAI the business — that is the asset that the nonprofit has, that allows it to pursue its charitable mission, that allows it to pursue its goal of ensuring that AGI benefits all of humanity; that is the chip that it has to play, it’s the way that it defends the public interest — that document itself is not public. The public does not know what it owns, what rights it has. And in fact it could lose all of the things that are in that document and maybe not even realise, because it’s never actually been declared what kinds of rights the nonprofit has as the owner of the business. Have I understood this right?
Tyler Whitmer: Yeah, that’s right. The operating agreement of the LLC that is the current for-profit entity is not a public document. OpenAI has described that document on their website in some amount of detail — and it’s sort of enough detail that you get the sense that it is a bespoke, unique situation, where there is an awful lot of control and power given to the nonprofit board.
And LLCs are sort of designed to be an extremely malleable situation, so there’s more flexibility with what you can do with an LLC than what you can do with a PBC. So one of the concerns that we have is that, in converting from an LLC to a PBC, you remove some of that optionality. There’s more you can do to structure the LLC exactly how you want it than with a PBC, so you’re taking away some of that ability to structure it just the way you want it.
One thing that a PBC could be good for, though, is that the equivalent of the operating agreement for a PBC would be the articles of incorporation or the certificate of incorporation, and that would be a public document.
So to your point, we know a little bit about what’s in the LLC agreement by reading OpenAI’s website. We don’t know everything about it. I think it would be really good for the world if we were able to hold up what’s in that LLC agreement and the kind of control that exists there, and hold that up against whatever they put into — if they go through with the restructuring — the PBC’s certificate of incorporation, and make sure that that’s what’s happening with the PBC, and the relationship between the PBC and the nonprofit gives the nonprofit at least as much control over the PBC —
I shouldn’t say it this way. It’s not so much about control every time. I think what’s really important to us here is that there be the primacy of the nonprofit’s mission, which is to ensure that AGI is safe and benefits all of humanity, and that that mission and the primacy of that mission is enforceable by someone with the ability to really enforce it.
Right now, the situation is that the LLC agreement has been written in a way — we’re told by OpenAI on their website — that enforces that primacy of the mission. And then because it’s a nonprofit, the attorneys general of California and Delaware, the reason we’re writing the letters to them is that they have very direct oversight and ability to regulate the nonprofit, because it’s a nonprofit.
So we want to make sure that whatever happens with the restructuring, we end up in at least as good a situation as we are now. And I think you’re right that having some transparency into what the actual details were of that LLC agreement could help us do that.
Rob Wiblin: Yeah, I feel like I’m taking crazy pills when I say that the whole thing that’s under dispute here is actually not public. It really feels like it’s a bare minimum for the attorneys general to insist on that — given that there’s so much public interest in this, it is such a matter of enormous public concern. And it’s impossible for us to assess whether in fact the nonprofit is being ripped off, whether its mission is being pursued, without knowing what the status quo situation is, without knowing what rights and what control they currently have.
Surely that agreement, or at least large parts of it, should be made public, so that people can assess whether the new situation that’s being proposed is as good or better, or actually worse and maybe shouldn’t be legal.
Tyler Whitmer: Yeah, I think it would be great for them to make that more transparent. I think that would be good for the world for sure.
Issues beyond control [00:11:02]
Rob Wiblin: Nice. One other thing, just quickly before we go on and explain more of the details here, is that your letter focuses basically on this question of control and managerial control and primacy of the nonprofit’s charitable purpose of ensuring that AGI benefits all of humanity.
But there’s two other things that are changing here, either definitely or almost certainly: one, that the profit caps on future investors and probably existing investors will be removed — which is a big loss to the nonprofit, because previously those profits would have gone to them. And also currently I think it is probably the case that the nonprofit has basically the right to operate and make decisions about how any AGI that OpenAI develops is used and applied, and that is almost certainly going to go out of the window.
These are big concessions from the nonprofit that they’re making. And in order for this to make sense in pursuit of their nonprofit mission, they ought to receive some substantial compensation for that.
But we’re going to set those issues aside in order to just consider these issues of control, which are themselves both important and quite challenging to understand.
The new directors wouldn’t have to pursue the current purpose [00:12:06]
Rob Wiblin: Before we get to the solutions, let’s lay out precisely why it is that the nonprofit would, by default, basically lose the ability to determine what OpenAI the business did on the plan that OpenAI has sketched out. And this was true before the announcement last week and remains true after that announcement.
In the letter, there’s three issues that you raise. The first one is, “OpenAI’s business directors would not have a fiduciary duty to advance the charitable mission over investor interests.” Can you explain that? And what duty would they have instead in this new world?
Tyler Whitmer: Yeah. So with a nonprofit, the nonprofit directors have a special fiduciary duty to the beneficiaries of the nonprofit’s purpose, the mission of the nonprofit. And that’s very clear, and that’s enforceable by the attorneys general. With a public benefit corporation, even if you wrote precisely the nonprofit mission into the public benefit mission of the PBC, by statute in Delaware, the PBC’s directors still have to balance that mission with the financial interests of their shareholders, and there is no special fiduciary duty to the beneficiaries of the PBC’s mission the way there is with a nonprofit.
So you really are changing the duties that are owed to the public by changing it from a nonprofit to a PBC.
Rob Wiblin: Right. So to conceptualise it: you’ve got the nonprofit at the top here as a shareholder, or having some sort of voting rights over the new public benefit corporation — the PBC, this new entity. Now the nonprofit has its own charitable purpose and it’s got its own duties, which is to pursue that charitable purpose as much as it can. In this new situation, you have the PBC under that, but it has its own quite separate obligations that it must pursue — that its directors must pursue, and that I guess its staff must pursue by extension.
I think we’re not even sure whether it’s actually the case that the nonprofit would get to determine the board of directors of the new business, but let’s say that it does have the ability to fire and replace the directors of the business. Let’s say that they started doing things that were excessively pursuing profit and not pursuing the public benefit, in the views of the nonprofit. The nonprofit could, say, vote to remove the directors or remove some of the directors, and replace them with other directors who it thought in their hearts wanted to pursue the nonprofit’s purpose of ensuring that AGI benefits all of humanity.
But as soon as those people are directors of the business, they have new different legal obligations from that. Even if they wanted to pursue the nonprofit’s mission of ensuring that AGI benefits all of humanity, in their role as directors, they have to follow the fiduciary duties of the PBC, of the public benefit corporation — which is to balance profit against the other purpose of the organisation. I guess we don’t know exactly how that will be stated. So even if they wanted to stop something that they thought was too harmful to the world relative to the amount of money that was made, they might not in fact be entitled to do that.
And you can imagine the nonprofit keeps firing the directors and replacing them with new people, and every time they just have to do the same thing: they now have to pursue the purpose of the public benefit corporation. If some of them didn’t, and in fact they instead just went with the purpose of the nonprofit that had put them there, then they would be at risk of getting sued by the for-profit investors, who would say, “We’re getting ripped off here. You have to follow the purpose of the public benefit corporation.”
Have I understood all of this correctly from the letter?
Tyler Whitmer: Yeah, I think that’s about right. And there’s things you can do — like, as I said, the certificate of incorporation of the PBC, that could change that. But if you just did an “off-the-rack,” as they say, PBC certificate of incorporation, the description that you just gave is accurate.
One way to think about it is: right now, with the LLC setup, the nonprofit basically has its hand on the lever and controls things directly. With an off-the-rack PBC setup, you’re going to make that a situation where the nonprofit doesn’t have its hand on the lever anymore, but it gets to decide whose hand is on the lever. But there’s still that one-step distance from actual control.
And that is enough, in our view, to really put at risk what’s important here: a legally enforceable primacy of the nonprofit’s mission over the interests of shareholders and any other interests that might be there. Once there’s that one step removed — from the hand on the lever versus controlling who has their hand on the lever — it disrupts that enough that it’s a problem for us.
The nonprofit might not even retain voting control [00:16:58]
Rob Wiblin: Yeah, right. The second point you raise is that OpenAI nonprofit’s board would have substantially less control after this than it does today. We’ve somewhat covered that. Is there anything more to say about exactly how it would be losing control?
Tyler Whitmer: I mean, I think it’s what you just described, which is really that it is no longer directly controlling things. It is maybe having power over who controls the thing. Ordinarily, depending on how this is set up, like we assumed in the letter that the nonprofit would retain what’s called “voting control.” So they would have the ability to elect the directors.
There’s some credible reporting out there that it’s possible they would set it up to where the nonprofit doesn’t even have that power, or doesn’t have power to, for example, hire and fire the CEO. Normally a controlling shareholder would have the power to elect the directors, and the directors would have the power to hire and fire a CEO. Obviously that’s the kind of control you would need at a very, very bare minimum, and would not be enough to approximate the current situation.
Rob Wiblin: It is interesting. So we’re assuming that they’re not going to by default go out of their way to make these binding, bespoke, unusual arrangements that would preserve the current level of control that the nonprofit has.
But I feel like that’s a reasonable assumption, because one, they haven’t said they’re going to in any way; they’ve never indicated that is going to be the case. And two, so many of the different interest groups here would really like that not to be the case. All of the for-profit investors who have tens, hundreds of billions of dollars at stake and I think are heavily involved in this process would really love to remove all of those limitations. And I suspect that there’s a good chance that people who have that level of motivation might definitely get their way.
You say in the letter:
We assume below that OpenAI-nonprofit would have voting control of OpenAI-PBC’s board in some way. What such control would entail appears to be unsettled, however. For example, credible reporting states that it has not yet been decided whether OpenAI-nonprofit’s board would have the power to fire OpenAI-PBC’s directors or executives.
I mean, I guess they’ve said this very vague thing that the nonprofit will “retain control.” Could they really pass off minority voting rights without even the ability to change the board of directors — like the main thing that shareholders in general are able to do — as that would be retaining control? That would just feel like the most flagrant attempt at fraud, and I find it hard to believe that the attorneys general would ever sign off on such a crazy arrangement.
Tyler Whitmer: I agree with you. I think most folks who are familiar with the way businesses work, and certainly most lawyers, when you read “retain control,” the baseline that you would assume is either by a majority vote or by some kind of special class of shares voting that they would have the ability to hire and fire directors. That seems like the absolute baseline.
So I agree with you. I would be shocked, and it would be pretty egregious if they tried to do something less than that. Although again, I think this is an article in Bloomberg that is referenced in the letter where it at least seemed like that might still be on the table — which is pretty shocking.
Rob Wiblin: I guess if you were being really sneaky, you could try putting forward that as your initial proposal, and then maybe that’s the thing that gets disputed. It’s like making an outrageous initial bid basically in the hope that then shapes people’s baseline expectations, even if you don’t actually expect to get it through.
Tyler Whitmer: Yeah, that’s why we’ve been trying to frame everything up as best we can. In the original Not for Private Gain letter that came out in the middle of April, we really walk through the very intentional process that OpenAI went through to sort of bind themselves to the mast here, to put in all these governance safeguards based on the current structure — with the LLC and the nonprofit is directly controlling the LLC — so that they could really be sure.
This is like a years-long process that they went through to button this up as much as possible, to make sure that the nonprofit mission took primacy over all other interests. You know, they’re running a business that generates revenue, and it’s really easy for that to start to take over. And they really went out of their way to handcuff themselves here, to say, “We have to set this up so that the nonprofit mission always takes primacy over the profit interests here, and that the nonprofit is sitting at the top of this.”
Which again, gives the attorneys general oversight over the entire situation, which are well resourced. It’s law enforcement, right? Like they’re giving themselves over to an outside force that absolutely has the power and ability to make sure the mission takes precedence.
And for what it’s worth, the reason we’re having this conversation right now is because that’s the case, right? If the nonprofit wasn’t the nonprofit, and if the attorneys general didn’t have the oversight authority that they have because the nonprofit sits where it does in the corporate structure right now, we wouldn’t be having this conversation — because they could just make all these changes and no one would have anything to say about it, right?
So the fact that we’re having this conversation is part of the reason we think it’s so important for the nonprofit to stay in the position that it’s in now, functionally — even if the letters after the entity names change.
The attorneys general could lose their enforcement oversight [00:22:11]
Rob Wiblin: Right. That leads perfectly into the third point you make, which is: “You, as Attorneys General, would have limited enforcement oversight.” Can you explain why it is that by default the attorneys general would not be able to weigh in on what OpenAI the business was doing, if they thought it was conflicting with the nonprofit’s charitable purpose?
Tyler Whitmer: Yeah. This will rhyme a little bit with the analogy of hand on the lever versus deciding whose hand is on the lever. The attorneys general have oversight over nonprofits by virtue of the fact that nonprofits don’t have shareholders; they don’t have other stakeholders that could make sure that they’re adhering to their charitable purposes. For that reason, the state attorneys general are sort of set up as the overseers of nonprofits at the state level.
And in California there’s a big statutory regime by which this happens. In Delaware, it’s mostly just that nonprofit corporations are beholden to Delaware corporate law, and then the attorney general just has a special oversight authority over nonprofits that they wouldn’t have over traditional for-profit corporations.
The reason that this is an issue is the PBC is not subject to that direct oversight by the attorneys general. Only nonprofits are. So the PBC itself, unless it’s doing something illegal that would implicate criminal law in some way, the attorneys general wouldn’t have anything to say about it. It’s up to the corporation how the corporation does its things. So if the PBC did something that was contrary to its public benefit, the folks that would have the ability to enforce that would be the shareholders of the PBC.
And it’s worth noting here how little power that is. The Delaware public benefit corporation statute was passed over a decade ago — I think it was 13 or something years ago — and since that time, there is not a single reported case of a shareholder of a PBC successfully suing to enforce the PBC’s public benefit mission.
So you really are, if you remove the nonprofit from the situation and just focus on the PBC, you’re taking the attorneys general out of their oversight position and replacing them with shareholders who may or may not have any power — and most of the shareholders that would have the ability to sue obviously are going to be more interested in putting their thumb on the side of profit motive than they are going to be on putting their thumb on the side of reining in the profit motive and furtherance of the charitable mission. It really does hamstring things pretty significantly.
Rob Wiblin: OK, so the attorneys general would be pretty significantly cut out of the loop, or basically completely cut out of the loop.
But it is the case that the nonprofit would be a shareholder, so it would in principle be able to bring a case, and say, “You’re not doing enough to pursue the public benefit purpose.” And you’re saying they could try bringing a case about that, but you know, good luck to them. Because in fact, no one else has ever managed to successfully change what a business does on that grounds.
Tyler Whitmer: There’s a couple things to say about that. One is what you just said: that it’s quite difficult. So there’s a thing in Delaware corporate law called the “business judgment rule,” which is basically that there’s a lot of deference given to decisions by boards of directors of Delaware corporations, and that includes Delaware public benefit corporations.
So it’s just really difficult to sue a Delaware corporation, as a shareholder, for not doing what it’s supposed to do. And in the case of the PBC, it’s even more difficult to do it when what it’s not supposed to be doing is the public benefit versus shareholder interests. So that’s one piece of it.
Another piece of this, from purposes of enforcement, is you’re suing for something that happened presumably already — whatever impact that thing had has already happened, and you’re suing — and it takes sometimes years for these cases to get through the court system.
So you’re talking about the difference between, “I hear as a director that the for-profit entity is going to do something that would be bad, and I’m able to step in and keep that from happening sort of immediately,” to, “That bad thing has been done. And I’m filing a complaint in court weeks later, trying to correct the thing.” For folks who are worried about the safety of frontier AI models, some of the harms that people are really concerned about aren’t… You know, once the cat’s out of the bag, the cat’s out of the bag.
So you really need to have that front-end filter control instead of a back-end “try to clean up the mess” control.
Rob Wiblin: Right. So the new arrangement might be that if OpenAI the business deployed a very unsafe AI model that was contrary to the public interest, the nonprofit could sue in Delaware courts — and maybe three years later, the courts would come back and say, “Yes, we would like the business to weigh the public interest somewhat more when making decisions of this type.”
Tyler Whitmer: I mean, there’s a broad spectrum of things that could happen, but in general the power to sue after the fact may be totally useless in this unique situation.
And look, the fact that this is a unique situation, and the technology that’s at issue here is why it was so important: it’s why OpenAI set themselves up the way they did to begin with, right? They understood that this was a unique situation, and they put in a unique corporate structure, a very thoughtful way of trying to put these safeguards around it. So we’re just really concerned with trying to protect those now, because the situation is so unique.
Rob Wiblin: It’s the whole reason OpenAI exists. It’s the reason it was a nonprofit at its foundation. It’s the reason all of these very intense agreements were negotiated in 2019 to try to tie it to the mast. To ignore that and say that actually, this is not the core issue today would be crazy.
I guess it’s perhaps a little bit challenging to bring the attorneys general up to speed in all this, because there’s so much history here and they’re not in the loop about all these intricacies of AI concerns and OpenAI’s vision for the future and so on. Unfortunately, they do kind of have to get up to speed on this.
Tyler Whitmer: Yeah. That was like a big motivation behind the Not for Private Gain letter from back in the middle of April, was really to lay out the history and build up to the current status quo situation in a way that could help the AGs, and other people who haven’t been paying as much attention to this as you or I have: to get them up to speed, and give them the context they need to be able to understand that there’s a reason this exists the way that it does. And that if you just turn this into an off-the-rack, regular corporate governance situation, that could be really catastrophically bad.
By default things go badly [00:29:09]
Rob Wiblin: Yeah, there’s this difficult emotional issue here that I think some people are grappling with: throughout all of this conversation, we’re saying “by default” — we’re saying if they don’t take some major evasive action to avoid these outcomes, if they don’t figure out all these legal agreements and make them really watertight, then this is the way that things will go.
Should we be so pessimistic or so cynical as to think that OpenAI is not going to do these good things that would pursue its charitable mission? Of course, there’s many different actors: there’s people who work at the business, there’s the people in the nonprofit, their legal advisors and on and on. I think many people have the intuition that last week they made this big concession, they made this big change to their plans in response to the criticisms that we’ve been making, so let’s assume some greater level of good faith than perhaps what we had thought before.
But I think on balance, I would say this action has kind of conceded that they were before trying to get away with a change that was illegal and that the attorneys general would not have signed off on. It was basically an enormous swindle of the public interest — of things that the public owns and is owed — in pursuit of the interests of other private groups.
I think if someone was just trying to steal a whole lot of stuff from you, and then after a lot of public pressure, they decide that they’re not going to try to steal that stuff from you anymore, but they are going to propose a different plan — a different, very complex plan that you find quite hard to understand — I think one is entitled to be on guard at least.
Maybe we shouldn’t be sure about what’s going on and the views of the different people, but I think that it remains sensible to have a defensive posture here and not to assume that without any further scrutiny or any further discussion, their better angels will win out and all of these great things will happen by default. I don’t think that we should assume that.
Tyler Whitmer: I think that’s absolutely right. I think being on guard is exactly the right way to think about it. And just keeping the pressure on is really important here.
One thing that I was encouraged by from the May 5 announcement, and some quotes in the press after that by in particular the Delaware AG’s office, is that it really seems like the Attorneys General’s offices really are paying close attention to this. And it suggests that maybe there’s folks on the board at OpenAI who are tuned into this conversation and might be really listening to these messages.
I think that we should be thankful that they are shifting gears a little bit, even as we stay on guard and keep the pressure on, and hope the folks on the board and the AGs do right by the situation — which again, I think will require a lot more doing. It won’t happen by default, given what they’ve announced now. They really need to build in the governance safeguards into the PBC as much as possible, and it’s a little harder to do that with the PBC than it is with the LLC.
So I think that’s an important thing to keep in mind: it is going to be more difficult to approximate the status quo with the PBC than it was with the LLC. But I think keeping the pressure on to make sure that as the restructuring moves forward, the governance safeguards that have been in place — and we can walk through what some of those are and how important they are if you want to — that stuff is not going to stay there unless a lot of work is done to put it there in the new structure.
How to keep the mission in the restructure [00:32:25]
Rob Wiblin: Let’s turn to solutions. You have two asks for the attorneys general and the nonprofit board. So yeah, let’s go through and elaborate on them and explain the impact that they would have.
The first — which you think of as trivially obvious, and absolutely necessary and the absolute bare minimum — is to “Include a primary fiduciary duty to OpenAI’s charitable mission and charter in OpenAI-PBC’s certificate of incorporation.” What does that mean and why does it matter?
Tyler Whitmer: So the way that PBCs work that’s different from the LLC: in the LLC you can just pretty clearly alter fiduciary duties of the manager in the LLC agreement. It’s very malleable. We understand from what they’ve said on their website that they’ve done that, and that’s the status quo situation. In the PBC, it’s going to be a little bit more difficult to put that in place. So what we really are asking for them to do is to use the certificate of incorporation of the PBC to bake in at least the level of primacy of the nonprofit mission as exists in the LLC.
By default, the PBC directors have to balance shareholder interest with the public benefit mission of the PBC. They’re going to have to write something in the certificate that makes it clear that that balance has to give way, at least in critical situations — for example, the safety of a frontier model that’s being rolled out — and write that in.
And right now there’s a reference to the OpenAI Charter. The Charter is a thing that’s on OpenAI’s website and it has been on OpenAI’s website for a long time. But that’s really all it is right now — a thing on OpenAI’s website — unless it’s incorporated into a document with real legal grip.
The LLC agreement — as we understand it, based on what they’ve said on their website — incorporates the charter into the LLC agreement. So the LLC agreement says, “You, as the manager of the LLC, are obligated to put primary the charitable mission.” The purpose of the [LLC] is to further that charitable mission, and it includes the Charter — which expands a little bit on the charitable mission as it exists in the certificate of incorporation of the nonprofit right now.
And in the PBC, they have an opportunity to do that with the PBC’s certificate of incorporation and the articles of incorporation in a public place. In some sense, that would be an upgrade over the status quo, because right now they could probably just change the Charter, right? And if they were to bake the Charter in to the articles of incorporation of the PBC, that would actually give it a little bit more legal teeth than it has now. So that’s something we think should happen.
Rob Wiblin: Right. There’s a couple of things here. If you put a primary fiduciary duty to OpenAI’s charitable mission and the Charter in the public benefit corporation’s certificate of incorporation, does that mean that it no longer has to balance those values against profit, or does it still have to do that balancing act because it’s a public benefit corporation, and that’s like their whole nature?
Tyler Whitmer: This is one of the issues where there’s a lot of uncertainty, because there’s not a lot of case law on what you can do with PBC articles of incorporation. There’s a lot of flexibility under Delaware law for what you can put into articles of incorporation. The thing you can’t do is do something that’s contrary to law. So there would have to be some balancing, because in the statutes in Delaware, that’s basically the description of what a public benefit corporation is and does, right? It has some balancing.
And look, I’m not a corporate lawyer. I’m not the deal guy. There’s probably a lot of moving parts here, and a lot of different ways that you could try to instantiate the sort of principles that we’re talking about in terms of keeping the primacy of the nonprofit mission and its enforceability paramount here. But I think writing it out so that there’s balancing, but there’s a really heavy thumb on the scale of the public benefit mission — and especially in places where that’s particularly important, given the unique nature of the technology here — is one way I could imagine doing that.
But for what it’s worth, anything you do here could be challenged in court by the shareholders. So my understanding is that all of this is happening in negotiation with investors and with the AGs and all of that right now.
But it is not as easy to maintain the status quo with the PBC structure because of the thing you identified: because by statute, there still has to be some balancing involved. And the question is, how much can you put a thumb on the scale? How much can you make it more like the LLC situation without running afoul of that statute?
What will become of OpenAI’s Charter? [00:37:11]
Rob Wiblin: Right. Let’s talk about the Charter a little bit.
Correction: Hey everyone, Rob here. I’m just recording this over an error in the original episode. In the episode we originally released I say that my understanding is that the OpenAI Charter was created by the business and might be easy for it to change.
But my understanding was wrong! I always hate getting things wrong, but in this case it happily gives us a great chance to clarify the reality.
The situation is actually that the nonprofit established the OpenAI Charter in 2018, and it’s written into the operating agreement of the business, saying that it has to prioritise the nonprofit mission and the Charter.
The Charter then effectively operates as an elaboration of the mission inasmuch as the mission statement, being very short, is vague or underspecified. The Charter helped to address worries about mission drift at OpenAI by more clearly tying it to the mast on particular issues.
It’s also not something the business can change; only the nonprofit can do so.
The Charter was a big deal and there was a wide effort to get it written before OpenAI adopted its new structure. It’s just 300 words, so extremely readable if you want to Google and check it out. I think it’s also great, reflecting the idealism and focus on benefitting everyone over making profit that was more dominant in 2018.
Its four headings are:
- Broadly distributed benefits
- Long-term safety
- Technical leadership
- Cooperative orientation
The famous “stop and assist” commitment appears under “Long-term safety,” and reads:
We are concerned about late-stage AGI development becoming a competitive race without time for adequate safety precautions. Therefore, if a value-aligned, safety-conscious project comes close to building AGI before we do, we commit to stop competing with and start assisting this project. We will work out specifics in case-by-case agreements, but a typical triggering condition might be “a better-than-even chance of success in the next two years.”
Another interesting line is:
Our primary fiduciary duty is to humanity. We anticipate needing to marshal substantial resources to fulfill our mission, but will always diligently act to minimize conflicts of interest among our employees and stakeholders that could compromise broad benefit.
Naively, I think that would require anyone who stands to benefit from the current restructure to make sure they’re not involved in development or in deciding whether to proceed. Trickily, that could include almost all staff at the OpenAI business, seeing as almost all hold substantial stock in the company, which would become worth more if the company could sideline the nonprofit. And given the mention of employees and stakeholders, that probably includes investors. It really should be the nine nonprofit board members who don’t work at OpenAI, and their independent employees whose exclusive duty is to the nonprofit, who do this all themselves.
Anyway, for this whole Charter to not evaporate in a puff of corporate restructuring it would have to be written into the new founding documents of any new public benefit corporation. I think that’s definitely something the attorneys general and nonprofit board should insist happens, because anything else is itself inconsistent with pursuing the mission and Charter.
Back to the interview.
Rob Wiblin: Another interesting improvement that occurs to me is that they could consider additional things to their Charter — stuff that they might not have thought of in 2019, but now seem more relevant.
For example, they talk quite a lot about the importance of OpenAI pursuing its work in order to ensure that AGI is developed in the United States first rather than in China. Maybe they should have some commitment to, if it’s necessary, cooperate with some other company or with the government or some other group in order to ensure that is the case — even if that conflicts potentially with their commercial interests, that is something that they will do, because this is an important part of how they’re hoping to benefit the public.
I think that’s something that would not have been so salient in 2019, but is very salient now, and is the kind of thing that if they were reassessing the Charter and considering stuff to add, then that would be quite natural.
Tyler Whitmer: A couple of things in response to that. First of all, there is in the current Charter the “stop and assist commitment,” is the sort of shorthand we use for it — which is something like what you just described. In the Charter they say, “if a value-aligned, safety-conscious project comes close to building AGI before we do, we commit to stop competing with and start assisting this project” — making sure that the AGI that gets developed, whoever develops it, is safe and benefits all of humanity.
And that’s something that by default — if they move to a for-profit model and the PBC takes over from the LLC and the control that we’ve talked about goes away — it’s not clear that would still be there. So I think writing that commitment into the articles of incorporation would be a great way of making that a durable commitment that they really have written into their articles.
And by the way, that makes it so that everyone who’s investing in the company knows about it, right? It’s out there in the public, so no one’s going to say, “We invested, and we thought you weren’t going to stop and assist one day.” You know, it’s there. It allows you to say, “Anyone who’s buying into this situation knows that this is what we’re about.”
They’ve kind of done that already, if you’ve seen the purple box on the website where people who invest have to acknowledge that their investment should be made in the spirit of a donation, et cetera. They’ve been trying to be really upfront about this, and the PBC’s articles could give them an opportunity to do that in an even more durable way.
Ways to make things better, and not just avoid them getting worse [00:42:38]
Tyler Whitmer: Part of what you said made me think of an important bit here, which is that this is an opportunity. We talked about the cynical view of this thing, and staying on guard and keeping the pressure on. I think that’s really important — and for what it’s worth, that’s definitely going to be my mindset, and I think the mindset of a lot of the advocates that are working on this.
But this is also an opportunity for them to do better, right? You could have a rosier view of this, and stay on guard, but you could do better. Like you said, you could find things that aren’t in the existing Charter that would be good for the world, and then write them into the articles now and make it even more protective.
One of the things that we’ve been talking about in a lot of places is just the list of things that we put in the end of the Not for Private Gain letter, about just really making sure the nonprofit board is appropriately empowered and has the transparency that they need to make sure that the nonprofit mission is protected.
You could write into the articles really specific rules that make sure that the board has the right knowledge and understanding of the technology and the ecosystem to do their jobs right, has the kind of transparency and information flow that they need to actually exercise the control that we think they need to have in order to maintain the primacy of the nonprofit mission and keep it enforceable. You could write those things in, and make them even more durable and more enforceable.
We talk a lot here about floors — like we think the very bare minimum that they need to do is XYZ — but this is a real opportunity for them to do way more than the bare minimum, and make the situation even better than the status quo. And I would encourage them and hope that they would do that.
Rob Wiblin: Yeah. Both for the board — while their head is in this space, and they’re considering all of these different things, and there’s potential to rewrite some of the agreements and improve them and pursue the nonprofit’s mission even more — and also the attorneys general’s offices: they’ve got their head in this game right now, they’re going to be getting their head across the issues.
They can say, “Things have gotten a little bit unbalanced since 2019 through today: the business has grown so enormously, and the resourcing of the nonprofit that is meant to be scrutinising it and ensuring that it pursues this charitable purpose has not really kept pace with that. So we would like them to have the personnel, the financing, the access to information, the independent ability to get advice without worrying that this is going to wind people up and cause problems for them,” all of that kind of thing. It would be very natural for the attorneys general to say these are the new arrangements that are necessary for this nonprofit to actually pursue its goals, so let’s put them in place.
Tyler Whitmer: Yeah, I think that’s right. And stepping back a little bit here, the nonprofit has a fiduciary duty to its mission right now: the nonprofit board members owe a special fiduciary duty to humanity as the beneficiaries of OpenAI’s charitable purpose to ensure that AGI is safe and benefits all of humanity. And that mission applies to what they do right now in approving whatever restructuring goes through. So the AGs have the power to enforce that fiduciary duty on behalf of the beneficiaries of the charitable purpose right now.
It’s almost like a meta point, where the very fact of what they’re doing right now is subject to that fiduciary duty, and they have to be doing what they do now in furtherance of the mission. Which is why I think, as we’ve been saying, there’s a floor here, but maybe the best way to further the charitable mission through this restructuring is to do more than the floor. And I think they should do that.
Rob Wiblin: Yeah. I mean, from the board’s perspective, this might for many of them be their biggest opportunity ever to have an impact and ensure that the future goes well. In taking on this role, it’s a challenging role — you know, overseeing OpenAI is not a trivial task — but it’s one that they signed on for when they became board members of this very important nonprofit.
And I guess also inasmuch as they have a vision of themselves as great people who want to make a positive difference and pursue the duties that they’ve signed on to, I really do hope that they put in the time and are willing to stand up to the pressure — because I’m sure there is a whole lot of pressure coming from the for-profit investors here, who would love to basically redistribute power and money that the nonprofit currently has and take it for themselves.
The nonprofit board should not give up its power and its resources so lightly: that power belongs to the nonprofit board members on behalf of all of humanity, and they should be good stewards of it. They should not allow themselves to be cut out of the decision-making, or screwed over by people who stand to benefit from doing exactly that.
Tyler Whitmer: I think that’s exactly right. They have a really hard job to begin with. They signed up for it. Nobody made them do this. And I think it’s a hard job day to day to manage this nonprofit, it really is — not many nonprofits are sitting on top of a $300 billion business empire. That’s a really hard thing to do. And this restructuring is potentially going to really change how things work over there.
So within a hard job that’s uniquely impactful, I think this is a uniquely critical moment, and one where I trust and hope that they’re all keeping in mind that they owe a duty to humanity here, and taking that very seriously as they look at this potential restructuring.
Rob Wiblin: Yeah. One thing that is heartening is that when they refreshed the board a year and a half ago, they really appointed some top-notch people, I think. If anyone could take this… These people have experience across so many different areas, and some of them have done things that are as challenging, maybe more challenging, than this before. So fingers crossed.
How the AGs can avoid being disempowered [00:48:35]
Rob Wiblin: So the second request you had for the AGs and the board, which you think is definitely necessary, is to do one of the following two things.
First option is: “Give the OpenAI-nonprofit board powers over OpenAI-PBC sufficient to ensure the OpenAI-PBC board upholds its primary fiduciary duty to the charitable mission and Charter.” That’s option one, a little bit of a mouthful.
Number two is: “Alternatively, include an enforcement regime in OpenAI-PBC’s certificate of incorporation.”
Could you explain how the first one would work first?
Tyler Whitmer: Yeah. So the first one, there’s probably a number of different ways that you could get there, but the idea is to keep the link between the nonprofit and the PBC strong enough that the attorney general oversight that they have over the nonprofit then sort of filters through to the PBC, so there’s no disconnect.
So the attorneys general, by regulating what the nonprofit does through its board, would also be regulating what the commercial entity is doing at the same time. It’s about creating the connection between the nonprofit and the PBC in such a way that the link doesn’t get broken, so that the attorneys general at least have the power to oversee things at the level that really matters: the commercial entity going out and doing things in the world. I think that’s ideal.
Like I said, the reason we’re having this discussion at all is because the attorneys general have that authority, and that’s what has created the ability to put pressure on the company to do the right thing here. The AGs are an office that is beholden to the public; the public interest is their whole remit. And they have the power of the state behind them, which is obviously very important. So I think that’s ideal.
So that’s what the first option would be: just to arrange the relationship between the nonprofit and the PBC such that the AGs have the same control that they have now with the nonprofit in charge of the LLC.
Rob Wiblin: OK, and the second option was to include an enforcement regime in the public benefit corporation certificate of incorporation. What would that look like?
Tyler Whitmer: That could look like a lot of different things. Really what this is saying is: if things get restructured in a way that cuts out the AGs or limits their power in some important way… We were talking about the sort of gap that gets created when you have the PBC with a separate board and separate fiduciary duties, instead of the LLC that’s just a straight control situation under the nonprofit. If that gap is such that it really does put some limits on the AGs’ ability to regulate things here, then I think there has to be something that steps in to fill that gap.
So you could, in the certificate of incorporation of the PBC, write in an enforcement regime that would be there to hold OpenAI accountable to the charitable mission.
What that could look like, just for example — and this is a hypothetical, and I think it could look like a bunch of different things — but I think it needs to be someone who’s outside of OpenAI. So you have a true third party, disinterested in the sense of not a part of the corporate structure of the organisation. And the AGs are obviously that already, so you would need to approximate that third-partyness.
And then you have to have someone who’s incentivised to uphold the public interest. So someone who doesn’t have any conflicts, financial or otherwise, with OpenAI, who already has probably a remit to protect the public and to have the public interest in mind.
So there’s third-partyness, the incentive is there, and then I think they have to be well resourced — enough to take on one of the biggest corporations in the world. So the AGs kind of have that by virtue of their state power. If you’re going to put this into a non-state-power, third-party enforcement mechanism, you’ve got to make sure it’s got real teeth, right?
But the 2(b) request in the letter is meant to say, if something happens here where the AGs get disempowered — and we think that would be bad — there has to be something that steps in to fill that gap. And that has to be an outside, incentivised, and sufficiently resourced enforcement mechanism.
Rob Wiblin: So on this path, the attorneys general basically opt out of having further enforcement power, and they delegate or pass on that to some other body. I could see them being pretty reluctant to do that. Like, who else would they think has the ability to stand up to these incredibly powerful forces — already powerful, and in future, probably even more powerful — who will be arrayed against them? I would feel nervous unless it was a pretty potent entity that they were passing on this responsibility to.
Tyler Whitmer: I share your concerns. It is hard to imagine setting something up that would have the same power as the AGs. There are situations in the world where attorneys general rely on private enforcement, so it’s not as if this would be the first time something like this has ever happened, but obviously having the state behind you is really important.
There’s probably ways that you could structure this so that there’s like a two-headed monster, where the AGs still have some role, but there’s another party that’s given some power. I think you can do this in a lot of different ways.
Without getting into the details on it, I think the easiest way to do it is obviously just to maintain the attorneys general’s current level of oversight. That seems to be the easiest way to do it. Then like I said, anything beyond that, you need an enforcement mechanism that’s resourced and incentivised to ensure that the nonprofit mission takes precedence over profit motives and shareholder interests and anything else.
Retaining the power to fire the CEO [00:54:49]
Rob Wiblin: Another thing that’s on my mind is that currently the nonprofit board can fire and replace the CEO of OpenAI. Do we need to write in somewhere that they still have that ability to fire the senior staff who are meant to be pursuing their goal, if they’re not doing so?
Tyler Whitmer: Yeah, I think so. By default, as we keep saying, if you have the PBC under the nonprofit, the PBC’s board would necessarily have the ability to hire and fire the executive and the CEO. But in order to preserve that kind of control for the nonprofit, you would have to write it into the PBC’s structural documents or something. That would not happen on its own.
Rob Wiblin: Is it pretty straightforward for the attorneys general to basically insist on all of these specific requirements as the restructure goes through? Is that just something that they have the right to do, to say, “We’re not going to sign off on this; you can’t do any of this stuff unless you do ABC”?
Tyler Whitmer: It’s an interesting situation, because I don’t think that there is a requirement that either of the AGs sign off in advance on anything that OpenAI is doing.
Obviously, the AGs have enforcement authority here. So if OpenAI were to do something they didn’t like, they would be sued by the AGs and have the power of the state directed against them — which nobody wants, which is why they’re speaking to them now about the situation, so that they’re not doing something that’s just buying themselves a lawsuit from an AG’s office. But I don’t think it’s the case that the AGs are required to sign off in advance necessarily on this transaction.
Given the current situation, where we know they’re talking to the AGs, de facto that’s probably the case: no one’s going to do anything that the AGs object to. So in that sense, the AGs probably do have the ability to direct things to some degree.
Rob Wiblin: I see. It’s not the case that in law they have to approve it ahead of time. But because they have this enforcement ability to object to things after the fact and have them reversed, and we think the AGs know what they’re doing, would have a very strong case, and would almost certainly get their way should that situation arise, effectively they do have the ability to insist on things ahead of time. And OpenAI is not going to do things where the attorneys general say, “If you did this, we will bring a case and basically try to undo it.”
Tyler Whitmer: Yeah, they certainly have a lot of power in that way. And Kathy Jennings, the Delaware attorney general, filed an amicus brief in litigation where she was like, “If this needs to be enjoined, we can enjoin it” — “enjoin” just means get a legal order to stop the thing from happening. So certainly they’ve not been shy about flexing that muscle and making it clear that they consider themselves to have the ability to stop this thing if they don’t think it’s going the right way.
Will the current board get a financial stake in OpenAI? [00:57:40]
Rob Wiblin: One of the requirements for the nonprofit is that a majority of its board members cannot directly own equity, cannot directly have a financial interest in OpenAI. Do we know whether any of them are on track to receive equity in the business after this restructuring takes place? Needless to say, if they were on track to receive a huge payday should these restructuring plans go ahead, it feels like that could influence their decision. That’s not really how charities are meant to work.
Tyler Whitmer: To say the least. To answer your first question of do we know, I think the answer is no. Meaning that I’ve heard rumours and there’s been conflicting reporting. At some point, there was a report that Altman was going to get 7% or something like that.
Rob Wiblin: Wow. That’d be a lot.
Tyler Whitmer: Yeah, exactly. There’s been other reporting. That reporting happened at a time where the valuation was like half or something of what it is now. But he said since then that wasn’t true. So there’s rumour mill stuff. There’s reporting back and forth. So I think the answer is we don’t know if any of the directors are going to receive equity as part of the restructuring.
Certainly if they were going to receive equity as part of the restructuring, they should be recusing themselves from any of this decision making, or honestly probably even discussion, because that’s such a direct, obvious conflict that it’s hard to imagine that they wouldn’t be held out of the decision making if that was part of the deal.
Could the AGs insist the current nonprofit agreement be made public? [00:59:15]
Rob Wiblin: Can the attorneys general basically insist on the publication of the agreement between the nonprofit and the LLC that details what the public is owed?
Tyler Whitmer: It’s a good question. I don’t think so. The LLC agreement is not by default a public document. And nonprofits, although they have a public remit in some meaningful sense, are still allowed to have confidential information; no nonprofit has to open every book it has to everyone in the public. So I don’t think there’s some rule by which they would have to make that public.
For what it’s worth, given their duty to humanity, their mission, and the unique situation here, I think there’s a really strong ethical or moral argument that they ought to do that, and that it could be really important.
But I don’t think that the AGs would necessarily have the power to force that. For what it’s worth, I think that agreement is very likely to become part of the evidentiary record. If Musk’s lawsuit goes forward and that trial actually happens on the schedule that it’s happening now, it’s really difficult for me to imagine that that document isn’t going to be part of the record.
If there’s a trial and that document is part of the record, which I think is very likely, it seems to me that it would become public at that point, because there’s a really strong thumb on the scale in favour of material that’s made part of a trial being public. And while the organisation could treat the agreement as confidential or non-public, it’s not a trade secret; it’s not the kind of thing you would keep protected from disclosure, even in a trial.
How OpenAI is valued should be transparent and scrutinised [01:01:00]
Rob Wiblin: Yeah, I see. One other thing that feels very natural and desirable to have more transparency on is the valuation of the different things the nonprofit might be conceding. So they’re on track to remove the profit caps that currently mean that, if OpenAI becomes incredibly profitable, the nonprofit receives almost all of those super-profits.
They’re on track to get rid of those caps, which means that any such super-profits would be more evenly distributed between the nonprofit and other for-profit investors — which I’m not actually necessarily against, but the nonprofit is giving something up there, and it should receive some cash or something else in exchange for that, because otherwise it just doesn’t make any sense in pursuit of its charitable purpose.
How are they going to value that? It’s incredibly hard to value, and there’d be lots of potential for people to undervalue it if they were so inclined and stood to benefit from doing so. It feels like you really would want public scrutiny on how those valuations are being done, so that people can see whether they are in any sense reasonable, and whether the nonprofit that represents their interests, the one basically holding assets or future potential assets on the public’s behalf, is getting ripped off or not.
It sounds like it might be challenging for the attorneys general to insist on them sharing any such valuations?
Tyler Whitmer: Well, I’m not so sure. I think actually, the AGs have quite a bit of power there maybe, because the protection of charitable assets is definitely in the AG’s remit. And in California, there’s a lot of statutory power around that as well, in addition to just the basic power of the AGs here. So it is going to be very difficult to approximate the cap structure if everybody’s just holding stock in a PBC.
To your point, I think you’ve kind of said this, just to make it a little bit more explicit: what would have to happen is some kind of valuation now that puts a present value on the potential future value of the profits after the cap. And I’ve never seen an economic analysis of that by the company.
I think that’s extremely important, because being able to make sure that this isn’t a giveaway of a tonne of value from the nonprofit — which has a mission, again, to use that value for the benefit of humanity — to a small group of investors certainly is extremely important. It’s not something we focus on in this letter, but we’ll certainly focus on it going forward. I do think that the attorneys general have a tonne of power here. I think they have the ability to ensure that the nonprofit gets a fair price here.
Rob Wiblin: The other thing that they’re potentially on track to give up is that currently, should OpenAI develop AGI, in its own assessment, Microsoft loses its control of, or access to, the model, and the nonprofit would control and operate that AGI for the benefit of humanity. It feels important that they retain that right; and if they are going to concede it, that is quite a significant concession that requires some compensation.
I’m glad it’s not my job to put a dollar figure on the value of that, but I would really like to see any such valuation that is put forward scrutinised by all actors who can speak to whether it is reasonable or not.
Tyler Whitmer: Yeah. I don’t want to totally give up on the ability to keep some version, at least in the economic reality of it, of the nonprofit maintaining ownership or whatever you want to call it of AGI after it’s developed.
But it’s not addressed in their original restructuring plans. In the status quo, it’s very clearly the case that the nonprofit would have control and ownership over AGI if and when it’s developed. In the original restructuring plan, it’s not clear what happens with that, and it’s not clear what happens to that in their new proposal as of last Monday.
So on figuring out how to ensure that AGI really does benefit all of humanity: it’s pretty easy to say you can do that when it’s sitting at a nonprofit whose charitable purpose is exactly that. Under anything else, they’re going to have to figure out how to buttress the structure so that it’s still the case.
And like you said, putting a price on that is really hard. Avoiding trying to put a price tag on it now, and instead actually recreating the state of affairs as it is now, seems like the better way to do it from my perspective. But by default that would not happen, and that’s the important bit here: something has to be done from here to make sure it happens. It’s not going to happen on its own.
Investors aren’t bad people, but they can’t be trusted either [01:06:05]
Rob Wiblin: Yeah. One other quick comment I want to make before we finish up is that I guess we can sound fairly hostile to the for-profit investors in OpenAI in the way that we speak, but that is not the case. I am very happy for them to make tonnes of money out of this deal, inasmuch as they put their capital at risk and they’re making difficult business investments.
It’s just that the whole thing was set up with charitable purpose. Inasmuch as they’re making profit, that has to be in pursuit of the purpose of the organisation as it was set up. And indeed it probably is to a large extent.
Inasmuch as OpenAI actually is pursuing safe AI that will benefit all of humanity, and that actual purpose is preserved, then it is in their interest to take a lot of investment from for-profit investors and to pay them out whatever rate of return is required to induce them to do that. Totally reasonable. I’m not against it. I invest in the stock market and I make returns, and I invest, indeed, in some companies that have a stake in OpenAI, I guess.
That is all well and good. It just can’t come at the cost of the reason that OpenAI was established — and indeed, the public interest, which I do also have a stake in.
Tyler Whitmer: Yeah, I’m also not against investors making money. I think that’s good. I think absolutely what needs to be made clear here, and continue to be clear as we move forward, is that investment in OpenAI is a means to an end. It’s not the end in itself. So for a traditional for-profit, in some sense, making money for shareholders is kind of the whole game. Like, that’s the deal. And in a traditional for-profit, the board has fiduciary duties to increase shareholder value, like that’s their whole remit.
And this is not that. To the extent that OpenAI is taking investment here, it’s a means to an end: it’s instrumental in support of the nonprofit purpose, and that’s what needs to be protected here.
So it’s not that no one can make a profit. It’s not that if money touches this, it’s somehow dirtied it or something like that. No, they probably need a lot of money to do what they’re doing — but it should be instrumental in service of the mission, and not become the end in itself. And that’s the thing that we’re most concerned about.
Which is why I keep saying words like, “a legally enforceable primacy of the nonprofit mission over all other interests” — including shareholder interests: because I think there’s a way that this moves forward where you’re on a slippery slope to the profits becoming the end instead of the means. And we want to make sure that they remain the means and not the end.
Rob Wiblin: To really stylise things: you have the nonprofit board, who are by and large acting in good faith, and who I think are by and large trying to defend the public interest against an enormous set of forces potentially trying to persuade them to do something else, or to trick them into doing something else.
And then you have all of these for-profit entities. In some sense you can’t condemn them, because their goal is to make money: they’re following their fiduciary duty to their shareholders, which is to earn as high a return as possible. Maybe there are questions about whether we should set organisations up to be focused on the bottom line in that way, but that is to some extent how we’ve organised our corporate sector.
So these businesses are acting according to their natural instincts. But that means that the nonprofit board has to stand up against them.
Tyler Whitmer: Right.
Rob Wiblin: They are going to take everything that they can. They will suck the blood out of this organisation in pursuit of their fiduciary duties, which are owed to a very specific group of people who own their companies, and that is not the same as the interests of humanity. The nonprofit board members are the only 10 people standing between us and that.
Tyler Whitmer: Yeah, that’s it. Right now you have the board of directors of OpenAI of the nonprofit as the stewards of the nonprofit mission. And you’re right that there’s going to be, there already is, a tonne of pressure from investors and from partners to do what their remit is — which is to make money for their organisations, or for themselves. And yeah, the board of directors is what’s standing in between that profit motive and the nonprofit mission.
Again, that’s super important. And it’s super important that the attorneys general are there standing behind them, making sure that they’re doing their jobs right, as additional incentive — because I agree with you: there’s no reason to think the board isn’t trying to do the right thing here, but it always helps when you’ve got backup, even if that backup is a way of explaining yourself to folks who may not get what you’re doing. But I think that’s super important.
You know, the reason we wrote the Not for Private Gain letter and the reason we wrote this followup letter is because I think there are memes out there in the press that this was a big shift on May 5 when this announcement came through. And as we’ve been discussing, there are ways this could turn out well, but I don’t think that viewing it as a big shift is really the right way to be thinking about this situation right now.
I think keeping your guard up and keeping the pressure on is the right way to respond to this, rather than think, “Oh good, we won. The nonprofit’s in control. We can all sleep easy at night.” There’s still a lot of work to be done, and I think that work needs to be done by the board, and it needs to be done by the AGs, and it needs to be done by the public advocates that are out there trying to make sure that the pressure is on. And that everyone who’s a stakeholder, which is really everybody — like, this is a nonprofit that’s designed —
Rob Wiblin: Wait, I’m humanity!
Tyler Whitmer: Right? Yeah, exactly. I mean, the stakeholders should have as much transparency into this and have as much influence in this as they can. So I think that’s super important, and that’s why we’re doing the work that we’re doing.
Rob Wiblin: Brilliant. My guest today has been Tyler Whitmer. Thanks so much for coming on The 80,000 Hours Podcast, Tyler.
Tyler Whitmer: Yeah, thanks for having me, Rob. Really appreciate it.
Related episodes