#209 – Rose Chan Loui on OpenAI’s gambit to ditch its nonprofit

One OpenAI critic describes it as “the theft of at least the millennium and quite possibly all of human history.” Are they right?

Back in 2015 OpenAI was but a humble nonprofit. In 2019 that nonprofit started a for-profit, OpenAI LLC, but made sure to retain ownership and control. But that for-profit, having become a tech giant with vast staffing and investment, has grown tired of its shackles and wants to change the deal.

Facing off against it stand eight out-gunned and out-numbered part-time volunteers. Can they hope to defend the nonprofit’s interests against the overwhelming profit motives arrayed against them?

That’s the question host Rob Wiblin puts to nonprofit legal expert Rose Chan Loui of UCLA, who concludes that with a “heroic effort” and a little help from some friendly state attorneys general, they might just stand a chance.

As Rose lays out, on paper OpenAI is controlled by a nonprofit board that:

  • Can fire the CEO.
  • Would receive all profits past the point where investors have earned a 100x return on their investment (see the illustrative sketch after this list).
  • Is legally bound to do whatever it can to pursue its charitable purpose: “to build artificial general intelligence that benefits humanity.”
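
To make that capped-return mechanic concrete, here is a minimal sketch of how such a profit waterfall works in principle. It is purely illustrative: the function and the numbers are hypothetical, and OpenAI’s actual arrangements reportedly involve multiple investor tranches with differing caps. But it captures the basic idea that investor returns stop at a fixed multiple, and everything above that flows to the nonprofit.

```python
def split_returns(investment: float, total_returns: float,
                  cap_multiple: float = 100.0) -> tuple[float, float]:
    """Illustrative capped-return waterfall (a simplification of
    OpenAI's real structure, which has multiple tranches and caps).
    Investors keep returns up to cap_multiple x their investment;
    any surplus flows to the nonprofit."""
    investor_share = min(total_returns, cap_multiple * investment)
    nonprofit_share = total_returns - investor_share
    return investor_share, nonprofit_share

# Hypothetical example: a $1B investment under a 100x cap.
# Until cumulative returns reach $100B, investors receive everything;
# beyond that, every additional dollar goes to the nonprofit.
investors, nonprofit = split_returns(1e9, 250e9)
print(f"Investors: ${investors / 1e9:.0f}B, nonprofit: ${nonprofit / 1e9:.0f}B")
# -> Investors: $100B, nonprofit: $150B
```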

But that control is a problem for OpenAI the for-profit and its CEO Sam Altman — all the more so after the board concluded back in November 2023 that it couldn’t trust Altman and attempted to fire him (although those board members were ultimately ousted themselves after failing to adequately explain their rationale).

Nonprofit control makes it harder to attract investors, who don’t want a board stepping in just because they think what the company is doing is bad for humanity. And OpenAI the business is thirsty for as many investors as possible, because it wants to beat competitors and train the first truly general AI — able to do every job humans currently do — which is expected to cost hundreds of billions of dollars.

So, Rose explains, they plan to buy the nonprofit out. In exchange for giving up its windfall profits and the ability to fire the CEO or direct the company’s actions, the nonprofit will become a minority shareholder with reduced voting rights, and presumably transform into a normal grantmaking foundation instead.

Is this a massive bait-and-switch? A case of the tail not only wagging the dog, but grabbing a scalpel and neutering it?

OpenAI repeatedly committed to California, Delaware, the US federal government, founding staff, and the general public that its resources would be used for its charitable mission and it could be trusted because of nonprofit control. Meanwhile, the divergence in interests couldn’t be more stark: every dollar the for-profit keeps from its nonprofit parent is another dollar it could invest in AGI and ultimately return to investors and staff.

To top it off, the OpenAI business has an investment bank estimating how much compensation it thinks it should pay the nonprofit — while the nonprofit, to our knowledge, isn’t getting its own independent valuation.

But as Rose lays out, this nonprofit-to-for-profit switch is not without precedent, and creating a new $40 billion grantmaking foundation could be the nonprofit’s best available path.

In terms of pursuing its charitable purpose, true control of the for-profit might indeed be “priceless” and not something that it could be compensated for. But after failing to remove Sam Altman last November, the nonprofit has arguably lost practical control of its for-profit child, and negotiating for as many resources as possible — then making a lot of grants to further AI safety — could be its best fall-back option to pursue its mission of benefiting humanity.

And with the California and Delaware attorneys general saying they want to be convinced the transaction is fair and the nonprofit isn’t being ripped off, the board might just get the backup it needs to effectively stand up for itself.

In today’s energetic conversation, Rose and host Rob Wiblin discuss:

  • Why it’s essential the nonprofit gets cash and not just equity in any settlement.
  • How the nonprofit board can best play its cards.
  • How any of this can be regarded as an “arm’s-length transaction” as required by law.
  • Whether it’s truly in the nonprofit’s interest to sell control of OpenAI.
  • How to value the nonprofit’s control of OpenAI and its share of profits.
  • Who could challenge the outcome in court.
  • Cases where this has happened before.
  • The weird rule that lets the board cut off Microsoft’s access to OpenAI’s IP.
  • And plenty more.

Producer: Keiran Harris
Audio engineering: Ben Cordell, Milo McGuire, Simon Monsour, and Dominic Armstrong
Video editing: Simon Monsour
Transcriptions: Katy Moore

Highlights

How OpenAI carefully chose a complex nonprofit structure

Rose Chan Loui: It was very carefully structured. And in the beginning, in 2015, it was pretty straightforward: it was founded as a scientific research organisation. The specific purpose was to provide funding for research, development, and distribution of technology related to AI. Then they also made the promise that the resulting technology will benefit the public, and the corporation will seek to open source technology for the public benefit when applicable.

I just want to emphasise here that that is the legal purpose that is in the certificate of incorporation with the State of Delaware. Then, in its registration with the California Attorney General, it said that its goal is to engage in research activities that advance digital intelligence in the way that is most likely to benefit humanity as a whole, unconstrained by a need to generate financial return. So just a little distinction here: that sounds more aspirational.

Now, by 2019, they had been able to raise $130 million of their initial $1 billion goal. They decided then that charitable donations were not going to be enough to achieve their charitable purpose of providing funding for research and development and distribution of technology related to AI.

So they set up this structure that would accommodate and attract investors. The first step was to form an LP, a limited partnership, under OpenAI, which would be owned by the nonprofit as well as by employees and some early investors. That LP would be governed by the nonprofit and operated in accordance with the nonprofit’s charitable purposes.

The LP then created a subsidiary, OpenAI LLC, which you could also call the operating company. It’s at this level that Microsoft invested. And interestingly, OpenAI on their website used to call Microsoft a “minority owner.” But they clarified on their website in December 2023 that no, Microsoft only has a profits interest in OpenAI LLC. We think that was in response to inquiries by various antitrust authorities.

So again, the operating agreement of the LLC, like you said before, broadcast that the LLC might never make a profit and is under no obligation to do so. One of my favourite quotes is, “It would be wise to view an investment in LLC in the spirit of a donation.” And, just like with previous investors and the employees, there was a cap on how much they could get on their investment. For Microsoft, we know it’s 100 times its investment. They also said that that cap could be lowered with later investors.

And now, both the LP and the LLC are controlled by OpenAI’s general partner. That is what we call a “disregarded entity” — basically, it’s just how the nonprofit controls the for-profit entities below it. We think that there are other subsidiaries through which the newer investors have participated, but we don’t have full transparency into that. They’ve formed a lot of other entities in Delaware, and some of them are registered in California.

The nonprofit board is out-resourced and in a tough spot

Rob Wiblin: So I’m going to have a fair number of sharp questions in this interview. But I don’t think there was anything wrong with OpenAI seeking for-profit investment in order to stay in the game and stay relevant. And I think the idea that they had — that they’ll have the investment, it’s kind of capped, and then it will spill over; and at least in theory, the nonprofit foundation should be able to control, to direct the business if they think it’s gone off the rails and is no longer pursuing its mission of benefiting humanity — in theory, this kind of all makes sense.

I don’t think that there was anything untoward, although you might think perhaps there was something a little bit naive about thinking that this would function as originally intended. But if they had just remained a nonprofit and only accepted charitable donations, I think it’s fair to say that they would have become irrelevant, because they just wouldn’t have been able to keep up with the costs involved in training AGI or training the frontier models.

Rose Chan Loui: I think that’s absolutely right. I mean, we can understand why they did what they did. And they’re not at all novel in setting up a for-profit subsidiary; nonprofits can do that.

I think what became challenging here was, first, that most of the nonprofit/for-profit relationships that I’ve seen anyway are wholly owned or mostly wholly owned. You know, you could operate a spaghetti factory: you pay tax on it because it has nothing to do with your charitable purpose, but the net income goes straight up to the parent to use for a nonprofit purpose.

But I think what’s so different here is that the amount of external third-party investment is so huge. The sums are so huge and completely engulf the nonprofit.

Rob Wiblin: Yeah. Which is not really resourced: it’s a bunch of volunteer, part-time board members. I don’t know whether it really has any meaningful staff to represent its own interests seriously, independent of the business. And that’s a major weakness, I guess, with the structure that was set up.

Rose Chan Loui: Right, right. I mean, from their 2022 fiscal year they only show about $19 million in assets at the nonprofit level. And here you have a subsidiary whose valuation just keeps going up; the latest number is $156 billion. So it becomes very hard. It definitely looks like the tail is wagging the dog.

Is control of OpenAI “priceless” to the nonprofit in pursuit of its mission?

Rob Wiblin: Just before we get to the valuation, I wanted to take a second to consider, is there any amount of compensation — any equity stake or any amount of money — that can really make the nonprofit foundation truly whole for giving up its strategic control of the organisation OpenAI in terms of pursuing its mission?

I think that the case against that is that OpenAI is one of the groups most likely to develop AGI, and this foundation is set up to make it go well. So by having a controlling stake in OpenAI, the nonprofit board gets maybe a 20% chance or something of staffing up; insisting on being in the room where all the decisions are being made, the room where it happens; and literally directing the major decisions about how this transition to an AGI-dominated world takes place — or at least, operating within the worldview of OpenAI, that this is going to happen, and this is how they could influence things.

So this is of enormous value to the pursuit of the organisation’s goal, perhaps a priceless one.

Now, it’s true that you could take some money and make grants to try to influence the development of AGI in a positive direction. But it’s kind of unclear that even trying to make hundreds of billions of dollars in grants would buy you as much ability to actually steer the direction of things in the way you want, as if you just actually retained control of the organisation that matters.

Because there have been various foundations that have tried to influence this sort of thing, but they tend to find it hard to give away more than some number of low billions of dollars. And even that takes years and is very difficult, and they’re not confident about the marginal grants that they’re making, because there just isn’t necessarily the ability to absorb that kind of capital on truly useful projects outside of the businesses that are doing the work. It’s hard to do this kind of stuff outside of the organisations that actually matter, which is the organisation that they control now.

Rose Chan Loui: Yeah. I totally agree, because the core of the purpose was not about making money: it was to raise money, but specifically so that they could guard against bad AI. So how do you compensate for that? No, I think you’re right.

I think the question really comes down to the facts as they are, which is that they’ve invited in so much external investment — can it go on this way? I think originally when it was structured, they were very careful to not have too much private benefit — but there’s an awful lot of private benefit going on right now, or at least it looks like that.

Rob Wiblin: Does the nonprofit foundation ever have to demonstrate that it is better to sell OpenAI? That that’s the best way to pursue its mission? Does it have to prove that to anyone?

Rose Chan Loui: I think that’s part of the analysis the nonprofit board has to do right now. Can they make the argument that this current structure, as carefully structured as it was, is not sustainable? And that the best thing that the nonprofit can do is just become independent, maybe? You know, I’m not sure they can act all that independently right now, or that they are, in fact, acting all that [independently]. I think they may try, but it’s really hard when you have $157 billion —

Rob Wiblin: Set against you.

Rose Chan Loui: Set against you, and you have only the $19 million sitting in your bank account. They do have good counsel, I can tell you that. I’m not sure who the investment banks are representing.

Rob Wiblin: I think Goldman Sachs might represent them.

Rose Chan Loui: Right. But they’re representing OpenAI as a whole. Not necessarily… I think because it’s more about OpenAI versus Microsoft.

Rob Wiblin: I see.

Rose Chan Loui: I can’t remember who’s representing who. I think they have Goldman and then Microsoft has Morgan Stanley. Is that right?

Rob Wiblin: So that’s bad news, I guess. Because what you really want is for the nonprofit foundation to have its own totally independent legal counsel, and business analysts representing its interests: not the interests of the business, and certainly not the interests of Microsoft.

Rose Chan Loui: They do have separate legal counsel. But I think it’d be nice if they also had their own valuation people. And maybe they do, but it’s not been made public. It’s super complicated. Again, we keep ending up there, trying to forestall that discussion.

Control of OpenAI is independently incredibly valuable and requires compensation

Rob Wiblin: So we’ve heard this number of $37.5 billion in equity get thrown around. I guess we probably think the nonprofit board should do its best to bid that up, on the basis that it’s giving up control. That’s of enormous value.

Also, maybe that’s undervaluing the prospects of OpenAI as a business, that it has some chance of being this enormously valuable thing. And look at all these other businesses: look how desperate they are to get control and to get rid of this cap.

But I guess even if it’s $40 billion, at the lower end, that would make them one of the biggest charitable foundations around. And if they could bid it up to more like $80 billion — which is a number that I’ve heard is perhaps a more fair amount, all things considered — then you’re saying they would be one of the biggest in the world, really.

Rose Chan Loui: Yes. And perhaps also most fair, because, like you have pointed out, they’re probably not going to get cash in that amount, because OpenAI is so cash strapped. Which is interesting: there’s this gigantic valuation, but they’re so cash strapped. That’s why they keep having to fundraise.

So I think, just realistically speaking, it’s going to be hard for the nonprofit to get that much in cash. So what’s the best then? It seems like the best is to get some combination. Or maybe, since they haven’t had any distributions, part of the deal is that the for-profit has to distribute cash in some amount every year.

But going back to your point, they are giving up a lot that really can’t be paid for. They no longer get to drive; they no longer get to say that the for-profit entities will follow the charitable purpose of developing AGI and AI safely for the benefit of humanity.

Rob Wiblin: And that’s a huge sacrifice to their mission.

Rose Chan Loui: That is a big sacrifice of mission. The nonprofit board would just have to get there by saying we just don’t have the ability to force that now, with so many external investors.

Rob Wiblin: So there’s two blades to the scissors here. One is: How much would other groups be willing to pay in order to get this stuff from us? What’s the market value of it?

And then there’s the other side, which is: What would we be willing to sell it for? How much do we value it as the nonprofit foundation? And it’s kind of unclear that any amount is worth it, or any amount that they’re likely to get. But they certainly shouldn’t be selling it for less than what they think is sufficient to make up for everything that they’re giving up in terms of pursuit of their mission.

They might think that $40 billion actually just isn’t enough, and that if that’s all they’re being offered, then they should just retain control. So that’s another hurdle to pass: arguing that the amount is sufficient for the sale to actually be a good decision.

Rose Chan Loui: I guess the flip side of that — trying to think, sitting in their chairs — is that, because their purpose is to develop AGI, if you don’t get the additional investment, you can’t actually develop AGI. At least that’s what they’re saying.

Rob Wiblin: OK, so you could argue it down, saying if it’s controlled by the nonprofit foundation, then this company actually isn’t worth that much. It’s only worth that much if it can break free. And then which one is the nonprofit foundation owed? Is it the amount that it’s valued at if they control it or if they don’t? I think the latter.

Rose Chan Loui: Yeah. They can’t achieve purpose without the additional investment. I mean, that’s the whole reason they established the for-profit subsidiary in the first place, and the need for funding just doesn’t seem to go away.

But I think what’s so tricky is: how does the public know when AGI has been developed? Who’s going to tell us that, when all of the for-profit incentive is to say it’s not there yet?

Rob Wiblin: Yeah. Is there anything more to say on the dollar valuation aspect?

Rose Chan Loui: Just to remember that we do have the attorneys general involved now, so there is someone, I think, speaking up for the nonprofit other than the nonprofit itself. And I’m trying to think, Rob, if there are competing interests on the part of the two states? I think they’re going to want OpenAI to stay in California, because if it starts making money, then that’s a good thing.

Rob Wiblin: They’d like to tax it.

Rose Chan Loui: They’d like to tax it. But at the same time, I think at least California is very protective of charitable assets. So I think, in the present case, we’ll have that assistance with getting a fair deal for the nonprofit here.

It's very important that the nonprofit gets cash and not just equity

Rob Wiblin: So you could imagine that they sell OpenAI, and all they get is equity — that is to say, they get shares, basically, in the future profits of the organisation. But very often in these situations, when the company is not yet mature and not yet publicly traded, those shares can’t be sold. You have to continue to hold them, or you’re only allowed to sell them at a very incremental rate, until such time as the business decides that now it’s a mature business, now it’s going public, and everyone can sell their shares as they wish.

So if that is how things go, and the nonprofit foundation only receives equity, and it doesn’t have almost any cash on hand, then it’s not going to be able to make any grants now. It’s not going to be able to actually deploy the hypothetical resources that it has in the valuation to accomplish its mission — which is to guide the development of AGI in a positive direction.

But now is the critical time to be deploying resources to make that happen! If you wait until such time as OpenAI is already a mature business — it’s already making all of the profits, it’s already publicly traded — then we’re already in the AGI world. Probably by that stage, the technology has matured. It’s probably pretty clear what it looks like; there’s not so much room to guide it. And the amount of interest will have increased enormously, such that anything the foundation might hope to do is going to be a drop in the bucket.

So now is the crucial time to be funding governance work. Now is the crucial time to be funding technical AI safety work that might be relevant. And I think that’s the view of almost everyone who’s actually trying to pursue those missions seriously.

So they have to get cash soon; it would be totally irresponsible to only take equity and lock it up for decades. That would be completely inconsistent with their mission, to the point where it would almost seem negligent to me. I don’t know whether legally it’s negligent.

But anyway, I think this is one way that they could end up getting screwed, and not be able to actually do what they’re meant to do, that wouldn’t be immediately obvious. People could say, “But they got this huge amount of money!” — and yeah, but they can’t do anything with it for 10 years, so what really is the point?

Rose Chan Loui: Right. It’s like getting a bequest where you’re sitting around waiting for the other person to die. That’s why I think it will probably have to be, hopefully, some combination of cash and equity. And as for the equity: while it won’t be controlling, I would ask for some amount of it to be voting, so that you have a real voice even if you’re not controlling.

But you know, you make such a good point that I hadn’t really thought about, in terms of whether they can have impact independently. On the one hand, they could just really be independent, so the nonprofit board really could protect purpose and the safe development of AGI. But you’ve made the point that there are all these other organisations out there doing that — and they don’t have, at least in your view, the same impact as the OpenAI nonprofit could have by being inside the hen house.

Rob Wiblin: Yeah, yeah. I mean, people might have different views on that. To be clear, I’m not saying that the grants that they have made have been bad or haven’t been effective. But the question is, given that there’s already a lot of philanthropic interest in this area, does extra money make that much difference above and beyond the billions that are already being deployed in this area?

It’s hard. You can’t just deploy $100 billion or $10 billion all at once. These sectors, like AI governance, can only grow at a particular pace. And there’s lots of work that can only happen within the government itself; it can’t happen in nonprofits that are funded through grants.

So there are a lot of limitations. People imagine that being a nonprofit foundation is just this fantastic position. And in some sense it is, but you also struggle to actually accomplish your mission. It’s not trivial to get the right people matched up with the projects and to grow everything really quickly.

Rose Chan Loui: I think where you’re having me conclude now is that this is a very different nonprofit. It’s not a foundation whose importance lies in giving out philanthropic money. They do that, but really the reason they’re so important is that they’re in the middle of a corporation that is doing this work — and only from that position can they really make sure that what’s being done is good and safe for humanity.

Once they’re spun out, they’ll be more like any typical corporate foundation, giving grants out to whatever, presumably still in the scientific artificial intelligence research world. And when I say control, I don’t just mean the voting: they won’t have the inside track to guide the work that’s being done. And that is pretty hard to compensate. It’s not a numerical amount. It’s a position that is rare.

How the nonprofit board can best play their hand

Rob Wiblin: Would you have any other advice for the folks on the board? I mean, I really do think that we should assume good faith and that they’re trying to do their best. What would you tell them if they called you in to give them advice, other than what you’ve said already?

Rose Chan Loui: Really, just to remember their fiduciary duties, and, despite what the public or the investors might want to see, to really think through what’s best for the nonprofit and what’s best for its purpose.

And to remember that they are really giving up a lot by stepping out of the control position. Even though that’s irreplaceable, they should make sure that sufficient compensation goes to the nonprofit, to pay it back for that.

And then, hopefully, that they figure out how the nonprofit best serves the community once it’s jettisoned from its controlling position here. Because there are options there, and I don’t know what the best option is for how they prioritise what they do.

Rob Wiblin: Yeah, that’s a whole other aspect of this that I guess we might end up coming to at some future time.

Rose Chan Loui: Potentially, with the size of endowment that they get, maybe they can have impact that’s different from the other organisations that exist now that are watchdogs. I don’t know how well funded those organisations are.

Rob Wiblin: Yeah. An argument that I can make on the other side is that in the past we haven’t really known what to fund. It’s all seemed quite speculative, a bit pie-in-the-sky. But now there are concrete projects that have huge compute requirements, that have huge infrastructure requirements. You know, some of the technical safety research is just getting quite expensive in absolute terms, because we’re talking tens of millions, possibly hundreds of millions, in budget just to have all of the hardware that you need in order to do it.

So that’s a way that you might be able to deploy serious resources, if you had them as cash rather than equity, that really could push forward the field, that could push forward the science in a useful way. That’s an opportunity that they have that people didn’t have so clearly five years ago.

Rose Chan Loui: Rob, I don’t know where this goes, but what if they decided that one of the other for-profit organisations, let’s say the one that Ilya has gone off and started, is in a better position to develop AGI safely? I suppose they could put their money behind that if they had cash. I hadn’t thought of that until now, but if they really were independent, they could decide which horse to back.

Rob Wiblin: Yeah. And choose a different horse if they want to.

Rose Chan Loui: And choose a different horse, potentially.

Rob Wiblin: It’s totally true. And they could choose to invest in it on a for-profit basis as well. They could try to influence things that way.

About the show

The 80,000 Hours Podcast features unusually in-depth conversations about the world's most pressing problems and how you can use your career to solve them. We invite guests pursuing a wide range of career paths — from academics and activists to entrepreneurs and policymakers — to analyse the case for and against working on different issues and which approaches are best for solving them.

The 80,000 Hours Podcast is produced and edited by Keiran Harris. Get in touch with feedback or guest suggestions by emailing [email protected].
