Honestly, I think it does all add up. AGI would be the most profitable product ever developed, by probably multiple orders of magnitude. It’s also a possible existential risk to life on earth.
If you believe both of those things, a whole lot of this makes sense. It makes sense that somebody would start a not-for-profit to try to mitigate the existential risk. It also makes sense that profit-seeking interests would do anything they can to get their hands on it.
It costs a lot to develop, so a not-for-profit needs to raise many billions of dollars. Billions of dollars don’t come from nowhere. So they tried a structure that is, at the very least, uncommon, and possibly entirely unheard of.
A not-for-profit controlling a for-profit entity that might make the first multi-trillion-dollar product seems inherently unstable. Of course some of the people who work there are going to want a share of that wealth. Tension is bound to result.