> If OpenAI didn't mention autonomy in their Microsoft agreement
What they did mention is that their systems have to generate the profit. That requires autonomy. It needn't be explicitly mentioned; it cannot be any other way.
A human swinging a hammer would see the profit attributed to the human, not the hammer. A human "swinging" GPT would likewise see the profit attributed to the human, not to GPT. A windmill operates autonomously, so any profit it generates is attributed to it. However, a windmill doesn't have the broad ability to outperform humans across many tasks.
Coding agents are approaching the autonomy of a windmill. I expect this is where we will start to see the first semblance of AGI: you will be able to say "My problem is X," come back in a few days, and have a program written to solve it. However, that is still just a "windmill". It doesn't generalize to the wide range of tasks that your definition, and most other definitions of AGI, expect.
You are right that details of the agreement are slim enough that we cannot say for sure whether Microsoft would accept such a coding agent as AGI, profits notwithstanding. But even if they would, it is unlikely a coding agent alone could achieve $100B in profits on any human timescale. Once software takes no effort to create, anyone can produce it, and its market value falls to effectively nothing. For that reason, we can say with near certainty that the rest of your definition will also be required...
You gave it for a reason.