> If AGI is possible, then the entire human economy stops making sense as far as money goes
I've heard people on HN say this (even without the money condition), and I fail to grasp the reasoning behind it. Suppose in a few years Altman announces a model, say o11, that is supposedly AGI and scores over 90% on several benchmarks. I don't believe that's possible with LLMs because of their inherent limitations, but let's assume it can solve general tasks roughly as well as an average human.
Now, how is it that "the entire human economy stops making sense"? In order to eat, we still need farmers, construction workers, shops, etc. As for white-collar workers, you will need a whole range of people to maintain and further develop this AGI. So IMHO the opposite is true: the human economy will work much as before, but the job market will keep evolving, with people using AGI similarly to how they use LLMs now, just probably with greater confidence. (Or not.)
The thinking goes:
- any job that can be done on a computer is immediately outsourced to AI, since the AI is smarter and cheaper than humans
- humanoid robots that are cheap to produce get built, using tech advances that the AI discovered
- any job that can be done by a human is immediately outsourced to a robot, since the robot is better/faster/stronger/cheaper than humans
Think about all the people trying to automate away farming, construction, and transport/delivery: the people doing the automation get automated out first, and the automation figures out how to do the rest. So a fully robotic economy is not far off, if you can achieve AGI.
Why do we work? Ultimately, we work to live.* If the value of our labor is determined by scarcity, then what happens when productivity goes nearly infinite and the scarcity goes away? We still have needs and wants, but the current market will be completely inverted.