The moment I am able to outsource Jira tickets to a level where AI actually delivers a reasonable pull request, many corporate managers will seriously wonder why they should keep the offshoring team around.
It seems like the Holy Grail here has become: "A business is one person, the CEO, sitting at his desk doing deals and directing virtual and physical agents to do accounting, run factories, manage R&D, run marketing campaigns, everything." That's it. A single CEO, (maybe) a lawyer, and a big AI/robotics bill = every business. No pesky employees to pay. That's the ultimate end game here, that's what these guys want. Is that what we want?
Keep going: the end goal is that even the customers are AI. The company doesn't sell anything or do anything, it just trades NFTs and stocks and digital goods. And the money isn't real, it's all crypto. This is the ideal: to create nothing, to sell nothing to no one, and for that to somehow mean you created "value" for society and therefore should be rewarded in material terms. And rewarded greatly at that: the people setting all this up expect to sit at the tippy top of the social ladder for this "contribution".
This is, I guess, what happens when you follow capitalism to its logical conclusion. It's exactly what you'd expect from a reinforcement learning algorithm that only knows how to climb a gradient toward a singular reward. The concept of commerce has become the proverbial rat in the Skinner box. It has figured out how to mainline the heroin drip if it just holds down the shock button, and has rewired its brain to get off on the pain. Sure, it's an artificial high and it hurts like hell to achieve, but what else is there to live for? We made the line going up mean everything, so that's all that matters now. It doesn't matter if we don't want it; they want it. So that's what it's going to be.
The owner (human) would say "build a company, make me a billion dollars," and that would be the only valuable input needed from them. Everything else would be derived and executed by the AI swarm, while the owner plays video games (or generally enjoys the product of other people's AI labor) 100% of the time.
I'd argue GPT-4 (2023) was already AGI. It could output anything you (or Tim Cook, or any other smart guy) could possibly output, given the relevant context. The reason it doesn't right now is that we are not passing in all of your life's context. If we achieve this, a human CEO has no edge over an AI CEO.
People are figuring this problem out very quickly, hence the explosion of agentic capabilities happening right now, even though the base model fundamentally does the same stuff as GPT-4.
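To make the claim concrete: the "agentic" layer is mostly scaffolding (context management plus a tool loop) wrapped around an unchanged base model. Here's a minimal sketch of that idea; the model call is stubbed out, and every name here (`call_model`, `run_agent`, the `search` tool) is hypothetical, not any real API.

```python
# Hypothetical sketch: an "agent" as plain scaffolding around a base model.
# call_model is a stub standing in for any chat-completion style API.

def call_model(messages):
    # Stub: a real implementation would send `messages` to an LLM here.
    # Once a tool result is in the context, pretend the model can answer.
    if any("TOOL RESULT" in m["content"] for m in messages):
        return {"type": "answer", "content": "done"}
    return {"type": "tool_call", "tool": "search", "args": "relevant docs"}

# Tool registry: the model never runs tools itself; the loop does.
TOOLS = {"search": lambda args: f"results for {args}"}

def run_agent(task, max_steps=5):
    messages = [{"role": "user", "content": task}]
    for _ in range(max_steps):
        reply = call_model(messages)
        if reply["type"] == "answer":
            return reply["content"]
        # Execute the requested tool and feed the result back as context.
        result = TOOLS[reply["tool"]](reply["args"])
        messages.append({"role": "user", "content": f"TOOL RESULT: {result}"})
    return None  # gave up after max_steps
```

The point of the sketch is that nothing in this loop requires a smarter model: the gains come from feeding more of the relevant context back in each turn, which is exactly the "we're not passing in all your life's context" argument above.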
Of all the professions at risk of being downsized, I think lawyers are up there. We used to consult our lawyers frequently about things big and small. We have now completely removed the small stuff from that equation, and most of our stuff is small. There is very little of the big stuff, and I think LLMs aren't too far from taking care of that as well.
Yup, I've said for the past year to anyone who'll listen that the concept of hourly (white-collar) work will go away.
And there's no better example of hourly work than lawyers.
Personally, I've always disliked the model of billing by the hour because it incentivizes the wrong things, but it's easier to justify the costs to clients that way (because they're used to thinking in that framework).
I'd rather take on the risk and find ways to do more efficient work. It's actually FUN to do things that way. And nowadays, this is where AI can benefit in that framework the most.
So far, automation has only ever increased the need for software development. Jevons Paradox plus the recursive nature of software means that there's always more stuff to do.
The real threats to our profession are things like climate change, extreme wealth concentration, political instability, cultural regression and so on. It's the stuff that software stands on that one should worry about, not the stuff that it builds towards.
Maybe I'm not thinking big-picture enough… but have you ever tried using generative AI (i.e., a transformer) to create a circuit schematic? They fail miserably. Worse than GPT-2 at generating text.
The current SOTA models can do some impressive things, in certain domains. But running a business is way more than generating JavaScript.
The way I see it, only some jobs will be impacted by generative AI in the near term. Not replaced, augmented.
Because of human factors: no complaints, overtime for as long as the electricity stays on, no unions, and everything else that a CEO beholden to the whims of exponential shareholder growth likes so much.