This assumes that the bottleneck to profitability is the limit of software engineers they can afford to hire.
If they’re happy with the current rate of progress (and at many companies that is the case), then a productivity increase of 100% means they need half the current number of engineers.
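The arithmetic behind that claim, as a toy sketch (the output target and productivity figures here are made-up numbers purely for illustration):

```python
def engineers_needed(target_output: float, output_per_engineer: float) -> float:
    """Engineers required to hit a fixed output target at a given productivity."""
    return target_output / output_per_engineer

# Hypothetical company: a fixed output target of 100 "units" of work.
baseline = engineers_needed(100, 1.0)  # today's productivity -> 100 engineers
with_ai = engineers_needed(100, 2.0)   # productivity doubled (+100%) -> 50 engineers

print(baseline, with_ai)  # 100.0 50.0
```

The whole argument hinges on `target_output` staying fixed; if demand for software scales up with productivity instead, the headcount math changes completely.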
Is the number of developers usually the reason feature development goes slowly, though? Nowhere I’ve worked has that really been the case; it’s usually fumbled strategic decision-making and pivots.
And the “current rate” is competitively defined. So if AI can make software developers twice as productive, then the acceptable minimum “current rate” will become 2x faster than it is today.
A computer already does in seconds what it used to take many people to do. In fact the word “computer” was a job title; now it describes the machine that replaced those jobs.
Yet people are still employed today. They are doing the many new jobs that the productivity boost of digital computing created.
I don't know why people think analogies from the past predict or prove anything about the future. It's as if a past situation is assumed to map completely onto the current one via analogy, EVEN though the two situations are DIFFERENT.
The computer created jobs because it takes human skills to talk to the computer.
It takes very little skill to talk to an LLM. Why would your manager ask you to prompt an LLM to do something when he can do it himself? Are you going to answer this question with another analogy?
Just think reasonably and logically. Why would I pay you a 300k annual salary when ChatGPT can do it for next to nothing? It's pretty straightforward. If you can't justify something with a straightforward answer, you're likely not being honest with yourself.
Why don't we use actual evidence-based logic to prove things, rather than justify them by leaping over some unreasonable gap with an analogy? Think about the current situation; don't pin your hope on a past situation and assume the current one will turn out the same because of an analogy.
My job is not to do a certain fixed set of tasks, my job is to do whatever my employer needs me to do. If an LLM can do part of the tasks I complete now, then I will leave those tasks to the LLM and move on to the rest of what my employer needs done.
Now you might say AI means that I will run out of things that my employer needs me to do. And I'll repeat what I said above: you've got to prove that. I'm not going to take it on faith that you have sussed out the complete future of business.
Future events that haven't happened yet can't be proven out, because they're unknowns.
What we can do is make a logical, theoretical extrapolation. If AI progresses to the point where it can do every single task you can do, in seconds, what task is left for you to do? And how hard is that task? If LLMs never evolve to the point where they can clean toilets, well, then you can do that, but why would the boss pay you 300k to clean the toilet?
These are all logical conjectures about a possible future. The problem here is that if AI continues on the trendline it's traveling now, I can't come up with a logical chain of thought where YOU or I keep our 300k+ engineering jobs.
This is what I keep hearing from not just you, but a ton of people: an analogy about how technology has only ever created more jobs, with no illustration of a specific scenario for what's happening now. If LLMs replace almost every aspect of human intellectual analysis, design, art, and engineering, what is there left to do?
Clean the toilet. I'm not even kidding. We still have things we can do, but the end comes when robotics catches up and robots become as versatile as the human form. That's the true end: when the boss has ChatGPT clean the toilet.
Why should I be scared of technology that makes me more productive?