I read through the essay and really resonated with some parts but not others. I think it put words to the feelings I've had about AI and its effect on the tech industry.
> There are people who use these, apparently. And it just feels so… depressing. There are people I once respected who, apparently, don’t actually enjoy doing the thing. They would like to describe what they want and receive Whatever — some beige sludge that vaguely resembles it. That isn’t programming, though. That’s management, a fairly different job. I’m not interested in managing. I’m certainly not interested in managing this bizarre polite lying daydream machine. It feels like a vizier who has definitely been spending some time plotting my demise.
Several minutes before I reached this paragraph, the idea hit me that this person hates managing. Everyone I've met who hates using AI to produce software describes problems like the AI being incorrect, or lying to them when the model thought that would please them better, and that's my experience with junior engineers as a manager.
And everyone I've met who loves AI at some point compares it to a team of eager juniors who can do a lot of work fast but whose output can't be trusted blindly, and that's also my experience with junior engineers as a manager.
And anyone who's been trying to get an engineering manager job over the past few months, and tracking their application metadata, has seen the number of open postings matching their requirements drop month after month, unless they drop the "manager" part and keep all the same criteria as an IC.
And then I read commentary from megacorps about their layoffs and read between the lines, like here [1]:
>… a Microsoft spokesperson said in a statement, adding that the company is reducing managerial layers …
I think our general consternation around this is coming from creators being forced into management instead of being able to outsource those tasks to their own managers.
I think there's still a difference even if you look at it as "supervising a bunch of juniors". I'm happy to review the output of a human in that case because I believe that even if they got some stuff wrong and it might have been quicker and more interesting for me to just do the thing, the process is helping them learn and get better, which is both good in itself and also means that over time I have to do the supervision and support part less and less. Supervising an LLM misses out both of those aspects, so it's just not-very-fun work.
>… the process is helping them learn and get better, which is both good in itself and also means that over time I have to do the supervision and support part less and less. Supervising an LLM misses out both of those aspects, so it's just not-very-fun work.
Legitimately, I think you are missing my point. What I quoted from your response could just as easily be applied to prompt engineering/management/tinkering. I think everyone who likes doing this with juniors and hates it with AI is conflating their enjoyment of teaching juniors with the dopamine you get from engaging with other primates.
I think most people I've met who hate AI would have the same level of hate for a situation where their boss made them actually manage an underperforming employee instead of letting them continue on as-is ad infinitum.
It’s hard work both mentally and emotionally to correct an independent agent well enough to improve their behavior but not strongly enough to break them, and I think most AI haters are choking on this fact.
I'm saying that from the position of an engineer who got into management and choked on the fact that sometimes upper leadership was right, and the employee complaining to me about the "stupid rules" or trying to lie to me to get a gold star instead of a bronze one was the agent in the system who was actually at fault.
No, I really don't think that prompt engineering is the same thing. Anything I put in the prompt may help this particular conversation, but a fresh instance of the LLM will be exactly the way it was before I started. Improvements in the LLM will happen because the LLM vendor releases a new model, not because I "taught" it anything.
Yeah. I do agree with lovich that there's a lot of stuff about management that's just not fun (and that's part of why I've always carefully avoided it!) -- and one thing about AI is not just that it's management but that it's management with a lot of the human-interaction, mentoring, etc upsides removed.
I am not really sure what to do with this insight
[1] https://www.cnn.com/2025/07/02/tech/microsoft-layoffs-9000-e...