foo42's comments | Hacker News

Geek code is worthy of its own HN submission.

I am in a position I suspect many would be envious of. I've never seen any of these films. My first experience waits ahead!

Where should I begin? I have kids in the 10-15 year old bracket if people think they'd make a good family experience.


The order as presented on Letterboxd is pretty spot on. The first 8 are the most accessible to audiences. After that it's more for a deep dive and/or completionists. I've seen the top 11; I should probably watch them all again sometime.

https://letterboxd.com/director/hayao-miyazaki/


This is a good order for sure. Enjoy it GP, these are some of the most beautiful, bizarre, otherworldly pieces of art ever created.


I can't recommend a specific order but...

If looking for "adventure" go with Laputa, Cagliostro, maybe Porco Rosso.

If looking for something "warm" go with Totoro, Kiki, Ponyo.

If looking for something more "fantastical" (probably also more complex) try Howl's, Spirited Away, Nausicaä, Mononoke.

If you are willing to consider a TV show, then Sherlock Hound is really fun and delightful in every possible way. The complete series is IIRC 26 episodes.

Also, there are some non-Miyazaki but still Ghibli films which you may want to consider too. Personally, I absolutely love Mimi wo Sumaseba and I Can Hear the Sea, but these may well be a bit uninteresting for a 10 year old; probably better for a teen over 15, I'd guess. Pompoko is clearly recommended for your age bracket though, and it's fun too.

Finally, there's the question of Grave of the Fireflies. I think 15 is probably fine, 10... maybe. But in any case, be warned that it's a very sad story and I have very rarely met someone who could watch it without crying profusely.


I tend to agree with ProZD's tier list[1] where "Kiki's Delivery Service", "Porco Rosso", and "Totoro" are at S rank. Those might also be a good introduction since they're pretty "normal".

[1] - https://www.youtube.com/watch?v=g_8uHtL6V0Y


Warning: I thought the first 10 minutes of Up were the saddest thing I'd ever watch, but then I watched Grave of the Fireflies.


Spirited Away is my absolute favorite one and I think great for children too. Mononoke, Porco Rosso, and Nausicaa are maybe more serious/mature but still incredible. Other valid options especially for children: Totoro, Ponyo, Howl's Moving Castle.

In any case you're in for an adventure!


What a fantastic problem to have! A few Ghibli films are a tad too odd or serious in tone for a movie night, but I can heartily recommend Ponyo, My Neighbor Totoro, and Spirited Away, the first being probably the most light-hearted and simple fun.


Great list! Totoro is my favorite.

I’d add Kiki’s Delivery Service to the ‘mostly fun’ side as well.


I’d start with this order:

    Spirited Away
    Howl's Moving Castle
    Princess Mononoke
    Castle in the Sky
    Nausicaä


Like others I would recommend starting with Spirited Away; it should be suitable and enjoyable for kids in the 10-15 bracket.


Start with "My Neighbor Totoro"


Spirited Away.


Like many here, I too blog for the mental exercise of composing, the human desire to express, and a faith that posting to the public web is somehow intrinsically worthy. On the first point, I've found I get a lot of benefit from the posts I never even publish, as I'll keep chewing over subjects I'm mentally composing a post about and gain a lot of personal clarity in the process.

One thing I've wondered though (and am mentally composing a post about) is whether there's more good in AI digesting one's writing than we might first feel.

Here's a thought experiment: Would you feel good if someone read your blog and learned something from it? Probably yes. Would you feel good if they passed along something they learned to others, likely in their own words? Probably yes. What if they couldn't recall, or didn't choose to reference, where they saw it? Probably still yes, although (speaking personally) my ego would probably prefer they gave credit. What if the reader who passed the learning along was the AI?

In a sense we're still contributing to the public discourse and culture when we write, just mediated by models. If a model gives someone a slightly different answer in part because of something you wrote, you've still had an impact on the ultimate human reader.

Just to lay my cards on the table: I'm no AI booster, nor doomer. In general I think it's overhyped and may well have a net negative effect if steered by those currently at the wheel and consumed without due care, but it has its place where it can be useful.


> Here's a thought experiment: Would you feel good if someone read your blog and learned something from it? Probably yes. Would you feel good if they passed along something they learned to others, likely in their own words? Probably yes. What if they couldn't recall, or didn't choose to reference, where they saw it? Probably still yes, although (speaking personally) my ego would probably prefer they gave credit. What if the reader who passed the learning along was the AI?

This is definitely an interesting way of looking at it. If your blog ends up in pre-training data, it will become part of the AI. Or if not, an AI might still fetch it when a user asks something specific. It reminds me of voting in a democracy, which many people consider a right and a duty - but in reality a single vote is hardly going to swing any election.


That's a good analogy.


This is a good way to look at it. I recently started thinking something similar now that chatgpt.com and Perplexity are showing up as referral sources for my blog. So there is some verification (or hope there is) that someone got to my content and learned something from it.


I find the problem I have is that once I get going on a problem, I can't shake it out of my head. I end up lying in bed for hours, pleading with my brain to let it go, if I've not found the time to finish it during the crumbs of discretionary time in the day!


Do you have any links to places which would better fit that?

It's perhaps telling of another flaw in this community that I feel I need some way to clarify this isn't snark or sarcasm.


I was thinking about this subject just this week.

I think an ideal setup for me would be if I could boot my laptop into a different user on the same OS where I only had 1 or 2 apps: Obsidian for notes, and possibly an e-reader for some use cases. Then somehow put a timed lockout on the main OS/user. That way I could set a period of time where I was locked into that task, but I wouldn't need a dedicated device.


People interested in the history of the internet may enjoy the book "Where Wizards Stay Up Late". I'm sure there are other good books on it too (perhaps others can recommend below), but that's the one I read and enjoyed.


A side effect of the non-deterministic behaviour is that, unlike previous increases in abstraction, the high-level prompts are not checked in to the code base and available to recreate their low-level output on demand. Instead we commit the lower-level output (i.e. code), and future revisions must operate on this output without the ability to modify the original high-level instructions.


The tradeoff of higher velocity for less enjoyment may feel less welcome when it becomes the new baseline and the expectation of employers/customers. The excitement of getting a day's work done in an hour* (for example) is likely to fade once the expectation is to produce 8 days of the old output every day.

I suspect it doesn't matter how we feel about it mind you. If it's going to happen it will, whether we enjoy the gains first or not.

* setting aside whether this is currently possible, or whether we're actually trading away more quality than we realise.


> The excitement of getting a day's work done in an hour* (for example) is likely to fade once the expectation is to produce 8 days of the old output every day.

That dumb attitude (which I understand you’re criticising) of “more more more” always reminds me of Lenny from the Simpsons moving fast through the yellow light, with nowhere to go.

https://www.youtube.com/watch?v=QR10t-B9nYY

> I suspect it doesn't matter how we feel about it mind you. If it's going to happen it will, whether we enjoy the gains first or not.

That is quite the defeatist attitude. Society becoming shittier isn’t inevitable, though inaction and giving up certainly helps that along.


> "If it's going to happen it will" - That is quite the defeatist attitude. Society becoming shittier isn’t inevitable

You're right in general, but I don't think that'll save you/us from OP's statement. These are simple economic incentives at play. If AI-coding is even marginally more economically efficient (i.e. more for less), the "old way" will be swept aside at breathtaking pace (as we're seeing). The "from my cold dead hands" hand-coding crowd may be right, but they're a transitional historical footnote. Coding was always the blue-collar end of white-collar work. No one outside of coders will weep for what was lost.


> If AI-coding is even marginally more economically efficient (i.e. more for less) the "old way" will be swept aside at breathtaking pace (as we're seeing).

On the scale I’ve been doing this (20 years), that hasn’t been the case.

Rails was massively more efficient for what 90% of companies were building. But it never had anywhere near a 90% market share.

It doesn’t take 1000 engineers to build CRUD apps, but companies are driven to grow by forces other than pure market efficiency.

There are still plenty of people using simple text editors when full IDEs have offered measurable productivity boosts for decades.

>(as we’re seeing)

I work at a big tech company. Productivity per person hasn’t noticeably increased. The speed that we ship hasn’t noticeably increased. All that’s happening is an economic downturn.


I think that you're correct in that Rails and IDEs offer significant productivity benefits but aren't/weren't widely adopted.

But AI seems to be different in that it claims to replace programmers, instead of augment them. Yes, higher productivity means you don't have to hire as many people, but with AI tools there's specifically the promise that you can get rid of a bunch of your developers, and regardless of truth, clueless execs buy the marketing.

Stupid MBAs at big companies see this as a cost reduction - so regardless of the utility of AI code-generation tools (which may be high!), or of the fact that there are many other ways to get productivity benefits, they'll still try to deploy these systems everywhere.

That's my projection, at least. I'd love to be wrong.


I believe you’re 100% right about trying to replace devs. In that respect it’s like offshoring.

But no matter how hard cost cutters wanted to, they were never able to actually reduce the total number of devs outside of major economic downturns.


> more economically efficient

I suspect we'll find that the amount of technical debt and loss of institutional knowledge incurred by misuse of these tools was initially underappreciated.

I don't doubt that the industry will be transformed, but that doesn't mean that these tools are a panacea.


I read about AI assistants allegedly creating tech debt, but my experience is the opposite. Claude Code makes it easy to refactor, helping to reduce tech debt. Tech debt usually happens because refactoring takes time but is hard to justify to upper management, because upper management only sees new features, not quality of code. With Claude Code, refactoring is much faster, so it gets done.


Are you talking about refactoring code you’re already familiar with? Or a completely unknown codebase that no one else at the company knows anything about and you’ve been tasked with fixing?


Both. But I'm not talking about fixing. I'm talking about refactoring.


It’s often the case that in order to fix an issue you need to refactor first.


I would argue that you should only allow Claude to refactor code that you understand. Once that institutional knowledge is lost you would then have to regain it before you can safely refactor it, even with Claude's help.

I also specifically used the term "misuse" to significantly weaken my claim. I mean only to say that the risks and downsides are often poorly understood, not that there are no good uses.


> “more more more”

If you're a salaried or hourly employee, you aren't paid for your output, you are paid for your time, with minimum expectations of productivity.

If you complete all your work in an hour... you still owe seven hours based on the expectations of your employment agreement, in order to earn your salary and benefits.

If you'd rather work in an output-based capacity, you'll want to move to running your own contracting business in a fixed-bid type capacity.


That's funny, because, at more than one place I've worked as a salaried employee, when I had to work OT, the narrative was "you're salary because you're paid to get the job done, doesn't matter how many hours it takes". Unfortunately, "the job" somehow never worked out to be less than 40 hours a week.


> you are paid for your time, with minimum expectations of productivity

There are legal distinctions between part-time and full-time employment. Hence, you are expected to put in a minimum number of hours. However, there's nothing to say that the minimum expectation is only the minimum for classification as full-time employment.

If AI lets you get the job done in 1 hour when you otherwise would have worked overtime, you're still technically being paid to work more than that one hour, and I don't know of any employer that'll pay you to do nothing.


In many (most?) salary jobs, employees are typically paid both to get the job done, and to supply at least N hours of their time for the company to have them use as it sees fit.


Yeah, I've been in the industry for over a decade, and I still don't understand the value of salary for devs.


> That is quite the defeatist attitude. Society becoming shittier isn’t inevitable, though inaction and giving up certainly helps that along.

If the structures and systems that are in place only facilitate life getting more difficult in some way, then it probably will, unless it doesn't.

Housing getting nearly unownable is a symptom of that. Climate change is another.


moving fast through the yellow light, with nowhere to go.

My company has been preparing for this for a while now, I guess, as my backlog clearly has years' worth of work in it and positions of people who have left the org remain unfilled. My colleagues at other companies are in a similar situation. Considering round after round of layoffs, if I got ahead a little bit and found that I had nothing else to do, I'd be worried for my job.

Society becoming shittier isn’t inevitable

Yes, I agree, but the deck is usually stacked against the worker, especially in America. I doubt this will be the issue that promotes any sort of collectivism.


> That is quite the defeatist attitude. Society becoming shittier isn’t inevitable, though inaction and giving up certainly helps that along.

Correct. But it becoming shittier is the strong default, with forces that you constantly have to fight against.

And the reason is very simple: Someone profits from it being shittier and they have a lot of patience and resources.


> That dumb attitude (which I understand you’re criticising) of “more more more” always reminds me of Lenny from the Simpsons moving fast through the yellow light, with nowhere to go.

Realizing that attitude in myself at times has given me so much more peace of mind. Just in general, not everything needs to be as fast and efficient as possible.

Not to mention the times where in the end I spend a lot of time and energy in trying to be faster only to end up with this xkcd: https://xkcd.com/1319/

As far as LLM use goes, I don't need moar velocity! So I don't try to min-max my agentic workflow just to squeeze out X amount more lines of code.

In fact, I don't really work with agentic workflows to begin with. I more or less still use LLMs as tools external to the process, using them as interactive rubber duckies: deciphering spaghetti code, doing a sanity check on code I wrote (and being very critical of the suggestions they come up with), getting a quick jump start on stuff I haven't used in a while (how do I get started with X or Y again?), that sort of stuff.

Using LLMs in the IDE and other agentic use is something I have worked with. But to me it falls under what I call "lazy use" where you are further and further removed from the creation of code, the reasoning behind it, etc. I know it is a highly popular approach with many people on HN. But in my experience, it is an approach that makes skills of experienced developers atrophy and makes sure junior developers are less likely to pick them up. Making both overly reliant on tools that have been shown to be less than reliable when the output isn't properly reviewed.

I get the firm feeling that the velocity crowd works in environments where they are judged by the number of tickets closed. Basically "feature complete, tests green, merged, move on!". In that context, it isn't really "important" that the green tests were themselves refactored by the tool, just that they are green. It is a symptom of a corporate environment where the focus is on these "productivity" metrics. From that perspective, I can fully see the appeal of LLM-heavy workflows, as they will most certainly have an impact on metrics like "tickets closed" or "lines of code written".


> Just in general, not everything needs to be as fast and efficient as possible.

It does when you are competing for getting and keeping employment opportunities.


Even then it is not always needed, but I also already address this more directly in the comment you are replying to.


> That is quite the defeatist attitude. Society becoming shittier isn’t inevitable, though inaction and giving up certainly helps that along.

This feels like kicking someone when they’re down! Given the current state of corporate and political America, it doesn’t look to me like there will be any pressure for anything but enshittification. Telling people at the coal face to stay cheerful seems unlikely to help. What mechanism do you see for not giving up actually changing the experience of people in 10-ish years’ time?


Unionization. Work speed-ups are a classic management assault on the working man.


> Telling people at the coal face to stay cheerful seems unlikely to help.

That isn't what they said tho. They said you have to do something, not that you should just be happy. Doing something can involve things that historically had a big impact in improving working conditions, like collective action and forming unions.

The opposite advice would be: "Everything's fucked, nothing you can do will change it, so just give up." Needless to say that is bad advice unless you are a targeted individual in a murderous regime or similar.


> That is quite the defeatist attitude. Society becoming shittier isn’t inevitable, though inaction and giving up certainly helps that along.

The hypothetical that we're 8x as productive but the work isn't as fun isn't "society becoming shittier".


How is everyone working shitty jobs not "society becoming shittier"? Seems pretty awful


"Society" is more than just software developers.

We are very well paid for very cushy work. It's not good for anyone's work to get worse, but it's not a huge hit to society if a well-paid cushy job gets less cushy.

And presumably people buy our work because it's valuable to them. Multiplying that by 8 would be a pretty big benefit to society.

I don't want my job to get less fun, but I would press that button without hesitation. It would be an incredible trade for society at large.


Well, let's think about it.

Software devs' jobs getting less cushy is no biggie; we can afford to amp up the efficiency. Teachers' jobs got "less cushy": not great for users/consumers or the people in those jobs. Doctors' jobs got "less cushy": not great for users/consumers or the people in those jobs. Even the jobs of waiters, check-out staff, and stockists at restaurants, groceries, and AMZ got "less cushy": not great for users/consumers or the people in those jobs, at least not when you need to call someone for help.

These things are not as disconnected as they seem. Businesses are in fact made up of people.


Maybe everybody's job should be cushy instead. We were not put on this earth to toil away for a bunch of rich fucks


My cushy software job has burned me out so badly that I am on medical leave with massive memory problems and a bit of concern about my heart

So I mean... Yeah

Is software more comfortable generally than many other lines of work? Yes probably

Is it always soft and cushy? No, not at all. It is often high pressure and high stress


I'm sorry for implying there can't be hardship (significant, even devastating) in this line of work. Thanks for posting about your experience.


What kind of massive memory problems? I might have this but didn't think to attribute it to burnout.


When I burned out I experienced skill regression and short term memory loss. Like, an inability to remember specific events of the day before, inability to perform skills I had done for decades like play an instrument. Took over a year to stabilize and return to normal.


Yes, this is extremely similar to my experience

I cannot remember events, conversations, or details about important things. I have partially lost my ability to code, because I get partway through implementing a feature and forget what pieces I've done and which pieces still need to be done

I can still write it, but the quality of my work has plummeted, which is part of why I'm off on leave now


Hang in there, it gets better and the skills come back. From my github:

  2,350 contributions in 2021
  2,661 contributions in 2022
  381 contributions in 2023 <--- burnout
  794 contributions in 2024 <--- recovery
  1,632 contributions in 2025 (so far)
My recovery took about 18 months. It took time, and a lot of rest. I'd have to sleep like 12 hours a day sometimes.


Thank you for the kind encouragement

I hope my recovery doesn't take that long, but if it does it does

I would rather give myself the space and time to really get better, rather than simply rush back to work and burn out again


I was going through something similar. Here's my anti-burnout protocol that's kept me functional all the way to my current position as founder and CTO of a profitable startup.

1. 1 tablespoon of cold-extracted cod liver oil EVERY MORNING

2. 30 min of running 3-4 times a week

3. 2-3 weight-lifting sessions every week

4. regular walks

5. cross-train on different intellectually stimulating subjects. Doing the same cognitive tasks over and over is like repetitive motion on your muscles.

6. regularly scheduled "fallow mind time." I set aside 30 minutes to an hour every day to just sit in a chair and let my mind wander. It's not meditation; I just sit and let my mind drift to whatever it wants.

7. while it should be avoided, in the event that you have to crunch, RESPECT THE COOLDOWN. Take downtime after. Don't let your nontechnical leads talk you out of it. Thinking hard for extended periods of time requires appropriate recovery.

The human brain is a complex system, and while we think of our mind as abstract and immaterial, it is in reality a whole lot of physical systems that grow, change, and use resources the same way any other physical system in your body does. Just like muscles need to recover after a workout to get stronger, so too does your brain after extended periods of deep thinking.


Mine is more of a long-term memory loss: an inability to recall some memorable events from months or even a year ago. I'll definitely get it checked or go talk with someone.


Also asking for a friend…


Really sorry to hear that

All I can suggest is see a doctor as soon as possible and talk to them about it


Difficult to explain because it's inconsistent

But I am struggling to remember things I did not used to struggle with

Going to an event on a weekend with my wife and completely forgetting that we ran into a friend there. Not just "oh yeah I forgot we saw them", like feeling my wife is lying to me when she tells me we saw them. Texting them to ask and they agree we saw each other

These are people I trust with my life so I believe they would not gaslight me, my own memory has just failed

Many examples like this, just completely blacking out things. Not forgetting everything, but blacking out large pieces of my daily life. Even big things


FWIW, I am not your doctor: Taking a daily antioxidant, glutathione, has helped me manage memory-related symptoms that appeared coincident with other burnout symptoms.

Disclaimer: talk to your doctor. I don’t know if your doctor can tell you whether this is a good idea, but it might help in some countries with good medical systems.


I might as well ask my doctor about it, thanks


If you think software development is cushy, I wonder what kind of software you're writing. Because there are different levels; getting something to work is not the same thing as writing maintainable, high quality software.

I've seen plenty enough people try, really try, to get into software development; but they just can't do it.


This places a lot of faith in the following assumptions:

1. Efficiency measures as written to benchmark this coupling with economic productivity overall

2. Monetary assessments of value in the context of businesses spending money corresponding with social value

3. The gains of economic productivity being distributed across society to any degree, or the effect of this disparity itself being negligible

4. The negative externalities of these processes scaling less quickly than whatever we're measuring in our productivity metric

5. Aforementioned externalities being able to scale to even a lesser degree in lockstep with productivity without crashing all of the above, if not causing even more catastrophic outcomes

I have very little faith in any of these assumptions


Okay, but the reality of society becoming shittier is society becoming shittier.


The trick is not telling anyone you spent an hour to do 7 hours of work.

That's stupid and detrimental to your mental health.

You do it in an hour, spend maybe 1-2 hours to make it even better and prettier and then relax. Do all that menial shit you've got lined up anyway.


> The trick is not telling anyone you spent an hour to do 7 hours of work.

I wish that the hype crowd would do that. It would make for a much more enjoyable and sane experience on platforms like this. It's extremely difficult to have actual conversations about subjects when there are crowds of fans involved who don't want to hear anything negative.

Yes, I also realize there are people completely on the other side as well. But to be honest, I can see why they are annoyed by the fan crowd.


>I wish that the hype crowd would do that.

Exactly, IME the hype crowd is really the worst at this. They will spend 8h doing 8 different 1h tries at getting the right context for the LLM and then claim they did it in 1h.

They claim to be faster than they are. There's a lot of mechanical turking going on, which becomes apparent as soon as you ask a few probing questions.


Until your coworkers who've never heard of work-life balance start bragging about it, and volunteering to spend 8 hours to do 56 hours of work, or maybe spending 11 hours to impress the boss.


The most challenging thing I'm finding about working with LLM-based tools is the reduction in enjoyment. I'm in this business because I love it, and I'm worried about that going forward.


My daughter, who switched from engineering to software because she enjoyed coding, has expressed that LLMs are taking away everything she found enjoyable about the job and reducing her to QA. She hates it, and if the trend continues I won’t be surprised if she switches industries.


Yup, not sure if it's just me, but the trend seems to be just ship slop as fast as you can; nothing else matters, just ship ship ship.


For the longest time, IT workers were 'protected' from Marx's alienation of labor by the rarity of their skills, but now it's coming for you/us, too. Claude Code is to programmers what textile machines were to textile workers.

>In the capitalist mode of production, the generation of products (goods and services) is accomplished with an endless sequence of discrete, repetitive motions that offer the worker little psychological satisfaction for "a job well done." By means of commodification, the labour power of the worker is reduced to wages (an exchange value); the psychological estrangement (Entfremdung) of the worker results from the unmediated relation between his productive labour and the wages paid to him for the labour.

Less often discussed is Marx's view of social alienation in this context: i.e., workers used to take pride in who they are based on their occupation. 'I am the best blacksmith in town.' Automation destroyed that for workers, and it'll come for you/us, too.


Nope, not until AI is writing high quality, maintainable software; and no one knows if that's even possible yet.


What caused textile machines to replace the manual labor wasn’t the quality of their output, it was quantity. In fact, manually made clothing was of higher quality than what was machine-produced.


A low quality fabric makes the fashion police come and arrest you.

Low quality software kills people.


Safety-critical code (which will kill someone if not bug-free) makes up <1% of what's shipped; safety clothing, which must be of high quality or risk harm to someone, makes up a similarly small percentage.

Both will stay manual / require a high level of review. They're not what's being disrupted (at least in the near term); it's the rest.


Nearly all clothing is still produced in an extremely manual process.

What was automated was the production of raw cloth.


This is a distinction without a difference. Even if you take a rudimentary raw-cloth comparison like cotton vs heavy wool (the latter being fire resistant and historically used by firemen, i.e. “safety critical”), the machines’ output quality was significantly lower than manual output for the latter.

This phenomenon is a general one… chainsaws vs hand saws, bread slicers vs hand slicing, mechanical harvesters vs manual harvesting, etc.


That’s just not the general case at all. Automated or “powered” processes generally lead to a more consistent final product. In many cases the quality is just better than what can be done by hand.


There are many corporate nightmare level scenarios out there. There is no need to reach loss of life situations to make my point.

A large enough GDPR or SOX violation is the boogeyman that CEOs see in their nightmares.


We have plenty of people quite literally worth less than most material goods (evident from current social positions and continued trajectories), so why would companies care, if it makes more money overall? Our lives have a value, and in general it's insultingly low.


That’s a misconception.

The machines we’re talking about made raw cloth not clothing and it was actually higher quality in many respects because of accuracy and repeatability.

Almost all clothing is still made by hand one piece at a time with sewing machines still very manually operated.


“…by the mid‑19th century machine‑woven cloth still could not equal the quality of hand‑woven Indian cloth. However, the high productivity of British textile manufacturing allowed coarser grades of British cloth to undersell hand‑spun and woven fabric in low‑wage India” [0]

“…the output of power looms was certainly greater than that of the handlooms, but the handloom weavers produced higher quality cloths with greater profit margins.” [1]

The same can be said about machines like the water frame. It was great at spinning coarse thread, but for high quality/luxury textiles (ie. fine fabric), skilled (human) spinners did a much better job. You can read the book Blood in the Machine for even more context.

[0] https://en.wikipedia.org/wiki/Industrial_Revolution

[1] https://en.wikipedia.org/wiki/Dandy_loom


The problem with those quotes is the lack of definition of “quality”. Machine woven cloth in many ways is better because of consistency and uniformity.

If your goal is to make 1000 of the exact same dress, having a completely consistent raw material is synonymous with high quality.

It’s not fair to say that machines produced some kind of inferior imitation of the handmade product, that only won through sheer speed and cost to manufacture.


Yes, but they still filled their purpose.

AI slop code doesn't even work beyond toy examples.


After the operating system and the spreadsheet, most software is toys.


There is a lot of professional software out there: CAD, DAWs, software that automates various services, and software to support all of those.


That isn't even close to true; we've based more or less our entire society on software, and it's getting worse every day.


In fairness to AI, many software devs are not writing high quality, maintainable software

But in fairness to human devs, most are still writing software that is leagues better than the dog shit AI is producing right now


Why wouldn’t it be possible if meat computers can do it more or less reliably? The construction of such a machine is a matter of time; whether it’s a year or a century is anyone’s guess.

When it is eventually made, though… it’s either aligned or we’re in trouble. Job cushiness will be P2 or P3 in a world where a computer can do everything economically viable better than any human.


Speak for yourself, I'm not a computer.


Your opinion on the matter is not relevant whether you are or aren't one. Besides, if our brains aren't computers, what are they?


> what are they?

They are brains. I think it's on you to prove they're the same, rather than assuming they're the same and then demanding proof they aren't!


Turing machines are universal. Anything that does anything is, at most, a Turing machine.


Turing machines are not universal. They can compute anything computable, that’s a huge difference.


Exactly, maybe "prompt engineering" is really a skill, but the reward for getting better at this is just pumping out more features at a low skill grade. What's exciting about that? Unless I want to spend all my time building minimum viable products.


Prompt engineering is just writing acceptance criteria; it's moving from someone who writes code to someone who writes higher level feature descriptions. Or user stories, if you will.

Thing is though, many people don't know how to do that (user stories / acceptance criteria) properly, and it's been up to software developers to poke holes and fill in the blanks that the writer didn't think about.


>The tradeoff of higher velocity for less enjoyment may feel less welcome when it becomes the new baseline and the expectation of employers / customers.

This is precisely the question that scares me now. It is always so satisfying when a revolution occurs to hold hands and hug each other in the streets and shout "death to the old order". But what happens the next morning? Will we capture this monumental gain for labor or will we cede it to capital? My money is on the latter. Why wouldn't it be? Did they let us go home early when the punch card looms weaved months worth of hand work in a day? No, they made us work twice as hard for half the pay.


What would this gain for labor even look like? Four hour workdays?


Short-term, automated tech debt creation will yield gains.

Long term the craftsperson writing excellent code will win. It is now easier than ever to write excellent code, for those that are able to choose their pace.


Given it's 2025 and companies saddled with tech debt continue to prioritize speed of delivery over quality, I doubt the craftsperson will win.

If anything we'll see disposable systems (or parts) and the job of an SE will become even more like a plumber's, connecting prebuilt business logic to prebuilt systems libraries. When one of those fails, have AI whip up a brand new one instead of troubleshooting the existing one(s). After all, for business leaders it's the output that matters, not the code.

For 20+ years business leaders have been eager to shed the high overhead of developers via any means necessary while ignoring their most expensive employees' input. Anyone remember Dilbert? It was funny as a kid, and is now tragic in its timeless accuracy a generation later.


> it's 2025 and companies saddled with tech debt continue to prioritize speed of delivery over quality

Maybe. I'm seeing the opposite - yes, the big ships take time to turn, but with the rise of ransomware and increasing privacy regulation around the world, companies are putting more and more emphasis on quality and avoiding incidents.


Also, companies are expected to adapt faster or see their lunch money taken by startups (unless you're in a heavily regulated space). There's a lot of quality opensource software out there, so you don't need much to bootstrap. The tech debt that was ok when you could take your sweet time to deliver a feature is no longer so.


Yes there will be a class of developer like that, but it would only be considered winning if you're satisfied with climbing some artificial institutional hierarchy.


Indeed, the job of an SE is deviating further and further from code, much like how very few people write assembly anymore.

An earlier iteration of your reply said "Is that really winning?" The answer is no. I don't think any class of SE end up a winner here.


Climbing institutional hierarchy usually comes with being rewarded with more credits that I can trade for food, and I like eating food.


Avocado toast strikes again


Can you give an example that's playing out today?


> The tradeoff of higher velocity for less enjoyment may feel less welcome when it becomes the new baseline and the expectation of employers / customers. The excitement of getting a day's work done in an hour* (for example) is likely to fade once the expectation is to produce 8 of such old-days output per day.

That's why we should be against it but hey, we can provide more value to shareholders!


That is certainly how it played out for me. Give your employers an inch and they will take a mile.


> The excitement of getting a day's work done in an hour* (for example) is likely to fade once the expectation is to produce 8 of such old-days output per day.

It's not really about excitement or enjoyment of work.

It's the fear about the __8x output__ being considered as __8x productivity__.

The increase in `output/productivity` factor has various negative implications. I would not say everything out loud. But the wise can read between the lines.


I always feel most productive when I remove more code than I add.


Simplicity is the ultimate sophistication.


> The tradeoff of higher velocity for less enjoyment may feel less welcome when it becomes the new baseline and the expectation of employers / customers

This is what happens with big technological advancements. Technology that enables productivity won’t free up people’s time; it only sets higher expectations of getting more work done in a day.


You won’t enjoy any of the gains, the company will be worth 10x and you’ll get a 10% raise to match inflation


If there are 8 days' worth of work to be done per day (which I doubt), why wouldn’t you want to have it done ASAP? You’re going to have to do it eventually, so why not just do it now? Doesn’t make sense. You act like they’re just making up new work for you to do when previously there wouldn’t have been any.


Yes, work will expand to fill all your available hours due to unaligned incentives between who does the work (the SWE in this example) and who decides the quantity, timeline, and cost of it.

If the SWE can finish his work faster, 8x faster in this case, then the project manager will push to clear backlogs 8x faster. If there are no backlogs, the sales team / clients will demand new features 8x faster. If no new features are needed, finance will apply pressure until costs are 8x lower. If there are no legal, moral, competitive, or physical constraints, the process continues until either a single dev is working all his available time, or working less time for considerably less salary.


The reward for being faster is to do more work.


> The tradeoff of higher velocity for less enjoyment

I'm enjoying exactly what the author describes, so it's different strokes for different folks.

I always found the "code monkey" aspect of software development utterly boring and tedious, and have spent a lot of my career automating that away with various kinds of code generators, DSLs, and so on.

Using an LLM as a general-purpose automated code monkey is a near-ideal scenario for me, and I enjoy the work more. It's especially useful for languages like Go or Java where repetitive boilerplate is endemic.

I also find it helps with procrastination, because I can just ask an LLM to start something for me and get over the initial hump.

> whether we're actually trading away more quality than we realise.

This is completely up to the people using it.


Even if the developer is keeping the quality of the LLM generated code high (by constant close reading of the output, rejecting low quality work and steering with prompts) does this mean the project as a whole is improving? I have my doubts! I'm also skeptical that this developer has increased their velocity as much as they believe, IMHO this has long been a difficult thing to measure.

Overall, is this even a good thing? With this increase in output, I suspect we'll need to apply more pressure to people requesting features to ensure those requests are high quality. When each feature takes half the time to implement, I bet it's easy to agree to more features without spending as much time evaluating their worth.


I write with my hand below the line to avoid smudging. A consequence is that my pen meets the page at quite a shallow angle, which I find perfect for fountain pens but scratchy with ballpoints. These days I do very little handwriting, and my traditional pose (described above) causes hand cramps, but I don't know if that's specific to the odd way I write or if any pose would after being so out of practice.


Did you learn that handwriting pose already as a child? If not, how hard was it to teach yourself writing that way?

