
We should be more worried about what AI will do to the ability of an average human to think.

Not that I think there is a lot of thinking going on now anyway, thanks to our beloved smartphones.

But just think about a time when human ability to reason has atrophied globally. AI might even give us true Idiocracy!


You think smartphones are the cause of atrophy?

No sir, there was nothing there to begin with - if you read recent history, you'll see that it's full of stupidity, with a few rabble-rousers leading entire nations by the nose.

With the mollification of the smartphone, we've merely taken the edge off this killing machine.


> We should be more worried about what AI will do to the ability of an average human to think.

I had a wake-up call on this yesterday. After a recent HN thread about the Zed editor, I decided to give it another try, so I loaded it up, disabled AI, and tried writing some code from scratch. No AI completion, no IntelliSense. Two things came to mind. First, my editor seems so much more peaceful without being told what to do. Second, it was a bit scary how lost I felt. It was obvious that my own ability to communicate through code had declined a bit since I began using AI coding assistants. It turns out that, as expected, coding assistants really are competitive cognitive artifacts. After that experience, I've decided that I am going to do at least part of my coding with all completions turned off. Unfortunately, at work you are paid to produce quickly, so I think my AI-free editor will have to be reserved for personal projects.

Further to your statement about thought: the hallucinations persist, and just last night I got a response about '80s pop culture that was over 50% bullshit. Just imagine what intentional persuasion through LLMs will do to society. Independent thought has never been more important.


Similar rhetoric has always accompanied new technologies: calculators, radio, cameras, phones, computers, smartphones, social networks...

Regulation does more harm than letting people learn from their mistakes.


Yes, we have had this before, and every time it was correct.

What was correct? That we became idiots with new tools?

>That we became idiots with new tools?

Yes, a little bit each time. But AI will finish the job.

Because earlier it was writing skills or attention span that were at stake.

This time it is literally the ability to think.


Pray tell, what makes you impervious to the atrophy and mental decline caused by these inventions? Do you just not use "calculators, radio[s], cameras, phones, computers, smartphones, [or] social networks"? And so, you have avoided the trap of technologies through defiance?

I mean we were seeing this even before AI. It's the same type of person. To slop is human.

It's like for some reason we thought that some good percentage of us aren't just tribal worker drones who fundamentally just want fats, sugars, salts, dopamine, and serotonin. People actively vote against things like UBI, higher corporate taxes, and making utilities public. People actively choose to believe misinformation because it suits their own personal tribal narratives.


This is the way "AI" will deliver on the promise to become more intelligent than humans. Or at least than humans who believe in it.

The proper way to do it is how it was done always. Teach your kids. If they grow up and your teachings stand the test of time, they will pass it to their kids, and so on...

What knowledge do you have that was passed down 3 or more generations to you?

Spoken language, reading, writing, arithmetic, farming, tool use, law, mechanics, tracking, sewing, leatherwork, casting, weaving, forestry, swimming, diving, shooting, general warfare, riding, dishwashing ...

Almost, but not, aviation.

There's probably more. That's my family though - we're still using 1935 CWA cookbooks and occasionally pull a plough with horses or bullocks.

We've obviously added and upgraded those skills over generations, but we still have stuff the great-grandparents used, and we've added in a few more contemporary skills.

I guess we'll lose some of that if we ever get a mechanical dishwasher though.


I'm not sure if OP's blog posts are likely to have the same level of utility to future generations as something like, say, spoken language :)

I was obviously talking about knowledge passed down 3 or more generations within your own family, which is the only interpretation given the topic being discussed. How can people miss the point of a question so completely? I am actually interested in understanding what goes on in your mind to talk about "spoken language" in the context of someone asking how to make a website last a few generations.

You directly asked

> What knowledge do you have that was passed down 3 or more generations to you?

and I answered, incompletely, with a list of knowledge and skills I directly learned within my own family, things that have been passed on within various branches of that family over generations.

I was raised by a large extended family. My grandson had, as a baby, a blanket crocheted by my grandmother for my son - the same grandmother who taught me how to darn, sew, weave, etc., just as she taught my father, who used those skills in the navy to maintain his kit.

I learnt English and other languages from the generation before me .. and the generation before them, as they were not dead when I was a child - and I had living great-grandparents.

Do you not count knowledge of language as something passed on by prior generations?

Many of these people left journals or memoirs .. and a number have portions of their lives collected in national archives:

* https://en.wikipedia.org/wiki/Battle_of_Beersheba_(1917)

* https://en.wikipedia.org/wiki/The_Rats_of_Tobruk

* https://en.wikipedia.org/wiki/Gallipoli_campaign

* https://en.wikipedia.org/wiki/Federation_of_Australia

* https://en.wikipedia.org/wiki/Crimean_War (1853)

Also part of my larger extended family are (still alive) and were (now passed) people who passed down stories and looked after and maintained rock art (both painted and carved) that has survived several tens of thousands of years.

Almost all the geophysical data I gathered is archived, along with data gathered before me, in walk-in fireproof safes - on tapes, as ASCII, on acid-free paper, etc.

In short, what goes through my mind is my direct experience with the transmission of and preservation of knowledge.

How about yourself?


If I remember correctly, IPFS alone won't guarantee that the file will be around.

You have to pay some service to keep it pinned. Also cross your fingers and hope IPFS is still a thing in 100 years.
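
For what it's worth, pinning is an explicit operation on a node. Here is a minimal sketch of what "keeping it pinned" means in practice, assuming the kubo `ipfs` CLI is installed and a local daemon is running (commands and flags as I recall them from kubo - treat them as an assumption):

    # Sketch: add a file to a local IPFS node and pin it so the node's
    # garbage collector won't drop the blocks. Assumes the kubo `ipfs`
    # CLI is on PATH and a daemon is running.
    import subprocess

    def add_and_pin(path: str) -> str:
        # `ipfs add -Q` prints only the resulting CID.
        cid = subprocess.run(
            ["ipfs", "add", "-Q", path],
            capture_output=True, text=True, check=True,
        ).stdout.strip()
        # `ipfs add` already pins on the adding node; the explicit
        # `ipfs pin add` matters when the CID was fetched from elsewhere.
        # Either way the pin is local: for the file to stay reachable,
        # *some* node must keep it pinned, which in practice means a
        # paid pinning service or a node you run yourself.
        subprocess.run(["ipfs", "pin", "add", cid], check=True)
        return cid

    print(add_and_pin("example.txt"))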

I am surprised it still exists, given that it has yet to deliver anything revolutionary.

But I remember Unreal being unreachable for me at that time because I couldn't even dream of getting a graphics accelerator, and it wouldn't even start without one - or was it the sound card requirement that was the blocker?

Everything up to (and including) Unreal Tournament had software rendering. It was one of the selling points when its competitor (Quake 3) was hardware-accelerated only.

Deus Ex too.

Original Unreal had software rendering, just like Quake 1/2 and so on. Though at that point my brother and I had saved up enough beans to buy a Voodoo2.

I think it was the graphics card. I remember getting a paper route and waking up at 5am every morning to save up the money for my Voodoo card. Was absolutely mind blowing as a 13 year old.

>I can buy on the used market a ~2018 laptop with a 15W quad core CPU, 8GB RAM, 256 NVME and 1080p IPS display, that's orders of magnitude more capable..

But it won't be as reliable; mostly, the motherboards won't last long.


Don't know what your source is for that, but that's not my experience, and I've had dozens of laptops pass through my hands due to my hobby.

The ticking time-bomb lemons with reliability or design issues will just die in the first 2-4 years like clockwork, but if they've already survived 6+ years without any faults, they'll most likely be reliable from then on as well.


Bathtub curve is extremely common https://en.wikipedia.org/wiki/Bathtub_curve
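
A toy illustration of why it's called a bathtub (a standard textbook construction with made-up Weibull parameters, not data): superpose a falling infant-mortality hazard, a constant random-failure rate, and a rising wear-out hazard.

    # Toy bathtub curve: total failure rate = decreasing early-failure
    # hazard + constant random failures + increasing wear-out hazard.
    # Weibull hazard: h(t) = (k / lam) * (t / lam) ** (k - 1).
    # All parameters here are invented purely for illustration.

    def weibull_hazard(t: float, k: float, lam: float) -> float:
        return (k / lam) * (t / lam) ** (k - 1)

    def bathtub_hazard(t_years: float) -> float:
        infant = weibull_hazard(t_years, k=0.5, lam=2.0)    # k < 1: falls
        wearout = weibull_hazard(t_years, k=5.0, lam=12.0)  # k > 1: rises
        return infant + 0.01 + wearout                      # 0.01 = base rate

    for year in (1, 3, 6, 10, 15):
        print(f"year {year:2d}: hazard ~ {bathtub_hazard(year):.3f}")
    # Falls from ~0.36 at year 1 to ~0.18 at year 6, then climbs past
    # 1.1 by year 15: early lemons die fast, survivors coast, old age bites.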

>survived 6+ years without any faults, they'll most likely be reliable from then on as well

OK, let us say they'll last 4 more years, so 10 years total lifespan.

A Pi would last a lot longer.


>let us say they'll last 4 more years

Why not 50 more years, if we're just making up numbers? I still have an IBM ThinkPad from 2006 in my possession with everything working. I also see people with MacBooks from the era with the light-up Apple logo in the wild and at DJ gigs.

>A Pi would last a lot longer.

Because you say so? OK, sure.


In your comment you didn't say Apple computers or ThinkPads. Those are different. I was talking about a plain old vanilla business-class laptop (because we are talking about a Raspberry Pi alternative).

I have computers that are ~20 years old or even more and still work fine. My main computer, which I just replaced, is ~14 years old (with some components even older than that), was used every single day, and is now a perfectly functioning server. I have stacks of SFFs and mini PCs from eBay going back to 2008, but most from 2012-2015, which have been running virtually uninterrupted for a decade and are still working fine. I have several laptops from different OEMs, business and consumer lines, that are as old as 2008 and have been used regularly for at least 10 years, all still fine.

I understand what you're saying but saying it isn't enough. There's nothing to support your claim.


You're contradicting yourself.

I was referring to your original comment

>I can buy on the used market a ~2018 laptop with a 15W quad core CPU, 8GB RAM, 256 NVME and 1080p IPS display, that's orders of magnitude more capable..


What makes you think so? Just a feeling? A vibe?

About what, exactly?

That if a laptop is 6 years old it will only last 4 more years. Or that a Pi will last more than 10.

If it is a generic laptop, yes, 10 years is a stretch. The components used in the motherboard are probably not high quality enough to last more than 10 years. A manufacturer has no incentive to put high-quality (and probably costlier) parts in a laptop whose only selling point is being cheap for the "features", not reliability or longevity.

One might get lucky with such a laptop, but I won't count on it.


Again, is that just a feeling, or do you have some data to actually show this? In my experience even old basic Acer laptops easily last more than 10 years - probably without the battery, and married to the charger forever now, but they will work fine. But I don't go on the internet and tell everyone laptops last more than 10 years just because I know of a few Acers lasting that long. Likewise, do you have any statistics on the longevity of Raspberry Pis?

Sure, but $200 on eBay will get you something along the lines of a Dell Latitude, with decent build quality, cost-optimized more than a flagship workstation-class laptop, but certainly not designed to squeeze out the last penny at the expense of reliability or repairability like the cheapest consumer models.

And if you buy a 5-year-old corporate laptop in very good condition with minimal visible wear on the keyboard and touchpad, it was likely only used as a desktop replacement connected to a dock, so unlikely to have suffered abuse not apparent from visual inspection alone.

If you're planning to use it as an actual laptop, price out a replacement battery before purchase, as battery capacity will degrade over time, even if the laptop is exclusively used on AC, so will always be something of a crapshoot.

Otherwise, I'd expect the rate of component failure to be no higher than for any other lightly-used laptop of similar vintage, which is low.


What makes you assume that the Raspberry Pi is using higher quality components?

One thing is that a Raspberry Pi has fewer of them, so there's less chance of one becoming faulty.

Regarding higher-quality components: I think that for the use case of a Raspberry Pi (I mean the kinds of things it is supposed to be used for), reliability is more important.

This also matches with my experience.


> Regarding higher-quality components: I think that for the use case of a Raspberry Pi (I mean the kinds of things it is supposed to be used for), reliability is more important.

That you think reliability is more important for a Raspberry Pi use case than for a laptop doesn't somehow magically make it a fact that its components are of higher quality than your average laptop's. You only speculate, and then speculate further on the basis of your original speculation. That's not how you arrive at a basis for a factual claim or an estimate.


Is it a fanless laptop? Fans aren't any different from filters in cars. They fail and need to be replaced.

3-5 years for a cheap laptop [0].

3-5 years of office use for a Pi. [1]

Sure, there are other numbers to find as well, but I'd suggest that they're pretty comparable in the way they handle environments. If one would fail, so would the other.

[0] https://pcpatching.com/2025/11/extend-your-pcs-life-how-long...

[1] https://raspberrypicase.com/how-long-does-a-raspberry-pi-las...


It is the imitation of intelligence.

It works because people have answered similar questions a million times on the internet and the LLMs are trained on it.

So it will work for a while. When the human-generated stuff stops appearing online, LLMs will quickly decline in usefulness.

But that is enough time for the people who might think it is going to last forever to make huge investments into it, and for the AI companies to get away with the loot.

Actually it is the best kind of scam...

EDIT: Another thought. Thus it seems that AI companies actually have an incentive to hinder new developments, because new things mean that their model is less useful. With widespread dependence on AI, they might even get away with manipulating the population into stagnation.


There is Intelligence and there is Imitation of Intelligence. LLMs do the latter.

Talk to any model about deep subjects. You'll understand what I am saying. After a while it will start going around in circles.

FFS, ask it to make an original joke, and be amused..


Many animals are clearly intelligent, but I can't talk to them at all.


If you understood anything about this topic you would know that imitation of intelligence requires intelligence. It's the same thing.


Give it some thought.


It has been given significant thought already. This isn't a new topic.

> After a while it will start going around in circles.

so like your average human

> FFS ask it to make an original joke, and be amused..

let's try this one on you - say an original joke

oh, right, you don't respond to strangers' prompts, thus you have agency, unlike an LLM


>so like your average human

If an average human had seen and read all that has been written till now, I bet they could keep the conversation going for quite a long time...

>say an original joke

I asked an LLM if it had a good night with sweet dreams. It said, "I don't sleep, and I only dream when I work!"


>how many times faster my iPhone 17 Pro Max is..

Sadly, most of that power is not working for you most of the time, but against you: spying on you, tracking you, and manipulating you.

Bet that was not included in your sci-fi dreams in the '70s..


Oh, but we had The Forbin Project, its sequel Colossus, and later WarGames. Not to mention Star Trek episodes with malignant computers. And I Have No Mouth, and I Must Scream.

In the 70s, science fiction fed me Cold War fears of world-controlling mainframes.


Colossus: The Forbin Project is simply a renamed release of The Forbin Project, from a few months after the latter had a poor opening. It didn't help the box office much. I liked it, back when it was easy to dismiss as an impossible dystopia.


Oh. Well, the sequel in print was named Colossus. It is about continuing life under the reign of the supercomputer.


I'd love to see you substantiate that - bet you can't.


I certainly can't. I'm running GrapheneOS.


I would like to see how things will be when using AI requires half of a dev's current paycheck.


>I made the wrong choice with software development.

If you didn't like working with computers, then you (and another gazillion people who chose it for the $$$) probably made the wrong choice.

But totally depends on what you wanted to get out of it. If you wanted to make $$$ and you are making it, what is the problem? That is assuming you have fun outside of work.

But if you wanted to be the best at what you do, then you've got to love what you are doing. Maybe there are people who have superhuman discipline, but for normal people, loving what they do goes a long way towards that end.


> If you didn't like working with computers, then you probably made the wrong choice.

This doesn't match what I have seen in other industries. Many auto mechanics I know drive old Buicks or Fords with the 4.6L V8 because those cars are reliable, and the last thing they want to do on a day off is work on their own car. I know a few people in other trades - plumbers, electricians, chefs - and the pattern holds pretty well for them as well.

You can enjoy working with computers and also enjoy not working in your personal time.


Exactly this. I love writing code and solving problems. In my 20s and very early 30s I worked a lot of long hours and tried my best to always be learning new things and upskilling, but it's never-ending. It's hard sometimes not to look back and think about the hours I spent on code instead of building stronger friendships and relationships.


Every career path presents you with some version of this opportunity cost dilemma. The good news is you are not stuck - you can recalibrate to allow more of what you now know you want, while still maintaining a grip on the part of the job/career/enterprise that you actually excel at, and jettisoning the rest.


> If you didn't like working with computers, then you (and another gazillion people who choose it for the $$$) probably made the wrong choice.

The problem is the field is changing, fast. I love writing code... I'm not so sure I love prompting Claude, coordinating agents and reviewing +30k vibe-coded PRs.


> If you didn't like working with <insert anything>, then you ...

This type of argument can hold for any profession, and yet we aren't seeing this pattern much in other white-collar professions. Professors, doctors, economists, mechanical engineers... it seems like pretty much everybody made the wrong choice, then?

I think this is the wrong way to look at it. OP says that he invested a lot of time into becoming proficient in something that today appears to be very close to partial extinction.

I think the question is legit, and he's likely not the only person asking themselves this question.

My take on the question is the ability to adapt and learn new skills. Some will succeed, some will fail, but staying in a status-quo position will more likely lead to failure than to success.


Your first point hits the nail on the head. We are expected to have side projects and to keep up with new things (outside of work), but most other jobs don't have that. I would be okay with my work sending me off for additional training, on company time, but I don't want it to consume the time I have left after work.


I don't know why, but our profession is for some reason different from others in this respect, and people often like to think that this is the norm - that if you're not doing it, you're not worthwhile. I think it has to do with some interesting psychological traits of the people who are generally attracted to this profession, but also with the companies who adopted those mental hacks as a means to attract people who are 100% in. LeetCode-style interviews, where you virtually have to spend months preparing, even as a senior, are one example of that; I also remember the age, not too long ago, when your resume wouldn't even get a look if you didn't have several open-source repositories/contributions to show. That is to some extent still true today.

There are plenty of such examples, but both of these imply that you're ready to devote a lot of your extra time, before or after the job, just so you can show you're relevant in the eyes of the decision makers. This normally means that you're single, that you have no kids, no family, no hobbies but programming, etc. This works when you're in your 20s, and only up to a certain point, unless you become a weirdo in your 30s and 40s without any of these.

However, in an age of uncertainty, it may become the new normal to devote extra effort just to remain not competitive but a mere candidate for the job. Some will find the incentive for this extra pain, some will not, but I think it won't be easy. Perhaps in 5 years' time we will only have "AI-applied" engineers developing or specializing their own models for given domains. Writing code as we have it today is, I think, already a thing of the past.


> for some reason

I think the reason is quite simple. Software is endlessly configurable, and thus there is a much higher chance of getting the configuration wrong.

This is what makes it attractive, and makes it hard to get right.

You cannot get good at it without making a ton of mistakes. When companies look for people with a lot of side projects, they are looking for people who have already made such mistakes and learned from them - preferably on their own time, not on paid company time.


That argument would be sound if no other profession existed that is at least comparably complex.


It is not about complexity.

I'll list some attributes of software development that make it unique.

* No hard rules or textbooks to follow; the industry as a whole still goes through costly mistakes and recovery cycles.

* No easy way to gauge the requirement-fit of the thing you made. Only time will tell.

* Cheap (financially) to practice, make mistakes and learn.


You're making some strong assumptions about other industries, and they are incorrect. All of that exists elsewhere and is not as unique to software as you may think. Things are never that simple. Your argument reads more as a justification of the status quo, and gatekeeping, than as being objective. I'm sure doctors would say something similar about their profession too, but that doesn't necessarily make it true. Software engineering is a demanding profession, but it is not as special as we like to think it is.


>doctors would have said something similar for their profession

Actually, that applies to doctors. A doctor who is not curious and is not willing to learn and research on their own initiative is only a marketing hand of pharma.

But it is quite hard for doctors to do any real research independently. They can't really experiment on real people...

Software is really special.


So a software engineer who is not curious enough to invest 15+ hours daily over the course of years is just a marketing hand of ... what ... the programming language of their choice, or the company they work for?

Don't get me wrong. I am that guy who probably over-invested in the development of his skills, but I don't think it's a normal thing to expect.


>So a software engineer who is not curious enough to invest 15+ hours daily over the course of years is just a marketing hand of ..

That does not apply here, because more often than not, we don't prescribe products/services that our clients must go out and buy, without exception.

>it's a normal thing to expect.

It is not a normal thing to expect, because in other fields there are few people who can afford to do that. So an employer cannot really pick someone from that pool.

But in software, it is possible if one chooses to do it. So the pool is a lot bigger, and it becomes feasible for an employer to pick someone from there instead of from the I-am-only-as-good-as-I-am-paid-to-be pool.


> That does not apply here, because more often than not, we don't prescribe products/services that our clients must go out and buy, without exception.

You know that treating patients is not only about picking the right medication and writing prescriptions? It's about diagnosing, testing hypotheses, optimizing for the particular patient's case, learning about all the specific factors of their environment, including genetics; and then there are surgeons, etc.

And yet I don't see doctors on a time-spending spree to become exquisite in all of those things. Nor do I see hospitals or clinics running such knowledge and ability tests on their potential employees. The stakes are much higher in medicine than they are in software, so it makes no sense at all to argue that doctors cannot "afford" it. They can; they have books and practice the same way we do. I don't get to modify the production system every day, and yet I am constantly learning how not to make those same production systems go down when I do.

> It is not a normal thing to expect because in other fields there are few people who can afford to do that.

It's not a normal thing in software either, you know? Let's please stop normalizing things which are not normal. If there is one thing that makes me happy in this new era of AI-assisted development, it is that all this bs is coming to an end.


I am not normalizing anything!

I am just describing the logical behavior of an employer who wants to get the best person for the job.

About the other thing, I think I will let you have the last word since I feel that we are speaking past each other.


Software development as a career was born, reached maturity and died in less than 100 years (being generous).

It never had time to develop into a truly professional field like medicine, law or engineering.


I don't know where you got the idea that it's dead or dying as a discipline. The need for software solutions is clearly bigger than ever and growing. And what I see, especially as LLM coding becomes more prevalent, is a breakneck decline in the quality of delivered software and a downright explosion of security issues and incidents.


AI is making it so that "working with computers" is no longer a viable career path. At least that's the goal.

As AI allows more and more people to accomplish tasks without a deep understanding of computers, "working with computers" will be as much of a marketable job skill as "working with pencils" was 50 or 100 years ago.

