Amen. Its one real "downside" in this day and age is that it requires fairly undivided attention to use... that aside, it's without question my favorite way to interact with information.
On that note, a big thank you to whoever added "read this page" to Safari on iOS! Being able to turn long form articles into ad-hoc podcasts has been a game changer for me.
Canva destroyed graphic design well before LLMs caught up, but UX is still (somewhat surprisingly) on the ropes.
My bet: front end devs who need mocks to build something that looks nice get crowded out by UX designers with taste as code generation moves further into "good enough" territory.
Then those designers get crowded out as taste generation moves into "good enough" territory.
Yeah, this is notable. If true, every mobile carrier getting hit at the same time says "coordinated incident", from whatever source and for whatever purpose.
Edit to say: my Verizon Fios and cell service are both working fine, no noticeable interruption at any point today.
Second edit to say: never mind, downdetector's home page normalizes report spikes so 1k and 100k both look identical.
This isn't inherent, just a side effect of a poorly designed text UI. Suggestions on the input, manual commands, or honest answers to the question "what can you do" all do as good a job as a GUI does, and sometimes a better one.
So many of the complaints I hear about TUIs just come down to bad design. Even a UI that's nothing but one input box and textual responses requires thoughtful design.
That's design as in function, not color palette. Although... that too.
OK - so in the case of a text interface to a constrained tool, you are effectively mapping free text down to some underlying set of function calls and parameters, and you could ask the tool to describe those.
For more general AI tools, I guess it becomes harder to give a succinct description - and so that's still a bit of trial and error (even if you have good feedback).
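To make the constrained case concrete, here's a minimal sketch of the mapping I mean, in Python. The commands and handlers are entirely made up for illustration; the only point is that "what can you do" gets a first-class, honest answer.

  # Toy constrained text UI: free text maps onto a small set of
  # function calls, and "what can you do" gets an honest answer.
  # Everything here is hypothetical, purely for illustration.
  def search(term: str) -> str:
      return f"searching for {term!r}..."

  def status(_: str) -> str:
      return "all systems nominal"

  COMMANDS = {"search": search, "status": status}

  def describe() -> str:
      # The honest answer: enumerate the underlying commands.
      return "I can do: " + ", ".join(sorted(COMMANDS))

  def handle(line: str) -> str:
      if line.strip().lower() in ("help", "what can you do"):
          return describe()
      verb, _, rest = line.strip().partition(" ")
      fn = COMMANDS.get(verb)
      return fn(rest) if fn else f"unknown command {verb!r}; {describe()}"

  print(handle("what can you do"))   # I can do: search, status
  print(handle("search gemma 3"))    # searching for 'gemma 3'...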
Steam reached a new peak of 42 million concurrent players today [1]. An average/mid-tier gaming PC uses 0.2 kWh per hour [2]. 42 million * 0.2 gives 8,400,000 kWh per hour, or 8,400 MWh per hour.
By contrast, training GPT-3 was estimated to have used 1,300 MWh of energy [3].
This does not account for the training costs of newer models, nor inference costs. But we know inference is extraordinarily inexpensive and energy efficient [2]. Even the lowest estimate for one hour of gaming at Steam's peak concurrent player count is 6.5x all of the energy that went into training GPT-3.
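For anyone who wants to check the arithmetic, here it is spelled out with the same numbers as above (a quick sanity check, nothing more):

  players = 42_000_000            # peak concurrent Steam players [1]
  kwh_per_player_hour = 0.2       # average/mid-tier gaming PC [2]
  gpt3_training_mwh = 1_300       # estimated GPT-3 training energy [3]

  gaming_mwh_per_hour = players * kwh_per_player_hour / 1_000  # kWh -> MWh
  print(gaming_mwh_per_hour)                      # 8400.0
  print(gaming_mwh_per_hour / gpt3_training_mwh)  # ~6.46, i.e. ~6.5x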
I was skeptical of the LLM energy use claim, so I went looking for numbers on energy usage in a domain that most people do not worry about or actively perceive as a net negative. Gaming is a very big industry ($197 billion in 2025 [1], compared to the $252 billion in private AI investment for 2025 [2]) and mostly runs on the same hardware as LLMs. So it's a good gut check.
I have not seen evidence that LLM energy usage is out of control. It appears to be much less than gaming's. But please feel free to provide sources that demonstrate otherwise.
The question is whether claims about AI energy use have substance, or whether there are other industries that should be more concerning. People are either truly concerned about the cost of energy, or it's a misplaced excuse to reinforce their negative opinions.
I see no point in making this a numbers game. (Like, I was supposed to say "five" or something?)
Let's make it more of a category thing: when AI shows itself responsible for a new category of life-saving technique, like a cure for cancer or Alzheimer's, then I'd have to reconsider.
(And even then, it will be balanced against rising sea levels, extinctions, and other energy use effects.)
Search through GitHub for commits authored by .edu, .ac.uk, etc. emails and spend a few days understanding what they’ve been building the past few years. Once you’ve picked your jaw up off the floor, take another 10 minutes to appreciate that this is just the public code by some researchers, and is crumbs compared to what is being built right now behind closed doors.
Tenured professors are abdicating their teaching positions to work on startups. Commercial labs are pouring billions into tech that was unreachable just a few years ago. Academic labs are scaling back their intern intake 20x. Historically reclusive companies are opening their doors to build partnerships in industry.
The scale of what is happening is difficult to comprehend.
Local LLMs that you can run on consumer hardware don't really do anything though. They are amusing, maybe you could use them for basic text search, but they don't have any real knowledge like the hosted ones do.
Gemma 3 27B, smaller models in the 8-16B range, and anything up to about 32B can be run on hardware that fits in the "consumer" bracket. RAM is more expensive now, but most people can afford a machine with 32GB and maybe a small graphics card.
Small models don't have as much world knowledge as very large models (proprietary or open source), but that's not always needed. They can still do a lot of stuff. OCR, image captioning, tagging, following well-defined instructions, general chat, and some coding are all things local models do pretty well.
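For a concrete example, here's a rough sketch of tagging text with a local model. It assumes Ollama is running locally on the default port with a Gemma model already pulled; the exact model tag is an assumption, swap in whatever you have.

  # Ask a locally hosted model for topic tags via Ollama's HTTP API.
  # Assumes Ollama is running locally with a model pulled;
  # the model tag below is an assumption.
  import json, urllib.request

  payload = {
      "model": "gemma3:12b",  # assumed tag; any local model works
      "prompt": "Give three short topic tags for: 'RAM prices rose sharply this year.'",
      "stream": False,
  }
  req = urllib.request.Request(
      "http://localhost:11434/api/generate",
      data=json.dumps(payload).encode(),
      headers={"Content-Type": "application/json"},
  )
  with urllib.request.urlopen(req) as resp:
      print(json.loads(resp.read())["response"])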
I said this elsewhere: the whole argument is so boring. There are people trying to make money by pushing the tech (the annoying videos I come across), but the most vehement side on HN is the anti-LLM crowd.
Within five years I think the debate will be over, and I think I know what the outcome will be.
Merry Christmas, everyone! Continually grateful for this community: stumbling into it as a teenager changed my life and I never get tired of spinning out about new tech big and small with the folks here. Stay safe, everybody
As usual, size matters. Equating typical 401(k)s with the economic activity of the class GP is referring to is... absurd.
But pretending small-time participation in public markets is the same as billionaire participation in private markets is a great way to convince the lower classes our financial system isn't structured to move wealth away from them.