
I mean fair enough, I probably don't know as much about hardware and physics as you



Just pointing out that there are limits and there’s no reason to believe that models will improve indefinitely at the rates we’ve seen these last couple of years.


There is reason to believe that humans will keep trying to push the limitations of computation and computer science, and that recent advancements will greatly accelerate our ability to research and develop new paradigms.

Look at how well DeepSeek performed with the limited, outdated hardware available to its researchers. And look at what demoscene practitioners have accomplished on much older hardware. Even if physical breakthroughs ceased or slowed down considerably, there is still a ton left on the table in terms of software optimization and theory advancement.
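To make that concrete, here is a toy comparison you can run yourself (the size and exact timings are just illustrative, not a claim about any particular model): the same matrix product computed with a naive Python triple loop versus NumPy's a @ b, which dispatches to a BLAS routine tuned for caches and vector units. On the exact same chip the gap is typically several orders of magnitude, all of it coming from software:

    import time
    import numpy as np

    n = 256
    a = np.random.rand(n, n)
    b = np.random.rand(n, n)

    def naive_matmul(a, b):
        # Straight from the definition: no blocking, no vectorization.
        n = a.shape[0]
        c = np.zeros((n, n))
        for i in range(n):
            for j in range(n):
                s = 0.0
                for k in range(n):
                    s += a[i, k] * b[k, j]
                c[i, j] = s
        return c

    t0 = time.perf_counter(); naive_matmul(a, b); t1 = time.perf_counter()
    t2 = time.perf_counter(); a @ b;              t3 = time.perf_counter()
    print(f"naive: {t1 - t0:.3f}s   blas: {t3 - t2:.6f}s")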

And remember just how young computer science is as a field, compared to other human practices that have been around for hundreds of thousands of years. We have so much to figure out, and as knowledge begets more knowledge, we will continue to figure out more things at an increasing pace, even if it requires increasingly large amounts of energy and human capital to make a discovery.

I am confident that if it is at all possible to reach human-level intelligence at least in specific categories of tasks, we're gonna figure it out. The only real question is whether access to energy and resources becomes a bigger problem in the future, given humanity's currently extraordinarily unsustainable path and the risk of nuclear conflict or sustained supply chain disruption.


> And remember just how young computer science is as a field, compared to other human practices that have been around for hundreds of thousands of years.

How long do you think Homo sapiens have been on Earth and how long has civilization been here?

I’ve been programming since ’89. I know what you can squeeze into 100k.

But you can only blast so much electricity into a dense array of transistors before it melts the whole thing and electrons jump rails. We hit that limit a while ago. We’ve done a lot of optimization of instruction caching, loading, and execution. We front-loaded a ton of caching in front of the registers. We’ve designed chips specialized to perform linear algebra calculations and scaled them to their limits.
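You can even see that memory hierarchy from ordinary userspace code. A toy NumPy illustration (cache line size and the exact ratio depend on the machine, so treat the numbers as a sketch):

    import time
    import numpy as np

    a = np.random.rand(64_000_000)   # ~512 MB of doubles
    contiguous = a[:8_000_000]       # 8M adjacent elements
    strided = a[::8]                 # 8M elements, roughly one per 64-byte cache line

    t0 = time.perf_counter(); contiguous.sum(); t1 = time.perf_counter()
    t2 = time.perf_counter(); strided.sum();    t3 = time.perf_counter()
    print(f"contiguous: {t1 - t0:.4f}s   strided: {t3 - t2:.4f}s")

Both sums do the same number of additions, but the strided one drags about eight times as much data through the caches, so it typically runs several times slower. That is the wall all that caching exists to hide.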

AI is built on scaling the number of chips across the board, which requires massive amounts of power and, with it, massive heat dissipation. That’s why we’re building out so many new data centres: each one requires land, water, and new sources of electricity generation so that supply for other uses can keep up… and those new sources are mostly methane and coal plants.

Yes, we might find local optimizations in training to lower the capital and external costs… but they will be a drop in the bucket at the scale we’re building out this infrastructure. We’re basically brute-forcing the scale-up here.

And computer science might be older than you think. We just used to call it logic. It took some electrical engineering innovations to make the physical computers happen but we had the theoretical understanding of computation for quite some time before those appeared.

A young field, yes, and a long way to go… perhaps!

But let’s not believe that innovation is magic. There’s hard science and engineering here. Electrons can only travel so fast. Transistor density can only scale so much. Etc.


> How long do you think Homo sapiens have been on Earth and how long has civilization been here?

I already corrected my typo in a child comment.

> We’re basically brute-forcing the scale-up here

Currently, but even that will eventually hit thermodynamic and socioeconomic limits, just as single chips already have.

> And computer science might be older than you think. We just used to call it logic.

In my opinion, two landmark theory developments were type theory and the lambda calculus. Type theory was conceived to get around Russell's paradox and others, which formal logic could not do on its own.
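The self-application at the heart of Russell's paradox has a direct computational analogue. A toy sketch in Python (the typed rejection described at the end is what a simply typed lambda calculus does; Python itself won't stop you):

    # (\f. f f) applied to itself: the classic non-terminating lambda term.
    omega = lambda f: f(f)

    try:
        omega(omega)   # keeps reducing to itself
    except RecursionError:
        print("no normal form: the reduction never bottoms out")

    # A simple type system rejects f(f) outright: f would need a type t
    # satisfying t = t -> s, and forbidding that kind of self-application
    # is the same move that blocks a set from containing itself.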

As far as hardware goes, sure, we had mechanical calculators in the 17th century, Babbage's Analytical Engine in the 19th century, and Ada Lovelace's program for it, but it wasn't until the mid-20th century that computer science coalesced as its own distinct field. We didn't use to call computer science logic; it's a unification of physical advancements, logic, and several other domains.

> Electrons can only travel so fast.

And we have no reason to believe that current models are at all optimized on a software or theoretical level, especially since, as you say yourself, we are currently just focused on brute-forcing innovation, as it's the more cost-effective solution for the time being.

But as I said, once theoretical refinement becomes more cost-effective, we can look at the relatively short history of computer science to see just how much can be done on older hardware with better theory:

>> Even if physical breakthroughs ceased or slowed down considerably, there is still a ton left on the table in terms of software optimization and theory advancement.


I agree. And if human civilization survives, the energy and resource concerns will be short-lived on a civilizational timescale, especially as we make models more efficient.

The human brain runs on just 20 watts of power, so it seems to me that human-level intelligence should be reachable in principle, even if we get there by spending far more power to make up for the billions of years of evolutionary refinement the brain has had.
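Back-of-the-envelope on that 20-watt figure (the 30-year span is my own assumption, just to get an order of magnitude):

    BRAIN_WATTS = 20                                      # figure cited above
    HOURS_PER_YEAR = 24 * 365

    kwh_per_year = BRAIN_WATTS * HOURS_PER_YEAR / 1000    # ~175 kWh/year
    kwh_per_30_years = kwh_per_year * 30                  # rough "training run" for one human expert

    print(f"{kwh_per_year:.0f} kWh/year, {kwh_per_30_years:.0f} kWh over 30 years")

A few thousand kilowatt-hours over decades, on the order of what a single household appliance uses over the same period, which is a useful existence proof to keep in mind next to what current training infrastructure consumes.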


* hundreds or thousands, not of





