Hacker News

Law of diminishing returns.

We’re talking about less than a 10% performance gain, for a shitload of investment in data, time, and money.



I'm not sure what "10% performance gain" is supposed to mean here; but moving from "it does a decent job 95% of the time but screws it up 5%" to "it does a decent job 98% of the time and screws it up 2%" to "it does a decent job 99.5% of the time and only screws it up 0.5%" is a major qualitative improvement at each step.
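To see why those shrinking error rates are qualitative jumps, here is a toy calculation (the per-step error rates are the hypothetical figures from the comment above): small drops in per-step error rate compound dramatically over a multi-step task.

```python
def chance_all_steps_ok(error_rate: float, steps: int) -> float:
    """Probability that `steps` independent steps all succeed."""
    return (1.0 - error_rate) ** steps

# Hypothetical per-step error rates from the comment: 5%, 2%, 0.5%
for err in (0.05, 0.02, 0.005):
    p = chance_all_steps_ok(err, steps=20)
    print(f"per-step error {err:.1%}: 20-step task succeeds {p:.1%} of the time")
```

At a 5% per-step error rate a 20-step task succeeds only about a third of the time; at 0.5% it succeeds roughly 90% of the time, which is why "only a few percent" of headline improvement can change what the system is usable for.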


Yeah I think that throwing more and more compute at the same training data produces smaller and smaller gains.
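The diminishing-returns intuition can be sketched with an illustrative power law. Assume (purely hypothetically; the exponent 0.05 is made up for illustration) that loss scales as compute**-0.05; then each 10x increase in compute buys a smaller absolute improvement than the last:

```python
def loss(compute: float, alpha: float = 0.05) -> float:
    """Toy power-law loss curve: loss falls as compute rises, but ever more slowly."""
    return compute ** -alpha

prev = loss(1e21)
for c in (1e22, 1e23, 1e24):
    cur = loss(c)
    print(f"compute {c:.0e}: loss {cur:.4f} (gain over previous 10x: {prev - cur:.4f})")
    prev = cur
```

Each successive order of magnitude of compute yields a strictly smaller gain, which is the shape of the "smaller and smaller gains" claim.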

Maybe quantum compute would be significant enough of a computing leap to meaningfully move the needle again.


What exactly is being moved? It's trained on human data; you can't make the code more perfect than what humans have already written out there.


Some think it's possible; I don't. So we actually agree.





