
The silent victory here is that this seems to have been built to be faster and cheaper than o3 while still delivering a reasonable capability jump, which is the kind of jump that matters for the scaling laws

On the other hand, if it's just getting bigger and slower, that's not a good sign for LLMs

Yeah, this very much feels like "we have made a more efficient/scalable model and we're selling it as the new shiny but it's really just an internal optimization to reduce cost"

Significant cost reduction while providing the same performance seems pretty big to me?

Not sure why a more efficient/scalable model isn't exciting


Oh it's exciting, but not as exciting when sama pumps GPT-5 speculation and the market thinks we're a stone's throw away from AGI, which it appears we're not.

Personally, I am more concerned about accuracy than speed.

Yeah, but OpenAI is concerned with getting on a path to making money, as their investors will eventually run out of money to light on fire, so...


