
There are no scaling issues to speak of. These AIs are stateless, which makes them embarrassingly parallel. They can always just throw more GPUs at it. Microsoft even had some videos where they bragged about how these models can be run on any idle GPU around the world, dynamically finding resources wherever they are available!
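The statelessness is exactly what makes this embarrassingly parallel: each request depends only on its own input, never on other requests, so a batch can be fanned out across any pool of workers with no coordination. A minimal sketch, with a toy `run_inference` standing in for a real (hypothetical) model call:

```python
from concurrent.futures import ProcessPoolExecutor

def run_inference(prompt: str) -> str:
    # Stand-in for a stateless model forward pass: the output depends
    # only on this prompt, so calls can run in any order, on any worker.
    return prompt.upper()

if __name__ == "__main__":
    prompts = ["hello", "world", "gpu"]
    # Fan the independent requests out across worker processes;
    # no shared state means no locks and no cross-talk between them.
    with ProcessPoolExecutor(max_workers=3) as pool:
        results = list(pool.map(run_inference, prompts))
    print(results)
```

Because the calls share nothing, scaling is just a matter of raising `max_workers` (or, in the real case, pointing requests at more GPUs).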

If there aren't enough GPUs at a given price point, raise prices, then lower them later when more GPUs become available.

They did it with GPT 3.5, so why not GPT 4?



The extra GPUs simply don't exist right now. Nvidia is at production capacity, and OpenAI has to compete with other companies bidding on the same chips. It's not a matter of raising the price point: the GPUs they want to buy have to be ordered months in advance.


> embarrassingly parallel

I don’t see why such a thing should be embarrassing. Or, at least no more so than being acute or obtuse. Just as long as nothing is askew.


"Embarrassingly parallel" is a term of art: https://en.wikipedia.org/wiki/Embarrassingly_parallel


Yes, I was anthropomorphising it back into the realm of human emotion, wherein the angles at which one's lines run need not be a source of emotional distress. Excepting perhaps the innate sadness of two parallel lines destined to be forever at each other's sides, yet never to meet across the infinite plane.



