
I think maybe you don't understand that they don't have enough GPUs to do this, and money can't buy enough GPUs to do it.


This is the bottleneck. EUV photolithography is one of the hardest engineering challenges ever faced; it's like trying to drop a feather from space and guaranteeing it lands on a specific blade of grass. Manufacturing these GPUs at all requires stretching the limits of what is physically possible in multiple domains, let alone producing them at scale.


Thanks for this explanation! :) (As someone without knowledge of the hardware process, I appreciated it.)

It is SO amazing that we have such a driving force (LLMs/consumer AI) for this (instead of stupid cryptocurrency mining or high-performance gaming). This should drive innovation pretty strongly, and I am sure the next "leap" in this regard (processing hardware) will put technology on a completely different level.


That's a cool last-minute detour from the techno gods. We can incentivize AI to work on crypto mining and get our fully engaged primeval lives back ;)


Not disagreeing, just curious: why can't money buy enough GPUs? OpenAI's prices seem low enough that they could reasonably charge 2x or more to companies eager to get on the best models now.


They're giving people access to GPT-4 via Bing for free, but apparently can't accommodate paying API users!?

That makes no sense.

What makes much more sense -- especially if you listen to his interviews -- is that Sam Altman doesn't think you can be trusted with the power of GPT-4 via an API unless it has first been aligned to death.


Microsoft is giving that away for free, but I assume they're paying OpenAI for it.

And with such a big anchor tenant, it's reasonable that you would prioritize them if GPUs are in short supply.


> Microsoft is giving that away for free, but I assume they're paying OpenAI for it.

Yeah, but Microsoft already gets 75% of the profits OpenAI makes, so it's not the same price for them as it is for the rest of us.


It's exactly the same. If they could make 75 cents by selling that compute to someone else for $1, then using it to provide the Bing chat service instead means losing that 75 cents.


Why do you assume that the same amount of computing power would be used by someone else? There are only so many customers. You can't magically start selling more compute if you stop using it yourself.


At scale, GPUs are capacity-constrained right now, so if Microsoft stopped using them, that capacity would be absorbed by others.


$10 billion


Bing GPT-4 is a much smaller and less capable model than regular GPT-4.


"Free". The worst four-letter F word in America.


I think GPUs are in short supply and Nvidia can't make enough to keep up with demand.


To a first approximation, NVIDIA's increased share price is because AI developers, including OpenAI, bought as many GPUs as NVIDIA can make.


This may be true, but isn't their official stance that their models are too powerful and could destroy Western civilization as we know it?



