
Until January this year I mostly used Google Colab for both LLMs and deep learning projects. In January I spent about $1800 on an Apple Silicon M2 Pro with 32 GB of RAM. When I first got it, I was only so-so happy with the models I could run. Now I am ecstatically happy with the quality of the models I can run on this hardware.

I sometimes use the Groq Llama 3 APIs (so fast!) or OpenAI APIs, but I mostly use my 32 GB M2 system.

The article calculates the cost of self-hosting, but I think it is also worth taking into account how happy I am self-hosting on my own hardware.
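As a rough sketch of the cost comparison: the $1800 hardware figure is from my purchase above, but the API price per million tokens and the monthly token volume below are made-up illustration numbers, not the article's figures.

```python
# Rough break-even sketch. Only HARDWARE_COST comes from the comment;
# the API price and monthly usage are assumed for illustration.

HARDWARE_COST = 1800.00        # one-time, USD (M2 Pro, 32 GB)
API_PRICE_PER_MTOK = 0.50      # assumed blended USD per million tokens
TOKENS_PER_MONTH = 30_000_000  # assumed monthly usage

monthly_api_cost = TOKENS_PER_MONTH / 1_000_000 * API_PRICE_PER_MTOK
months_to_break_even = HARDWARE_COST / monthly_api_cost

print(f"API cost/month: ${monthly_api_cost:.2f}")       # $15.00
print(f"Break-even: {months_to_break_even:.0f} months")  # 120 months
```

Of course this ignores electricity, depreciation, and the intangible value of running everything locally, which is the real point.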


