Hacker News

You would need to have a local copy of the GPT model, which is not exactly part of OpenAI's plans.


For embeddings, you can use smaller transformers/LLMs or sentence2vec and often get good enough results.

You don't need very large models to generate usable embeddings.
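To illustrate the point, here is a minimal, self-contained sketch of a toy bag-of-words embedding using the hashing trick. This is not a real sentence-embedding model (in practice you'd reach for a small library model such as those from sentence-transformers); it just shows that a useful similarity signal doesn't require a giant LLM. All names here are hypothetical.

```python
import hashlib
import math
from collections import Counter

def embed(text, dim=256):
    # Toy embedding: hash each word into one of `dim` buckets and count,
    # then L2-normalize. A stand-in for a small embedding model.
    vec = [0.0] * dim
    for word, count in Counter(text.lower().split()).items():
        h = int(hashlib.md5(word.encode()).hexdigest(), 16)
        vec[h % dim] += count
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]

def cosine(a, b):
    # Vectors are already unit-length, so the dot product is the cosine.
    return sum(x * y for x, y in zip(a, b))

doc1 = embed("local embedding models work well")
doc2 = embed("small local models produce usable embeddings")
doc3 = embed("the weather in Paris is rainy today")
```

With overlapping vocabulary (`doc1` vs `doc2`), the cosine similarity is higher than between unrelated texts (`doc1` vs `doc3`); real small models capture semantic rather than just lexical overlap, but the retrieval workflow is the same.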


You are correct; I assumed the parent was referring to the specific embeddings generated by OpenAI's LLMs.



