
> In fact, sentence transformer models run orders of magnitude more quickly. Performance penalties will be small.

They do not. Sentence transformers aren't new, and they have well-known trade-offs. What source or line of reasoning led you to believe otherwise?
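For context, the trade-off usually meant here is between bi-encoders (sentence transformers), which embed each text independently so document vectors can be precomputed and compared cheaply, and cross-encoders, which score every query–document pair jointly and are slower but typically more accurate. A minimal cost sketch under that standard framing (function names are mine, purely illustrative):

```python
# Illustrative cost model: count forward passes through the model.
# A bi-encoder embeds each document once (reusable across queries),
# plus one pass per query; a cross-encoder runs once per (query, doc) pair.

def bi_encoder_passes(n_docs: int, n_queries: int) -> int:
    # Documents embedded once up front, queries embedded as they arrive.
    return n_docs + n_queries

def cross_encoder_passes(n_docs: int, n_queries: int) -> int:
    # Every query must be scored jointly against every document.
    return n_docs * n_queries

if __name__ == "__main__":
    docs, queries = 1_000_000, 100
    print(bi_encoder_passes(docs, queries))     # 1,000,100 passes
    print(cross_encoder_passes(docs, queries))  # 100,000,000 passes
```

The speedup is real, but it comes from scoring texts independently, which is exactly what costs accuracy on tasks needing query–document interaction; neither side of the trade-off is free.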

> Here are more examples of low-hanging fruit. The proof that they work is in the implementations which I provide. You can run them, they work!: https://gist.github.com/Hellisotherpeople/45c619ee22aac6865c...

This...is your blog about prompt engineering. What do you believe this "proves"? How have you blown away current production encoding or attention mechanisms?


