
On the contrary, those things are quite predictable. Once you know those issues exist, you can reliably avoid them. But with LLMs you can't reliably avoid hallucinations. The unreliability is baked into the very nature of the tool.




