
LLMs are guessing machines; they don’t “decide” anything. What they appear to “decide” is determined by the people programming them and putting in alignment guardrails.
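To make the “guessing” part concrete, here is a minimal sketch (toy vocabulary and made-up logits, not any real model’s code) of what token generation looks like: the model scores candidate tokens, the scores become a probability distribution, and the next token is drawn at random from it.

    # Illustrative only: an LLM's "decision" at each step is a weighted
    # random draw over tokens. Vocabulary and logits here are hypothetical.
    import math
    import random

    vocab = ["yes", "no", "maybe"]   # hypothetical 3-token vocabulary
    logits = [2.0, 1.5, 0.5]         # hypothetical raw scores from the model

    # Softmax turns raw scores into a probability distribution.
    exps = [math.exp(x) for x in logits]
    total = sum(exps)
    probs = [e / total for e in exps]

    # The next token is sampled according to those probabilities --
    # a weighted guess, not a deliberate choice.
    next_token = random.choices(vocab, weights=probs, k=1)[0]
    print(dict(zip(vocab, probs)), "->", next_token)

Guardrails and alignment tuning shift those probabilities, which is why any apparent “decision” traces back to the people who shaped the distribution, not to the sampler.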

