
Nothing about sentience is obvious. If trees were sentient, would it be obvious? Is it therefore obvious that they're not? I think it's a no in both cases. The same argument applies to AI models.


Sentience is at once too high a standard and too low a standard for AI.

It's too high in that it requires actual consciousness, which may be a very tough architectural problem at best (if functionalism is true) or an unknowable metaphysical mystery at worst (if some form of substance or property dualism is true).

And it's much too low a standard in that many sentient creatures are nowhere near intelligent enough to be useful assistants in the domains where we want to use AI.





