
He is engaging in magical thinking. I pointed out a factual error: AI has neither the information-gathering and verifying capability nor the network of peers needed to substantiate its hypotheses, and you refuse to engage with it.



Opinions about what's necessary for AGI are a dime a dozen. You shared your opinion as though it were fact, and you claim that it's incompatible with Eliezer's opinion. I don't find your opinion particularly clear or compelling. But even if your forecast about what's needed for AGI is essentially accurate, I don't think it has much to do with Eliezer's claims. It can simultaneously be the case that AGI will make use of information gathering, verifying capability, and something like a "network of peers", AND that Eliezer's core claims are also correct. Even if we take your opinion as fact, I don't see how it represents a disagreement with Eliezer, except maybe in an incredibly vague "intelligence is hard, bro" sort of way.



