> If $100 can save 1 life 10 blocks away from me or 5 lives in the next town over, what kind of asshole chooses to let 5 people die vs 1?
The thing about strict-utilitarian-morality is that it can't comprehend any other kind of morality, because it evaluates the morality of... moralities... on its own utilitarian basis. And then of course it wins over the others: it's evaluating them using itself!
There are entirely different ethical systems that are not utilitarian which (it seems) most people hold and innately use (the "personal morality" I'm talking about in my earlier post). They are hard to comprehend rationally, but that doesn't make them less real. Strict-utilitarianism seems "correct" in a way that personal morality does not because you are working from the premise that "only things I can understand like math problems can be true". But what I observe in the world is that people's fear of the rationalist/EA mindset comes from the fact that they empirically find this way of thinking insidious. Their morality specifically disagrees with that way of thinking: it is not the case that truth comes from scrutable math problems; that is not the point of moral action to them.
The EA philosophy may be put as "well sure but you could change to the math-problem version, it's better". But what I observe is that people largely don't want to. There is a purpose to their choice of moral framework; it's not that they're looking at them all in a vacuum and picking the most mathematically sound one. They have an intrinsic need to keep the people around them safe and they're picking the one that does that best. EA on the other hand is great if everyone around you is safe and you have lots of extra spending money and what you're maximizing for is the feeling of being a good person. But it is not the only way to conceive of moral action, and if you think it is, you're too inside of it to see out.
I'll reiterate that I am trying to describe what I see happening when people resist and protest rationalism (and why their complaints "miss" slightly---because IMO they don't have the language to talk about this stuff but they are still afraid of it). I'm largely sympathetic to EA, but I think it misses important things that are crippling it, of the variety above: an inability to recognize other people's moralities and needs and fears doesn't make them go away; it just makes them hate you.
> The thing about strict-utilitarian-morality is that it can't comprehend any other kind of morality, because it evaluates the morality of... moralities... on its own utilitarian basis.
I can comprehend them just fine, but I have a deep-seated objection to any system of morality that leaves behind giant piles of dead bodies. We should be trying to minimize the size of the pile of dead bodies (and ideally eliminate the pile altogether!).
Any system of morality that boils down to "I don't care about that pile of dead bodies being huge because those people look different" is in fact not a system of morality at all.
Well, you won't find anyone who disagrees with you here. No such morality is being discussed.
The job of a system of morality is to synthesize all the things we want to happen / want to prevent happening into a way of making decisions. One such thing is piles of dead bodies. Another is one's natural moral instincts, like the need to take care of one's family, or the feeling of responsibility to invest time and energy into improving one's future or community or repairing justice or helping people who need help, or to attend to one's needs for art and meaning and fun and love and respect. A coherent moral system synthesizes all of these and figures out how much priority to allocate to each in a way that is reasonable and productive.
Any system of morality that takes one of these criteria and discards the rest is not a system of morality at all, in the very literal sense that nobody will follow it. Most people won't sell out one of their moral impulses for the others, and EA/rationalism feels like it asks them to, since it asks them to place zero value on a lot of things they inherently place moral value on, and so they find it creepy and weird. (It doesn't ask that explicitly; it asks it by omission. By never considering any other morality and being incapable of considering them, because they are not easily quantified or made logical, it asks you to accept a framework that sets you up to ignore most of your needs.)
My angle here is that I'm trying to describe what I believe is already happening. I'm not advocating it; it's already there, like a law of physics.