I just read EA Has A Lying Problem and realised that it was similar to some things I’ve said in private conversation before. So I decided I might as well paste my thoughts from chat into Tumblr for interested parties to see. This is just a copy/paste I added some links to, so don’t expect a quality post. But, like, if you want to know what I think about this, here you go:
This also doesn’t handle the ‘what if EA is wrong (or being done wrong)’ problem very well.
Like, you don’t have perfect information about the world. No one does. We each know what our map of the world looks like, so what we have to do is phrase things as “I am someone who thinks they are X” rather than “I am X”, and go from there.
The correct perspective for EAs to take isn’t “I am part of a movement that is going to save the world” but “I am someone who thinks they are part of a movement that is going to save the world”.
Then you step back and take the outside view. What is the track record of people who thought they were part of a movement that was going to save the world?
Spoiler alert: It doesn’t look good.
As far as I can tell from history, most ideologies that think they’re saving the world are wrong.
So EA needs to be done in such a way that even if it’s wrong (which is a hugely important possibility based on priors), it fails gracefully.
You need to act such that, if your ideas are right, you’ll have made the world a better place; but, if they’re wrong, you’ll just be embarrassed.
The world would be a much better place today if Lenin had simply embarrassed himself.
If EA is wrong, we want to have done less harm in the time we thought it was right than, say, Christianity did during its period of thinking it would save the world if it just crusaded a little harder.
And if you have an idea that you can only implement using lies, theft, abuse, and murder – don’t try. You’re probably wrong, because your predecessors were.
I don’t care if it’s the most amazing, logical, obviously-right idea ever –
Don’t. Fucking. Try.
When I visited [EA group redacted], a lot of people said it was obviously correct to kill five people in the trolley problem if the one person is an EA, because then they’ll donate money and save even more people to offset that.
And I said this was hella suspicious because pretty much every ideology will come up with reasons why its own members are more valuable to the world, so it completely fails the “what if EA is wrong” test.
Like, pretty dramatically, really.
Especially given how much of a known mind-virus this is. It isn’t some new plague we haven’t developed immunities to. If you haven’t been inoculated against it, this means something is wrong with you.
(And maybe if we stop thinking about things in terms of “Treating my friends as special is actually a plot to help them save the world! Yay, morality!”, we’ll do less of the less-bad-but-super-annoying thing of always justifying aid to friends as “This is actually indirectly EA!” instead of “I want to be nice to my friend”.)
(Like, maybe this is good for some people, but when I wrote a post complaining about people doing it to me I got flooded with private messages of the form “THANK YOU FOR SAYING THIS! I didn’t know how to talk about what was making me so uncomfortable so thank you for putting it in words.” So, like, a ton of people are also upset by this.)
(Also, the fact that they felt the need to confide in me, a virtual stranger, instead of telling the people doing this to them to stop, is kind of problematic. It means something has gone seriously wrong here if people feel like they can’t talk about it.)
Anyway, back to the point: Don’t kill normies in favour of EAs in the trolley problem, dude. Don’t do things that would be deeply immoral if your ideology was wrong.
If your ideology is hanging out in the parking lot behind the school whispering to you “Hey, kid, wanna try some seductively plausible moral justifications for why Your People deserve more protection than The Outgroup?”, then you have to stand firm and say “My ~~mommy~~ epistemology says ~~drugs~~ justifications for things which have been wrong every other time they were justified are bad.”
It doesn’t matter how convincing the argument is, either. It could be pure, perfect, flawless logic. You might have no counterargument. It might be so brilliant that it’s completely convincing. You still have to stand your ground and say “You’re lying”. No exceptions.
I mean, sure, maybe some day a time will come when it would be correct to think the people close to you need life more than the people you don’t know. But, if ever you think that day is today, you will be wrong. Don’t try.
But, hey, maybe you’re special. Maybe you’re unique in human history. Maybe you’re the one person who can assess the arguments perfectly such that you would never ever be misled by a bad one that flattered your biases. In fact, presumably, you don’t have biases! Your cognition is perfect and flawless and you’ll never be misled by a convincing argument. Sounds like you, right? Right?
In shocking, breaking news: Overconfidence is also a bias.
Biased! Biased biased biased! None of you are free from bias!
(And, if you’re astonished at my doctrine, go read the sequences again.)