Ruby Bloom recently posted about the impact of Eliezer Yudkowsky’s Less Wrong Sequences on his thinking. I felt compelled to do the same.
Several people have explicitly told me that I’m one of the most rational people they know. I can also think of at least one case where I was complimented by someone who was politically “my sworn enemy”, who said something along the lines of “I do grant that *your* arguments for your position are good, it’s just everyone *else* on your side…”, which I take as some evidence that I can maintain at least some semblance of sanity even when talking about politics.
(Seeing what I’ve written above, I cringe a little, since “I’m so rational” sounds so much like an over-the-top, arrogant boast. I certainly have plenty of my own biases, as does everyone who is human. Imagining yourself to be perfectly rational is a pretty good way of ensuring that you won’t be, so I’d never claim to be exceptional based only on my self-judgment. But this is what several people have explicitly told me, independently of each other, sometimes also staking part of their own reputation on it by saying so in public.)
Before reading the Sequences, I was very definitely *not* that. I was what the Sequences would call “a clever arguer” – someone who was good at coming up with arguments for their own favored position, and didn’t really feel all that compelled to care about the truth.
The single biggest impact of the Sequences that I can think of is this: before reading them, as well as Eliezer’s other writings, I didn’t really think that beliefs had to be supported by evidence.
Sure, on some level I acknowledged that you can’t just believe *anything* you can find a clever argument for. But I do also remember thinking something like “yeah, I know that everyone thinks that their position is the correct one just because it’s theirs, but at the same time I just *know* that my position is correct just because it’s mine, and everyone else having that certainty for contradictory beliefs doesn’t change that, you know?”.
This wasn’t a reductio ad absurdum, it was my genuine position. I had a clear emotional *certainty* of being right about something, a certainty which wasn’t really supported by any evidence and which didn’t need to be. The feeling of certainty was enough by itself; the only thing that mattered was finding the evidence to (selectively) present to others in order to persuade them. Which it likely wouldn’t, since they’d have their own feelings of certainty, similarly blind to most evidence. But they might at least be forced to concede the argument in public.
It was the Sequences that first changed that. It was reading them that made me actually realize, on an emotional level, that correct beliefs *actually* required evidence. That this wasn’t just a game of social convention, but a law of the universe as iron-clad as the laws of physics. That if I caught myself arguing for a position using arguments that I knew to be weak, the correct thing to do wasn’t to hope that my opponents wouldn’t spot the weaknesses, but rather to just abandon those weak arguments myself. And then to question whether I even *should* believe that position, having realized that my arguments were weak.
I can’t say that the Sequences alone were enough to take me *all* the way to where I am now. But they made me more receptive to other people pointing out when I was biased, or incorrect. More humble, more willing to take differing positions into account. And as people pointed out more problems in my thinking, I gradually learned to correct some of those problems, internalizing the feedback.
Again, I don’t want to claim that I’m entirely rational. That would just be stupid. But to the extent that I’m more rational than average, it all got started with the Sequences.
Ruby wrote:

“I was thinking through some challenges and I noticed the sheer density of rationality concepts taught in the Sequences which I was using: “motivated cognition”, “reversed stupidity is not intelligence”, “don’t waste energy on thoughts which won’t have been useful in universes where you win” (possibly not in the Sequences), “condition on all the evidence you have”. These are fundamental concepts, core lessons which shape my thinking constantly. I am a better reasoner, a clearer thinker, and I get closer to the truth because of the Sequences. In my gut, I feel like the version of me who never read the Sequences is epistemically equivalent to a crystal-toting anti-vaxxer (probably not true, but that’s how it feels) who I’d struggle to have a conversation with.

And my mind still boggles that the Sequences were written by a single person. A single person is responsible for so much of how I think, the concepts I employ, how I view the world and try to affect it. If this seems scary, realise that I’d much rather have my thinking shaped by one sane person than a dozen mad ones. In fact, it’s more scary to think that had Eliezer not written the Sequences, I might be that anti-vaxxer equivalent version of me.”
I feel very similarly. I have slightly more difficulty pointing to specific concepts from the Sequences that I employ in my daily thinking, because they’ve become so deeply integrated into my thinking that I’m no longer explicitly aware of them; but I do remember a period in which they were still in the process of being integrated, and when I explicitly noticed myself using them.
Thank you, Eliezer.
(There’s a collected and edited version of the Sequences available in ebook form. I would recommend reading it one article at a time, one per day: that’s how I originally read the Sequences, one article a day as they were being written. That way, they would gradually seep into my thoughts over an extended period of time, letting me apply them in various situations. I wouldn’t expect binge-reading the book in one go to have the same impact, even though it would likely still be of some use.)