Reading Yudkowsky, part 18

by Luke Muehlhauser on March 4, 2011 in Eliezer Yudkowsky, Resources, Reviews

AI researcher Eliezer Yudkowsky is something of an expert at human rationality, and at teaching it to others. His hundreds of posts at Overcoming Bias (now moved to Less Wrong) are a treasure trove for those who want to improve their own rationality. As such, I’m reading all of them, chronologically.

I suspect some of my readers want to improve their rationality, too. So I’m keeping a diary of my Yudkowsky reading. Feel free to follow along.

His 118th post is A Rational Argument:

You are, by occupation, a campaign manager, and you’ve just been hired by Mortimer Q. Snodgrass, the Green candidate for Mayor of Hadleyburg.  As a campaign manager reading a blog on rationality, one question lies foremost on your mind:  “How can I construct an impeccable rational argument that Mortimer Q. Snodgrass is the best candidate for Mayor of Hadleyburg?”

Sorry.  It can’t be done.

“What?” you cry.  “But what if I use only valid support to construct my structure of reason?  What if every fact I cite is true to the best of my knowledge, and relevant evidence under Bayes’s Rule?”

Sorry.  It still can’t be done.  You defeated yourself the instant you specified your argument’s conclusion in advance.

If you really want to present an honest, rational argument for your candidate, in a political campaign, there is only one way to do it:

  • Before anyone hires you, gather up all the evidence you can about the different candidates.
  • Make a checklist which you, yourself, will use to decide which candidate seems best.
  • Process the checklist.
  • Go to the winning candidate.
  • Offer to become their campaign manager.
  • When they ask for campaign literature, print out your checklist.
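
The procedure amounts to committing to an evaluation function before you know its output. As a toy sketch of the checklist step (the criteria, weights, and ratings below are invented for illustration; only Snodgrass's name comes from the post):

```python
# Hypothetical checklist: criteria and weights are fixed BEFORE
# looking at any particular candidate (steps 1-2 of the procedure).
CHECKLIST = {
    "honesty": 3.0,
    "competence": 2.0,
    "budget_record": 1.0,
}

def score(ratings):
    """Weighted sum of a candidate's 0-10 ratings on each criterion."""
    return sum(w * ratings.get(c, 0) for c, w in CHECKLIST.items())

# Made-up evidence gathered about each candidate.
candidates = {
    "Mortimer Q. Snodgrass": {"honesty": 7, "competence": 5, "budget_record": 6},
    "The Incumbent": {"honesty": 4, "competence": 8, "budget_record": 5},
}

# Process the checklist and work for whoever comes out on top.
winner = max(candidates, key=lambda name: score(candidates[name]))
print(winner)
```

The point is the order of operations: the conclusion is whatever the precommitted procedure outputs. Writing the conclusion first and back-filling the support is exactly the self-defeat the quote describes.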

We Change Our Minds Less Often Than We Think opens with the following quote from a study by Griffin and Tversky:

Over the past few years, we have discreetly approached colleagues faced with a choice between job offers, and asked them to estimate the probability that they will choose one job over another.  The average confidence in the predicted choice was a modest 66%, but only 1 of the 24 respondents chose the option to which he or she initially assigned a lower probability, yielding an overall accuracy rate of 96%.
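
(That 96% comes straight from the counts: 23 of the 24 respondents chose the option they had initially rated more likely, and 23/24 ≈ 96%.)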

Yudkowsky reacts:

When I first read the words above – on August 1st, 2003, at around 3 o’clock in the afternoon – it changed the way I thought.  I realized that once I could guess what my answer would be – once I could assign a higher probability to deciding one way than another – then I had, in all probability, already decided.  We change our minds less often than we think.  And most of the time we become able to guess what our answer will be within half a second of hearing the question.

If it saves me time, I like it!

Next is Avoiding Your Belief’s Real Weak Points:

Modern Orthodox Judaism is like no other religion I have ever heard of, and I don’t know how to describe it to anyone who hasn’t been forced to study Mishna and Gemara.  There is a tradition of questioning, but the kind of questioning…  It would not be at all surprising to hear a rabbi, in his weekly sermon, point out the conflict between the seven days of creation and the 13.7 billion years since the Big Bang – because he thought he had a really clever explanation for it, involving three other Biblical references, a Midrash, and a half-understood article in Scientific American. In Orthodox Judaism you’re allowed to notice inconsistencies and contradictions, but only for purposes of explaining them away, and whoever comes up with the most complicated explanation gets a prize.

There is a tradition of inquiry.  But you only attack targets for purposes of defending them.  You only attack targets you know you can defend.

In Modern Orthodox Judaism I have not heard much emphasis of the virtues of blind faith.  You’re allowed to doubt.  You’re just not allowed to successfully doubt.

The reason that educated religious people stay religious, I suspect, is that when they doubt, they are subconsciously very careful to attack their own beliefs only at the strongest points – places where they know they can defend.  Moreover, places where rehearsing the standard defense will feel strengthening.

It probably feels really good, for example, to rehearse one’s prescripted defense for “Doesn’t Science say that the universe is just meaningless atoms bopping around?”, because it confirms the meaning of the universe and how it flows from God, etc.  Much more comfortable to think about than an illiterate Egyptian mother wailing over the crib of her slaughtered son.  Anyone who spontaneously thinks about the latter, when questioning their faith in Judaism, is really questioning it, and is probably not going to stay Jewish much longer.

…To do better:  When you’re doubting one of your most cherished beliefs, close your eyes, empty your mind, grit your teeth, and deliberately think about whatever hurts the most.  Don’t rehearse standard objections whose standard counters would make you feel better.  Ask yourself what smart people who disagree would say to your first reply, and your second reply.  Whenever you catch yourself flinching away from an objection you fleetingly thought of, drag it out into the forefront of your mind.  Punch yourself in the solar plexus.  Stick a knife in your heart, and wiggle to widen the hole.

The Meditation on Curiosity is a linkfest of past posts, revisiting his “first virtue of rationality”: curiosity.

Singlethink tells the story of when young Eliezer discovered he wanted to be a rationalist:

In my very first step along the Way, I caught the feeling – generalized over the subjective experience – and said, “So that’s what it feels like to shove an unwanted truth into the corner of my mind!  Now I’m going to notice every time I do that, and clean out all my corners!”

In No One Can Exempt You from Rationality’s Laws, Yudkowsky argues that rationality is a set of mathematical rules from which you cannot escape.

In A Priori, Yudkowsky argues for the surprising conclusion that

There is nothing you can know “a priori”, which you could not know with equal validity by observing the chemical release of neurotransmitters within some outside brain.  What do you think you are, dear reader?

…Brains evolved from non-brainy matter by natural selection; they were not justified into existence by arguing with an ideal philosophy student of perfect emptiness.  This does not make our judgments meaningless.  A brain-engine can work correctly, producing accurate beliefs, even if it was merely built – by human hands or cumulative stochastic selection pressures – rather than argued into existence.  But to be satisfied by this answer, one must see rationality in terms of engines, rather than arguments.

Priming and Contamination discusses more of the experimental psychology literature on how our brains work.

Do We Believe Everything We’re Told? discusses an old debate:

One might naturally think that on being told a proposition, we would first comprehend what the proposition meant, then consider the proposition, and finally accept or reject it.  This obvious-seeming model of cognitive process flow dates back to Descartes.  But Descartes’s rival, Spinoza, disagreed; Spinoza suggested that we first passively accept a proposition in the course of comprehending it, and only afterward actively disbelieve propositions which are rejected by consideration.

Most of philosophy has sided with Descartes because it’s more, ya know, intuitive. But recent experimental research favors Spinoza.

Cached Thoughts opens with a bit of brain science:

One of the single greatest puzzles about the human brain is how the damn thing works at all when most neurons fire 10-20 times per second, or 200Hz tops.  In neurology, the “hundred-step rule” is that any postulated operation has to complete in at most 100 sequential steps – you can be as parallel as you like, but you can’t postulate more than 100 (preferably less) neural spikes one after the other.

Can you imagine having to program using 100Hz CPUs, no matter how many of them you had?  You’d also need a hundred billion processors just to get anything done in realtime.

If you did need to write realtime programs for a hundred billion 100Hz processors, one trick you’d use as heavily as possible is caching.  That’s when you store the results of previous operations and look them up next time, instead of recomputing them from scratch.  And it’s a very neural idiom – recognition, association, completing the pattern.

It’s a good guess that the actual majority of human cognition consists of cache lookups.

Cache lookup, of course, can get us in trouble. Some things should be re-computed.
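
In software, the idiom Yudkowsky is describing is memoization. A minimal sketch in Python (the Fibonacci example is mine, not from the post):

```python
from functools import lru_cache

@lru_cache(maxsize=None)  # remember every result ever computed
def fib(n):
    """Each value of n is computed once, then looked up; the naively
    exponential recursion becomes linear."""
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

print(fib(100))  # instant with the cache; astronomically slow without it
```

The trouble has a name in software, too: cache invalidation. A cached answer is only as good as the computation that produced it, and entries whose inputs have changed must be discarded and re-computed.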


Comments:

Jacopo March 4, 2011 at 5:00 am

Interesting post, thanks.

Any of this available in book form? For whatever reason, I prefer them somewhat to reading blogs, at least when reading about something in-depth.

Luke Muehlhauser March 4, 2011 at 6:52 am

Jacopo,

It’s coming. Eliezer is shopping for publishers with two manuscripts compiled from his Less Wrong material.

Garren March 4, 2011 at 7:16 am

“How can I construct an impeccable rational argument that…”

For the sake of their continued employment, it’s a good thing campaign managers ask a slightly different question: “How can I construct an impeccable rhetorical argument that…”

Charles R March 4, 2011 at 7:44 am

118 is pretty serious. It says the Christian cannot hope to make a rational argument for the existence of their god. But neither can the atheist! Only the agnostic can do this, and then only once. Because having made the argument they will have chosen sides.

Garren March 4, 2011 at 8:03 am

@Charles R
“118 is pretty serious. It says the Christian cannot hope to make a rational argument for the existence of their god.”

Not so fast. A Christian could still ask, “Can I construct an impeccable rational argument that….” leaving off the first word ‘How’. The error is in assuming there must be an impeccably rational conclusion before putting together an argument. Anyone can still engage in critical thinking about current positions.

plutosdad March 4, 2011 at 8:48 am

The part about only attacking where you are strong was very true of me, even though I didn’t realize it. When I was an active Christian, I read a lot of apologetics. I was always interested in history as well as science. But I would read books on all the “holes” in evolution, and apologetics defending the Bible, so in some cases I was so wrong, but thought I was thinking clearly. After all, I had a degree in math and read a lot more than other Christians (who would tell me I read too much and didn’t need to know that stuff! Except for my pastor who also had a math degree and was into apologetics).

Anyway, one day I read Shadows of Forgotten Ancestors. I don’t know why I got it; it was during a period of unemployment and I was reading a lot. But the similarities between animal and human behavior really struck a chord within me. And then I thought, “maybe I should not just read apologetics, maybe I should read that Blind Watchmaker book I read about.”

Once I started reading attacks by people who did NOT believe, who attacked the weak, not only the strong, points of my beliefs, that was the beginning of the end. I was dumbfounded at how so many apologetics arguments were settled decades ago, yet the same poor arguments were presented as if they were new. Part of me was angry and felt lied to about the “10,000 copies of the NT” and other arguments which we knew were wrong, but people kept repeating. But the anger wasn’t really what motivated me; it was simply the truth. I could read the arguments on both sides of an issue and decide which sounded better.

Once I started seeing things from a different point of view, I could not hang onto my worldview anymore. Of course, now this is my worldview, and I find myself prejudiced again, just pointing backward at my old “team”. For instance, when I read news stories about pastors who may have “fallen”, I give them great credence and don’t question the attacks.

The past few years, as I read more “left”-leaning sites, I am also learning to see both sides, and to see how the left sites and the right sites see the same exact event so totally differently. Sometimes both have a shade of the truth, sometimes one or both are being mendacious.

MarkD March 4, 2011 at 6:17 pm

The EY post on priming touches down where neural structure and cognitive processing experimentally intersect. Related to this is the relationship between Magical Thinking indices and word associations (greater belief in magical causes correlates with a more expansive belief in the relationships between certain paired stimuli). The same underlying spreading activation may play a role in subconscious memory integration and other odd effects.

The “caching” suggestion is interesting and reminds me of “Poverty of Stimulus” calculations concerning vocabulary acquisition (among other arenas). Cached lookups look a lot like heuristics, too.
