Reading Yudkowsky, part 12

by Luke Muehlhauser on January 30, 2011 in Eliezer Yudkowsky, Resources, Reviews

AI researcher Eliezer Yudkowsky is something of an expert on human rationality, and on teaching it to others. His hundreds of posts at Overcoming Bias (now moved to Less Wrong) are a treasure trove for those who want to improve their own rationality. As such, I’m reading all of them, chronologically.

I suspect some of my readers want to improve their rationality, too. So I’m keeping a diary of my Yudkowsky reading. Feel free to follow along.

His 70th post is One Argument Against an Army:

I suggest that when people encounter a contrary argument, they prevent themselves from downshifting their confidence by rehearsing already-known support.

Suppose the country of Freedonia is debating whether its neighbor, Sylvania, is responsible for a recent rash of meteor strikes on its cities.  There are several pieces of evidence suggesting this: the meteors struck cities close to the Sylvanian border; there was unusual activity in the Sylvanian stock markets before the strikes; and the Sylvanian ambassador Trentino was heard muttering about “heavenly vengeance”.

Someone comes to you and says:  “I don’t think Sylvania is responsible for the meteor strikes.  They have trade with us of billions of dinars annually.”  “Well,” you reply, “the meteors struck cities close to Sylvania, there was suspicious activity in their stock market, and their ambassador spoke of heavenly vengeance afterward.”  Since these three arguments outweigh the first, you keep your belief that Sylvania is responsible – you believe rather than disbelieve, qualitatively. Clearly, the balance of evidence weighs against Sylvania.

Then another comes to you and says:  “I don’t think Sylvania is responsible for the meteor strikes.  Directing an asteroid strike is really hard. Sylvania doesn’t even have a space program.”  You reply, “But the meteors struck cities close to Sylvania, and their investors knew it, and the ambassador came right out and admitted it!”  Again, these three arguments outweigh the first (by three arguments against one argument), so you keep your belief that Sylvania is responsible.

Indeed, your convictions are strengthened. On two separate occasions now, you have evaluated the balance of evidence, and both times the balance was tilted against Sylvania by a ratio of 3-to-1.

You encounter further arguments by the pro-Sylvania traitors – again, and again, and a hundred times again – but each time the new argument is handily defeated by 3-to-1.  And on every occasion, you feel yourself becoming more confident that Sylvania was indeed responsible, shifting your prior according to the felt balance of evidence.

…But to selectively double-count only some evidence is sheer farce.  I remember seeing a cartoon as a child, where a villain was dividing up loot using the following algorithm:  “One for you, one for me.  One for you, one-two for me.  One for you, one-two-three for me.”
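To see why this is farce rather than updating, it helps to run the numbers. Here is a minimal Python sketch; the evidence strengths (in bits of log-odds) are invented for illustration, not taken from the post:

```python
# Hypothetical evidence strengths, in bits of log-odds, for "Sylvania is
# responsible". The numbers are made up; the point is that each argument
# should move your total exactly once.
pro_guilt = [1.0, 1.0, 1.0]   # border proximity, stock activity, "heavenly vengeance"
counters = [-1.5, -2.0]       # billions in trade, no space program

# Correct bookkeeping: count every argument exactly once.
correct = sum(pro_guilt) + sum(counters)

# The flawed procedure: rehearse all three supporting arguments against
# each new counterargument, re-counting them every time.
flawed = 0.0
for counter in counters:
    flawed += counter + sum(pro_guilt)  # "three arguments to one", again

print(f"counted once:   {correct:+.1f} bits")  # -0.5: the balance now favors innocence
print(f"double-counted: {flawed:+.1f} bits")   # +2.5: confidence keeps growing
```

Counted honestly, two strong counterarguments outweigh the original three pieces of evidence; counted the villain’s way, every rebuttal you hear makes you more sure.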

Hindsight Bias summarizes some of the research on hindsight bias. Hindsight Devalues Science is an argument against this kind of ignorant statement from Cullen Murphy, editor of The Atlantic:

[The social sciences turn up] no ideas or conclusions that can’t be found in [any] encyclopedia of quotations… Day after day social scientists go out into the world.  Day after day they discover that people’s behavior is pretty much what you’d expect.

Is he crazy? Science shows us lots of counterintuitive information about ourselves we did not predict. Not only that, but science shows us useful counterintuitive information about ourselves.

Scientific Evidence, Legal Evidence, Rational Evidence is a nice summary of what counts as “evidence” in three different senses: scientific, legal, and rational (Bayesian). Yudkowsky applies these ideas to a specific case in Is Molecular Nanotechnology Scientific?
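Of the three, the rational (Bayesian) category is the broadest. One standard way to state it (my formulation, not a quotation from the post): an observation E is evidence about a hypothesis H exactly when E is more or less likely under H than under its negation. In odds form:

$$
\frac{P(H \mid E)}{P(\neg H \mid E)} \;=\; \frac{P(H)}{P(\neg H)} \times \frac{P(E \mid H)}{P(E \mid \neg H)}
$$

When the likelihood ratio on the right equals 1, E tells you nothing about H. Scientific and legal evidence are then narrower categories that layer extra social requirements (reproducibility, admissibility) on top of this.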

A post near and dear to my heart is Fake Explanations:

Once upon a time, there was an instructor who taught physics students.  One day she called them into her class, and showed them a wide, square plate of metal, next to a hot radiator.  The students each put their hand on the plate, and found the side next to the radiator cool, and the distant side warm.  And the instructor said, Why do you think this happens? Some students guessed convection of air currents, and others guessed strange metals in the plate.  They devised many creative explanations, none stooping so low as to say “I don’t know” or “This seems impossible.”

And the answer was that before the students entered the room, the instructor turned the plate around.

Consider the student who frantically stammers, “Eh, maybe because of the heat conduction and so?” …

Ponder that innocent little phrase, “because of”, which comes before “heat conduction”.  Ponder some of the other things we could put after it.  We could say, for example, “Because of phlogiston”, or “Because of magic.”

“Magic!” you cry.  “That’s not a scientific explanation!”  Indeed, the phrases “because of heat conduction” and “because of magic” are readily recognized as belonging to different literary genres. “Heat conduction” is something that Spock might say on Star Trek, whereas “magic” would be said by Giles in Buffy the Vampire Slayer.

However, as Bayesians, we take no notice of literary genres.  For us, the substance of a model is the control it exerts on [anticipated experiences].  If you say “heat conduction”, what experience does that lead you to anticipate? Under normal circumstances, it leads you to anticipate that, if you put your hand on the side of the plate near the radiator, that side will feel warmer than the opposite side.  If “because of heat conduction” can also explain the radiator-adjacent side feeling cooler, then it can explain pretty much anything.

Just because you use the language of rationalism and science doesn’t mean you are doing science or being rational. You may have merely switched literary genres. Sophisticated theologians who offer “God did it” as a “Bayesian” explanation have merely switched literary genres.
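The point about anticipation-control can be made quantitative. In this Python sketch (the likelihoods are made up for illustration), a hypothesis that predicts both possible observations equally well can never gain or lose a single bit:

```python
import math

def bits(p_obs_given_a, p_obs_given_b):
    """Evidence favoring hypothesis A over B, in bits, from one observation."""
    return math.log2(p_obs_given_a / p_obs_given_b)

# Observation: the side of the plate NEAR the radiator feels cooler.
# Likelihoods invented for illustration:
p_given_turned = 0.95  # "someone turned the plate around" predicts this strongly
p_given_magic = 0.50   # "magic" predicts cooler and warmer equally well

# "Magic" assigns the same likelihood to the observation and its opposite,
# so it cannot direct your anticipation toward either outcome.
print(bits(p_given_magic, 1 - p_given_magic))  # 0.0 bits, whatever you see

# A hypothesis that sticks its neck out gets paid when it turns out right:
print(bits(p_given_turned, p_given_magic))     # ~0.93 bits for "turned around"
```

A flat likelihood function is exactly what “it can explain pretty much anything” means: the model forbids nothing, so observing anything teaches you nothing.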

The important followup is Guessing the Teacher’s Password:

[When] I began to read the Feynman Lectures on Physics, I ran across a gem called “the wave equation”.  I could follow the equation’s derivation, but, looking back, I couldn’t see its truth at a glance.  So I thought about the wave equation for three days, on and off, until I saw that it was embarrassingly obvious.  And when I finally understood, I realized that the whole time I had accepted the honest assurance of physicists that light was waves, sound was waves, matter was waves, I had not had the vaguest idea of what the word “wave” meant to a physicist.

There is an instinctive tendency to think that if a physicist says “light is made of waves”, and the teacher says “What is light made of?”, and the student says “Waves!”, the student has made a true statement.  That’s only fair, right?  We accept “waves” as a correct answer from the physicist; wouldn’t it be unfair to reject it from the student?  Surely, the answer “Waves!” is either true or false, right?

Which is one more bad habit to unlearn from school. Words do not have intrinsic definitions. If I hear the syllables “bea-ver” and think of a large rodent, that is a fact about my own state of mind, not a fact about the syllables “bea-ver”.  The sequence of syllables “made of waves” (or “because of heat conduction”) is not a hypothesis, it is a pattern of vibrations traveling through the air, or ink on paper.  It can associate to a hypothesis in someone’s mind, but it is not, of itself, right or wrong.  But in school, the teacher hands you a gold star for saying “made of waves”, which must be the correct answer because the teacher heard a physicist emit the same sound-vibrations.  Since verbal behavior (spoken or written) is what gets the gold star, students begin to think that verbal behavior has a truth-value.  After all, either light is made of waves, or it isn’t, right?

This is not a hypothesis about [matter].  This is not even a proper belief.  It is an attempt to guess the teacher’s password.
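For the curious, the object Yudkowsky stared at for three days is, in its standard one-dimensional form (standard notation; the post itself never displays it):

$$
\frac{\partial^2 u}{\partial t^2} \;=\; c^2 \, \frac{\partial^2 u}{\partial x^2}
$$

Here u(x, t) is a displacement and c a propagation speed; every solution has the form f(x - ct) + g(x + ct), a shape sliding right plus a shape sliding left at speed c. That is the anticipated experience that “it’s a wave” cashes out to, and the thing the password-guessing student never acquires.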


6 comments

Patrick January 30, 2011 at 9:18 am

Yudkowsky’s “One Argument Against an Army” needs to be taken one step further. This is one of the more common ways that Bayes is abused.

1. Take a common religious apologetic.
2. Claim that it’s Bayesian evidence for the existence of God.
3. Acknowledge, for once, that Bayesian evidence without numbers just means a nonzero shift in probability.
4. Acknowledge that someone who views the existence of God as very unlikely will not be convinced.
5. But say that it’s convincing to you, because you view the probability of God as being around 50/50, and that you’re offering the argument to other people in the same position.
6. And totally ignore that this COMMON APOLOGETIC YOU ALREADY KNEW ABOUT was part of your background knowledge when you decided that the existence of God was around 50/50.
7. Continue this process of taking things in your background knowledge, making them explicit and Bayesian, and then using them to update your background knowledge.
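The failure in steps 6 and 7 has a one-line formal statement (standard probability, my notation): if the apologetic E is already part of your background knowledge K, then conditioning on it again changes nothing.

$$
P(E \mid K) = 1 \;\implies\; P(H \mid K \wedge E) = P(H \mid K)
$$

Any felt shift from re-presenting E as an explicit “Bayesian” argument is double-counting, not new information.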


Adito January 30, 2011 at 10:44 am

Patrick, that would only be dishonest if the argument was made towards others who know of the apologetic and use it to strengthen their belief. If it’s aimed towards someone who hasn’t heard it but does hold the chance that God exists to be around 50/50 then it’s not so bad. The process in 7 actually sounds like a great way to educate someone on possible reasons to believe there is a god because chances are they haven’t heard all of the arguments that will be described.


Patrick January 30, 2011 at 12:07 pm

“Patrick, that would only be dishonest if the argument was made towards others who know of the apologetic and use it to strengthen their belief. If it’s aimed towards someone who hasn’t heard it but does hold the chance that God exists to be around 50/50 then it’s not so bad. The process in 7 actually sounds like a great way to educate someone on possible reasons to believe there is a god because chances are they haven’t heard all of the arguments that will be described.”

I’m pretty sure that anyone who tells me both of the following

1. they consider the chance of the Christian god to be around 50/50, but
2. they have never previously considered whether the biblical claim of an empty tomb is evidence of the resurrection

is a liar.

Anyone with the knowledge base to make the first claim has the knowledge base to have already considered certain common supporting arguments of the claim.

Ditto first cause arguments, or even fine tuning arguments. The only semi-common apologetic that I can think of that wouldn’t fall into this would be the ontological argument. Remember, someone doesn’t have to have heard of the specific name of the argument, or of the specific formulation, to have it in their prior knowledge and therefore be disqualified from using it to update their beliefs. They just have to have considered the underlying inferences.

At best such a person could claim to have previously been slightly compelled by the evidence, but now highly compelled by a newly presented, officious sounding, math-ish formulation. But even that would have to be a TRUE claim about the state of their beliefs, not just one that I can’t disprove. And I’m highly skeptical of people with long standing ideological commitments who suddenly claim that a new argument has convinced them of something they only previously held loosely, in contradiction of their own prior behavior.


Adito January 30, 2011 at 7:42 pm

If the argument is aimed towards 50/50 agnostics then I doubt there are many ideological commitments in play.

The key is really going to be who your target audience is. If you grab a random agnostic off the street then I doubt that they’ve ever heard of a detailed Bayesian analysis of the resurrection. Perhaps something like “well, if a god exists then maybe he’s the sort of thing that would raise people from the dead” went into their 50/50 assessment, but an actual analysis of the evidence would be several orders of magnitude stronger than that and therefore has a decent chance of tipping the scale in favor of theism. This sort of approach becomes less and less useful the more educated and careful your target audience becomes.

Of course, the true strength of this approach comes up when the argument is about things even educated and careful laymen are unlikely to have considered. For instance, I’ve heard a couple of theists recommend an argument from proper function for God that I hadn’t heard of before. I don’t think your point was aimed towards these sorts of arguments, though.


Patrick January 30, 2011 at 8:52 pm

“If you grab a random agnostic off the street then I doubt that they’ve ever heard of a detailed bayesian analysis of the resurrection.”

They don’t have to have heard a detailed Bayesian analysis. They just have to be familiar with the input that went into the equation.

“The Bible says that Paul was a total dick, and then he converted” might be the input. It doesn’t matter if they’ve never heard a Bayesian analysis of that before, if they already knew that the Bible says that Paul was a total dick and then he converted, then they can’t update the probability of anything that depended on that by merely re-analyzing the old information under a new mathematical framework.


melior February 1, 2011 at 3:19 am

“And I’m highly skeptical of people with long standing ideological commitments who suddenly claim that a new argument has convinced them of something they only previously held loosely, in contradiction of their own prior behavior.”

Well put. This reminds me of a similar argument well-known to those who are “aware of all internet traditions”, as the kids say.

