Reading Yudkowsky, part 8

by Luke Muehlhauser on January 4, 2011 in Eliezer Yudkowsky, Resources, Reviews

AI researcher Eliezer Yudkowsky is something of an expert at human rationality, and at teaching it to others. His hundreds of posts at Overcoming Bias (now moved to Less Wrong) are a treasure trove for those who want to improve their own rationality. As such, I’m reading all of them, chronologically.

I suspect some of my readers want to improve their rationality, too. So I’m keeping a diary of my Yudkowsky reading. Feel free to follow along.

Yudkowsky’s 46th post is Scope Insensitivity:

Once upon a time, three groups of subjects were asked how much they would pay to save 2000 / 20000 / 200000 migrating birds from drowning in uncovered oil ponds. The groups respectively answered $80, $78, and $88. This is scope insensitivity or scope neglect: the number of birds saved – the scope of the altruistic action – had little effect on willingness to pay.

Similar experiments showed that Toronto residents would pay little more to clean up all polluted lakes in Ontario than polluted lakes in a particular region of Ontario, or that residents of four western US states would pay only 28% more to protect all 57 wilderness areas in those states than to protect a single area.

People visualize “a single exhausted bird, its feathers soaked in black oil, unable to escape”. This image, or prototype, calls forth some level of emotional arousal that is primarily responsible for willingness-to-pay – and the image is the same in all cases. As for scope, it gets tossed out the window – no human can visualize 2000 birds at once, let alone 200000…

We are insensitive to scope even when human lives are at stake: Increasing the alleged risk of chlorinated drinking water from 0.004 to 2.43 annual deaths per 1000 – a factor of 600 – increased willingness-to-pay from $3.78 to $15.23…

The moral: If you want to be an effective altruist, you have to think it through with the part of your brain that processes those unexciting inky zeroes on paper, not just the part that gets real worked up about that poor struggling oil-soaked bird.
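Run per bird, the quoted figures make the insensitivity stark. Here's a minimal sketch (my arithmetic, not Yudkowsky's):

```python
# Total willingness-to-pay from the bird study quoted above.
wtp = {2_000: 80.0, 20_000: 78.0, 200_000: 88.0}

for birds, dollars in wtp.items():
    # The implied value per bird collapses as the scope grows a hundredfold.
    print(f"{birds:>7} birds: ${dollars:.2f} total, ${dollars / birds:.5f} per bird")
```

The implied per-bird valuation falls from $0.04 to roughly $0.0004, a hundredfold drop, while total willingness-to-pay barely moves.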

In the same theme is One Life Against the World:

Saving one life probably does feel just as good as being the first person to realize what makes the stars shine. It probably does feel just as good as saving the entire world.

But if you ever have a choice, dear reader, between saving a single life and saving the whole world – then save the world. Please. Because beyond that warm glow is one heck of a gigantic difference…

Why should it be any different if a philanthropist spends $10 million on curing a rare but spectacularly fatal disease which afflicts only a hundred people planetwide, when the same money has an equal probability of producing a cure for a less spectacular disease that kills 10% of 100,000 people? I don’t think it is different. When human lives are at stake, we have a duty to maximize, not satisfice…
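The arithmetic behind that comparison is worth making explicit. Since the quote stipulates an equal probability of success for either grant, the probabilities cancel and only the lives at stake differ:

```python
# Lives at stake in the two hypothetical grants (figures from the quote).
rare_disease_lives = 100                    # spectacular disease; 100 people afflicted planetwide
common_disease_lives = int(0.10 * 100_000)  # 10% fatality among 100,000 sufferers

print(common_disease_lives)                       # 10000
print(common_disease_lives // rare_disease_lives) # 100
```

With success probabilities equal, the less spectacular cure saves one hundred times as many lives in expectation.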

In Risk-Free Bonds Aren’t, Yudkowsky notes that the U.S. is not immune from defaulting on its bonds:

Do you expect the United States to still be around in 300 years?  If not, do you know exactly when it will go bust?  Then why isn’t the risk of losing your capital on a 30-year Treasury bond at least, say, 10%?
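One plausible reconstruction of the 10% figure (my assumption; the post doesn't show a calculation): if the U.S. is certain to go bust sometime within 300 years and you have no idea when, a uniform prior puts the default inside any given 30-year bond's term with probability 30/300.

```python
horizon_years = 300  # assumed: the U.S. is gone by then
bond_term = 30

# Uniform prior over when the collapse happens within the horizon:
p_default_during_bond = bond_term / horizon_years
print(p_default_during_bond)  # 0.1
```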

Correspondence Bias explains that:

The correspondence bias is the tendency to draw inferences about a person’s unique and enduring dispositions from behaviors that can be entirely explained by the situations in which they occur.

In my own field (if I can call ethics my “field”), the discovery of this bias presents a problem for many forms of virtue ethics, as explained for example by John Doris. It turns out that we don’t have robust characters as we thought. Rather, we observe each other in more or less the same type of situation nearly all the time. But “robust character” theory is simpler than situationism, and because we observe each other in mostly the same situations, it does nearly as good a job of predicting behavior. So we stick with it.

Well, we stick with “robust character” theory when describing others, anyway:

We tend to see far too direct a correspondence between others’ actions and personalities.  When we see someone else kick a vending machine for no visible reason, we assume they are “an angry person”.  But when you yourself kick the vending machine, it’s because the bus was late, the train was early, your report is overdue, and now the damned vending machine has eaten your lunch money for the second day in a row.  Surely, you think to yourself, anyone would kick the vending machine, in that situation.

This same bias leads you to wonder: Are Your Enemies Innately Evil? In fact, Joshua Knobe has found that we sometimes attribute intention to people when the result of their decision is bad, but not when the result of their decision is good. We are especially eager to attribute bad actions to bad character.

Of course, we must also remember that people don’t see themselves as evil:

So let’s come right out and say it – the 9/11 hijackers weren’t evil mutants.  They did not hate freedom.  They, too, were the heroes of their own stories, and they died for what they believed was right – truth, justice, and the Islamic way.  If the hijackers saw themselves that way, it doesn’t mean their beliefs were true.  If the hijackers saw themselves that way, it doesn’t mean that we have to agree that what they did was justified.  If the hijackers saw themselves that way, it doesn’t mean that the passengers of United Flight 93 should have stood aside and let it happen.  It does mean that in another world, if they had been raised in a different environment, those hijackers might have been police officers.  And that is indeed a tragedy.  Welcome to Earth.

Next up is an Open Thread, and then Two More Things to Unlearn from School. Elsewhere, Ben Casnocha had given Three Things to Unlearn from School:

The importance of opinion. “Schools, especially good ones…that so emphasize student voice, teach us to value opinion. This is a great deception. Opinion is really the lowest form of human knowledge; it requires no accountability, no understanding. The highest form of knowledge, according to George Eliot, is empathy, for it requires us to suspend our egos and live in another’s world. It requires profound, purpose-larger-than-the-self kind of understanding.”

The importance of solving given problems. “Schools teach us to be clever, great problem solvers, but not to include ourselves in the problem that’s being solved. This is a great delusion. It makes us arrogant and complacent and teaches us to look at the world as a problem outside of us. As in Oedipus, public problems – the plague on Thebes or our own pestilences, war or global warming – are private problems. The plague is only lifted when each person sees his responsibility not in analyzing the problem, not in solving the riddle, but in changing our actions to address a public need. Oedipus destroyed the two things that had deceived him – his eyes and his power – and in so doing saved his city.”

The importance of earning the approval of others. “Schools teach students to seek the approval of their teachers. Indeed, for all of our differences, this is one area that parents and teachers share; we are wired or we are hired to believe in you, to approve you, to prevent or mitigate the experiences of disappointment…Try to correct this in two ways. First seek people, work for people who don’t have to like you, people who can easily disapprove of you, people that you can’t easily please. Their skepticism or indifference will define you. Second, if you don’t know how to do so already, begin working for yourself, and let the teachers be damned. But they won’t be – they’ll just be all the more approving because that kind of integrity can only command respect. After all, most of the work we devise is devised for students who are not working for themselves, so those that do surpass our expectations and teach us things that we’ve never thought of.”

Yudkowsky adds two more:

I suspect the most dangerous habit of thought taught in schools is that even if you don’t really understand something, you should parrot it back anyway. One of the most fundamental life skills is realizing when you are confused, and school actively destroys this ability – [it] teaches students that they “understand” when they can successfully answer questions on an exam, which is very very very far from absorbing the knowledge and making it a part of you.

Well, one more. I’m not sure what the second one is.

Dan Nelson January 5, 2011 at 9:02 am

Luke: Well, one more. I’m not sure what the second one is.

In the comments to Yudkowsky’s original post on the subject of things to unlearn from school, he is asked if he has really presented two more things rather than one. He answers with:

Bad Habit #1) Don’t notice when you’re confused.

Bad Habit #2) All authoritative ideas / all new ideas / all ideas that have a few plausible reasons to support them, are true.

By the way, I really enjoy this series. It is interesting to read another person’s perspective on Yudkowsky’s blogwork.

Luke Muehlhauser January 5, 2011 at 9:59 am

Dan,

Ah, thanks.
