Reading Yudkowsky, part 15

by Luke Muehlhauser on February 17, 2011 in Eliezer Yudkowsky, Resources, Reviews

AI researcher Eliezer Yudkowsky is something of an expert at human rationality, and at teaching it to others. His hundreds of posts at Overcoming Bias (now moved to Less Wrong) are a treasure trove for those who want to improve their own rationality. As such, I’m reading all of them, chronologically.

I suspect some of my readers want to improve their rationality, too. So I’m keeping a diary of my Yudkowsky reading. Feel free to follow along.

His 89th post is “Science” as a Curiosity-Stopper:

If you thought the light bulb was scientifically inexplicable, it would seize the entirety of your attention.  You would drop whatever else you were doing, and focus on that light bulb.

But what does the phrase “scientifically explicable” mean?  It means that someone else knows how the light bulb works.  When you are told the light bulb is “scientifically explicable”, you don’t know more than you knew earlier; you don’t know whether the light bulb will brighten or fade.  But because someone else knows, it devalues the knowledge in your eyes.  You become less curious.

But is that really true? It isn’t for me. When I explain how the light bulb works by saying “Electricity!” I don’t pretend that I’ve offered a thorough explanation, nor does my curiosity abate. But usually, I recognize that I don’t have time to learn the equations and work through them. That’s not where my comparative advantage lies.

In Absurdity Heuristic, Absurdity Bias, Eliezer struggles toward a formal definition of what we mean when we call something “absurd” (and why we should avoid doing so). Availability concerns the availability heuristic: we overestimate the probability of events we can easily call to mind. Why is the Future So Absurd? attempts to answer the question:

Why, historically, has the future so often turned out to be more “absurd” than people seem to have expected?

Anchoring and Adjustment explains the well-known phenomenon of anchoring, which affects our judgment in all kinds of silly ways and is worth watching out for. The Crackpot Offer gives us a personal story of a time when young Eliezer had to say “oops” and move on.

Radical Honesty concerns Brad Blanton and his call for people to commit themselves to telling others exactly and honestly what they’re thinking, all the time, including “I think you’re fat.” Yudkowsky wonders:

Will Blanton’s Rules ever catch on?  I worry that Radical Honesty would selectively disadvantage rationalists in human relationships.  Broadcasting your opinions is much easier when you can deceive yourself about anything you’d feel uncomfortable saying to others.  I wonder whether practitioners of Radical Honesty tend to become more adept at self-deception, as they stop being able to tell white lies or admit private thoughts to themselves.  I have taken a less restrictive kind of honesty upon myself – to avoid statements that are literally false  – and I know that this becomes more and more difficult, more and more of a disadvantage, as I deceive myself less and less.

We Don’t Really Want Your Participation is a little post about a Singularity Summit.

Applause Lights is a neat little post about how some words contribute no meaning to a proposition, but instead function as applause lights, cueing the audience to applaud. Words like “democracy,” sometimes. Eliezer says he is tempted to write a speech made entirely of applause lights and see how long it takes the audience to catch on and start laughing:

I am here to propose to you today that we need to balance the risks and opportunities of advanced Artificial Intelligence.  We should avoid the risks and, insofar as it is possible, realize the opportunities.  We should not needlessly confront entirely unnecessary dangers.  To achieve these goals, we must plan wisely and rationally.  We should not act in fear and panic, or give in to technophobia; but neither should we act in blind enthusiasm.  We should respect the interests of all parties with a stake in the Singularity.  We must try to ensure that the benefits of advanced technologies accrue to as many individuals as possible, rather than being restricted to a few.  We must try to avoid, as much as possible, violent conflicts using these technologies; and we must prevent massive destructive capability from falling into the hands of individuals.  We should think through these issues before, not after, it is too late to do anything about them…

Rationality and the English Language is a rationalist’s update to George Orwell’s Politics and the English Language, and both are highly worth reading. Human Evil and Muddled Thinking stays with Orwell, who wrote:

If you simplify your English, you are freed from the worst follies of orthodoxy. You cannot speak any of the necessary dialects, and when you make a stupid remark its stupidity will be obvious, even to yourself.

Eliezer responds:

To make our stupidity obvious, even to ourselves – this is the heart of Overcoming Bias.

Evil sneaks, hidden, through the unlit shadows of the mind.  We look back with the clarity of history, and weep to remember the planned famines of Stalin and Mao, which killed tens of millions.  We call this evil, because it was done by deliberate human intent to inflict pain and death upon innocent human beings.  We call this evil, because of the revulsion that we feel against it, looking back with the clarity of history.  For perpetrators of evil to avoid its natural opposition, the revulsion must remain latent.  Clarity must be avoided at any cost.  Even as humans of clear sight tend to oppose the evil that they see; so too does human evil, wherever it exists, set out to muddle thinking.

Doublethink (Choosing to be Biased) wonders:

What if self-deception helps us be happy?  What if just running out and overcoming bias will make us – gasp! – unhappy? Surely, true wisdom would be second-order rationality, choosing when to be rational.  That way you can decide which cognitive biases should govern you, to maximize your happiness.

Leaving the morality aside, I doubt such a lunatic dislocation in the mind could really happen.

Second-order rationality implies that at some point, you will think to yourself, “And now, I will irrationally believe that I will win the lottery, in order to make myself happy.”  But we do not have such direct control over our beliefs.  You cannot make yourself believe the sky is green by an act of will.  You might be able to believe you believed it – though I have just made that more difficult for you by pointing out the difference.  (You’re welcome!)  You might even believe you were happy and self-deceived; but you would not in fact be happy and self-deceived…

You can’t know the consequences of being biased, until you have already debiased yourself.  And then it is too late for self-deception.

That reads like a description of my own deconversion. When I figured out, quite unintentionally, that God does not exist, I finally had a “choice.” But by then it was too late. I actually prayed for God to “roll back” my self-education so I could believe again (just in case some kind of vague non-Christian God existed), but it never happened.

Why I’m Blooking explains why Eliezer blogs.

Garren February 17, 2011 at 9:04 am

I’ve run into this notion of second-order rationality before, in Walter Sinnott-Armstrong’s typology of justification in Moral Skepticisms. He draws a distinction between being “instrumentally justified” and “epistemically justified.” What he means is that there are things we’re justified in believing are true by means of good epistemology…and things that we’re practically justified in believing because the state of believing them advances important goals besides having true beliefs.

The example I give (click on my name above to see the post) is that of an Atheist married to a Theist who wishes she could believe in God because having a better relationship with her spouse is more important to her than holding a true belief. If she’s offered a pill which will alter her mind so that she will believe in God, she would be instrumentally justified in taking it. We could even say her artificially-induced God belief is itself instrumentally justified.

Yudkowsky says “there is no second-order rationality.” I agree we can’t voluntarily choose arbitrary beliefs (“just believe” is not effective advice), but even today we can choose to take certain drugs which have crippling effects on rational thinking. To adapt an example I heard somewhere (forgot where!), consider a man who is being lethally threatened unless he helps the threatening person sneak into a secure building in the next hour. Further suppose he injects himself with a drug he knows will make him very obviously crazy, to the point of not caring about his own life and being unable to sneak anyone into the building unnoticed. Under the drug’s effects he won’t be acting rationally, but it might be rational for him to choose to take the drug and enter a state of irrationality. Not caring about survival — in a way transparent to others — may be his best strategy to survive.

By the way, John Crichton on Farscape occasionally appears to employ this tactic without the need for special drugs (e.g. “Look at the Princess” part 2). His character is barely holding on to sanity half the time anyway.

Garren February 17, 2011 at 9:21 am

Is there a character count limit on comments that I would have hit with a 2100 character (340 word) reply? I thought I submitted it, waited a while and refreshed…then tried again. Nada!

(‘Blooking’ is a terrible word.)

Luke Muehlhauser February 17, 2011 at 1:47 pm

Garren,

Akismet spam filter grabbed your posts. I have freed them.

Adito February 17, 2011 at 7:32 pm

If she’s offered a pill which will alter her mind so that she will believe in God, she would be instrumentally justified in taking it.

I don’t see any belief based on instrumentality here. She makes a choice based on certain values she has, but the justification for her belief is something completely different from this choice. Presumably, once she’s taken the pill, the belief will be based on her altered biology and not the instrumentality of the belief.

I think it’s very rare that you can actually choose to believe something and even if you accomplish that task you certainly aren’t justified in your belief. Your examples involve the justification of a choice that leads to an involuntary belief and a justification of beliefs arrived at in this way.

Steven R. February 17, 2011 at 7:53 pm

Hm, I’m not sure I necessarily agree with Yudkowsky on “Science As a Curiosity Stopper” in its entirety. I understand its concept, and to a large degree agree with it, but after a while, some concepts are extremely complex and rely upon years of expertise in a subject area to understand. So I think you are somewhat justified in saying “well, it’s science” or “mathematics” or whatever and taking the professional at his word. Of course, thinking that this is an explanation is folly, but at least knowing that a professional who has researched the topic extensively can explain the phenomenon seems good enough.

Not only that, but if you have no time to investigate all the formulas and complexity of a light bulb, can you be blamed for answering “electricity” when asked how light bulbs work? Once again, I don’t see the problem so long as you understand this isn’t a full explanation and the answer is much more complex and nuanced. I don’t know – any other thoughts on this?

@Garren:
Interesting post about that and the Consumer Review Magazine.

@Adito:

Maybe I interpreted the example wrong, or what you’re saying wrong, but I think Garren’s example holds up. As I understand it, a belief based on instrumentality isn’t based on reason, but rather is adopted to achieve some end other than the belief itself, so taking a pill to believe in God for the purpose of improving your marriage would seem to be just the thing that exemplifies instrumental belief.

Adito February 19, 2011 at 4:37 pm

Steven,

“Your examples involve the justification of a choice that leads to an involuntary belief and a justification of beliefs arrived at in this way. ”

Should have read “and not a justification of beliefs arrived at in this way”

I’m not disputing that we form beliefs based on something other than reason, or that this may be the best thing to do based on some values we hold. However, as long as we take the justification of a belief to be more than just an explanation of a belief, I don’t think instrumental beliefs can be justified. Consider a woman who has lost 9 children to war. An instrumental belief is that this occurred for a greater purpose, and perhaps this belief is the “best” choice for her because the alternative is soul-crushing depression. But I don’t see how we could call this belief justified rather than simply explained (“excused” might be a better word). If we’re interested in tracking the truth of the matter, then I think it’s clear we should eliminate instrumental beliefs as far as we possibly can.
