Reading Yudkowsky, part 25

by Luke Muehlhauser on April 3, 2011 in Eliezer Yudkowsky, Resources, Reviews

AI researcher Eliezer Yudkowsky is something of an expert at human rationality, and at teaching it to others. His hundreds of posts at Less Wrong are a treasure trove for those who want to improve their own rationality. As such, I’m reading all of them, chronologically.

I suspect some of my readers want to “level up” their rationality, too. So I’m keeping a diary of my Yudkowsky reading. Feel free to follow along.

His 185th post is Evaporative Cooling of Group Beliefs:

Early studiers of cults were surprised to discover that when cults receive a major shock – a prophecy fails to come true, a moral flaw of the founder is revealed – they often come back stronger than before, with increased belief and fanaticism…

Why would a group belief become stronger after encountering crushing counterevidence?

The conventional interpretation of this phenomenon is based on cognitive dissonance.  When people have taken “irrevocable” actions in the service of a belief – given away all their property in anticipation of the saucers landing – they cannot possibly admit they were mistaken. The challenge to their belief presents an immense cognitive dissonance; they must find reinforcing thoughts to counter the shock, and so become more fanatical.

Yudkowsky has another contributing theory:

In Festinger’s classic “When Prophecy Fails”, one of the cult members walked out the door immediately after the flying saucer failed to land.  Who gets fed up and leaves first? An average cult member?  Or a relatively more skeptical member, who previously might have been acting as a voice of moderation, a brake on the more fanatic members?

After [skeptical members] escape, the remaining discussions will be between the extreme fanatics on one end and the slightly less extreme fanatics on the other end, with the group consensus somewhere in the “middle”.

Yudkowsky gives other possible examples of this phenomenon:

When Ayn Rand’s long-running affair with Nathaniel Branden was revealed to the Objectivist membership, a substantial fraction of the Objectivist membership broke off and followed Branden into espousing an “open system” of Objectivism not bound so tightly to Ayn Rand.  Who stayed with Ayn Rand even after the scandal broke?  The ones who really, really believed in her – and perhaps some of the undecideds, who, after the voices of moderation left, heard arguments from only one side.  This may account for how the Ayn Rand Institute is (reportedly) more fanatic after the breakup, than the original core group of Objectivists under Branden and Rand.

A few years back, I was on a transhumanist mailing list where a small group espousing “social democratic transhumanism” vitriolically insulted every libertarian on the list.  Most libertarians left the mailing list, most of the others gave up on posting.  As a result, the remaining group shifted substantially to the left.

Frankly, this reminds me of PZ Myers’ blog. Because of PZ’s personality, anyone there who is not vitriolic toward religion is insulted and demeaned, and so those people leave the blog. The remaining bunch is, unsurprisingly, decidedly more vitriolic on average.

When None Dare Urge Restraint recalls that the reaction to 9/11 was so much worse than 9/11 itself. The resulting wars have caused more than 30 times as many casualties, for example. And how did this happen? Because of a spiral of hate:

I think the best illustration was “the suicide hijackers were cowards”.  Some common sense, please?  It takes a little courage to voluntarily fly your plane into a building.  Of all their sins, cowardice was not on the list.  But I guess anything bad you say about a terrorist, no matter how silly, must be true.  Would I get even more brownie points if I accused al Qaeda of having assassinated John F. Kennedy?  Maybe if I accused them of being Stalinists?  Really, cowardice?

Yes, it matters that the 9/11 hijackers weren’t cowards.  Not just for understanding the enemy’s realistic psychology.  There is simply too much damage done by spirals of hate.  It is just too dangerous for there to be any target in the world, whether it be the Jews or Adolf Hitler, about whom saying negative things trumps saying accurate things.

…If the USA had completely ignored the 9/11 attack – just shrugged and rebuilt the building – it would have been better than the real course of history.  But that wasn’t a political option.  Even if anyone privately guessed that the immune response would be more damaging than the disease, American politicians had no career-preserving choice but to walk straight into al Qaeda’s trap.  Whoever argues for a greater response is a patriot.  Whoever dissects a patriotic claim is a traitor.

The next post describes the fascinating Robbers Cave Experiment.

Misc Meta just covers some Overcoming Bias news.

Every Cause Wants to be a Cult discusses the allegation that Wikipedia’s top administrators are engaging in cult-like behavior:

The ingroup-outgroup dichotomy is part of ordinary human nature.  So are happy death spirals and spirals of hate.  A Noble Cause doesn’t need a deep hidden flaw for its adherents to form a cultish in-group.  It is sufficient that the adherents be human.  Everything else follows naturally, decay by default, like food spoiling in a refrigerator after the electricity goes off.

In the same sense that every thermal differential wants to equalize itself, and every computer program wants to become a collection of ad-hoc patches, every Cause wants to be a cult.  It’s a high-entropy state into which the system trends, an attractor in human psychology.  It may have nothing to do with whether the Cause is truly Noble.  You might think that a Good Cause would rub off its goodness on every aspect of the people associated with it – that the Cause’s followers would also be less susceptible to status games, ingroup-outgroup bias, affective spirals, leader-gods.  But believing one true idea won’t switch off the halo effect.  A noble cause won’t make its adherents something other than human.

On one notable occasion there was a group that went semicultish whose rallying cry was “Rationality!  Reason!  Objective reality!”  (More on this in future posts.)  Labeling the Great Idea “rationality” won’t protect you any more than putting up a sign over your house that says “Cold!”  You still have to run the air conditioner – expend the required energy per unit time to reverse the natural slide into cultishness.  Worshipping rationality won’t make you sane any more than worshipping gravity enables you to fly.  You can’t talk to thermodynamics and you can’t pray to probability theory.  You can use it, but not join it as an in-group.

Cultishness is quantitative, not qualitative.  The question is not “Cultish, yes or no?” but “How much cultishness and where?”  Even in Science, which is the archetypal Genuinely Truly Noble Cause, we can readily point to the current frontiers of the war against cult-entropy, where the current battle line creeps forward and back.  Are journals more likely to accept articles with a well-known authorial byline, or from an unknown author from a well-known institution, compared to an unknown author from an unknown institution?  How much belief is due to authority and how much is from the experiment?  Which journals are using blinded reviewers, and how effective is blinded reviewing?

Next, Reversed Stupidity is Not Intelligence:

If you knew someone who was wrong 99.99% of the time on yes-or-no questions, you could obtain 99.99% accuracy just by reversing their answers.  They would need to do all the work of obtaining good evidence entangled with reality, and processing that evidence coherently, just to anticorrelate that reliably.  They would have to be superintelligent to be that stupid.

A car with a broken engine cannot drive backward at 200 mph, even if the engine is really really broken.

If stupidity does not reliably anticorrelate with truth, how much less should human evil anticorrelate with truth?  The converse of the halo effect is the horns effect:  All perceived negative qualities correlate.  If Stalin is evil, then everything he says should be false.  You wouldn’t want to agree with Stalin, would you?

Stalin also believed that 2 + 2 = 4.  Yet if you defend any statement made by Stalin, even “2 + 2 = 4”, people will see only that you are “agreeing with Stalin”; you must be on his side.
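
The anticorrelation point is easy to check numerically. Here’s a quick Python sketch of my own (not from Yudkowsky’s post; the accuracy numbers are arbitrary): an oracle that is reliably wrong can be reversed into a reliable truth-teller, while an oracle that is merely at chance carries no information in either direction.

```python
import random

def oracle(truth: bool, accuracy: float) -> bool:
    """Answer a yes/no question; correct with probability `accuracy`."""
    return truth if random.random() < accuracy else not truth

def raw_and_reversed_accuracy(accuracy: float, trials: int = 100_000):
    """Empirical accuracy of the oracle's answers, and of their negations."""
    raw = flipped = 0
    for _ in range(trials):
        truth = random.random() < 0.5
        answer = oracle(truth, accuracy)
        raw += answer == truth
        flipped += (not answer) == truth
    return raw / trials, flipped / trials

# A reliably wrong oracle is as informative as a reliably right one:
print(raw_and_reversed_accuracy(0.0001))  # ~(0.0001, 0.9999): reverse it and win
# Ordinary stupidity is uncorrelated noise, not anticorrelation:
print(raw_and_reversed_accuracy(0.5))     # ~(0.5, 0.5): reversing gains nothing
```

As the second line of output shows, being reliably wrong is the hard part; mere human stupidity hovers uselessly near chance, which is the whole point of the broken-engine analogy.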

Argument Screens Off Authority describes an asymmetry between argument and authority:

Scenario 1:  Barry is a famous geologist.  Charles is a fourteen-year-old juvenile delinquent with a long arrest record and occasional psychotic episodes.  Barry flatly asserts to Arthur some counterintuitive statement about rocks, and Arthur judges it 90% probable.  Then Charles makes an equally counterintuitive flat assertion about rocks, and Arthur judges it 10% probable.  Clearly, Arthur is taking the speaker’s authority into account in deciding whether to believe the speaker’s assertions.

Scenario 2:  David makes a counterintuitive statement about physics and gives Arthur a detailed explanation of the arguments, including references.  Ernie makes an equally counterintuitive statement, but gives an unconvincing argument involving several leaps of faith.  Both David and Ernie assert that this is the best explanation they can possibly give (to anyone, not just Arthur).  Arthur assigns 90% probability to David’s statement after hearing his explanation, but assigns a 10% probability to Ernie’s statement.

It might seem like these two scenarios are roughly symmetrical: both involve taking into account useful evidence, whether strong versus weak authority,  or strong versus weak argument.

But now suppose that Arthur asks Barry and Charles to make full technical cases, with references; and that Barry and Charles present equally good cases, and Arthur looks up the references and they check out.  Then Arthur asks David and Ernie for their credentials, and it turns out that David and Ernie have roughly the same credentials – maybe they’re both clowns, maybe they’re both physicists.

Assuming that Arthur is knowledgeable enough to understand all the technical arguments – otherwise they’re just impressive noises – it seems that Arthur should view David as having a great advantage in plausibility over Ernie, while Barry has at best a minor advantage over Charles.

Indeed, if the technical arguments are good enough, Barry’s advantage over Charles may not be worth tracking.  A good technical argument is one that eliminates reliance on the personal authority of the speaker.

Similarly, if we really believe Ernie that the argument he gave is the best argument he could give, which includes all of the inferential steps that Ernie executed, and all of the support that Ernie took into account – citing any authorities that Ernie may have listened to himself – then we can pretty much ignore any information about Ernie’s credentials.  Ernie can be a physicist or a clown, it shouldn’t matter.  (Again, this assumes we have enough technical ability to process the argument.  Otherwise, Ernie is simply uttering mystical syllables, and whether we “believe” these syllables depends a great deal on his authority.)

So it seems there’s an asymmetry between argument and authority.  If we know authority we are still interested in hearing the arguments; but if we know the arguments fully, we have very little left to learn from authority.
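
The “screening off” here is conditional independence in a causal chain: the truth of the claim determines what arguments exist, and the arguments determine what the authority believes. Here’s a minimal sketch of that structure (the probability tables are made-up numbers of my own, not the post’s): once the full argument is observed, the authority’s opinion stops moving the posterior at all.

```python
# Toy causal chain: Truth -> Argument quality -> Authority's opinion.
# All probability tables below are illustrative assumptions, not the post's.
p_truth = {True: 0.5, False: 0.5}
p_arg = {True: {"good": 0.8, "weak": 0.2},          # P(argument | truth)
         False: {"good": 0.1, "weak": 0.9}}
p_auth = {"good": {"endorse": 0.9, "reject": 0.1},  # P(opinion | argument)
          "weak": {"endorse": 0.3, "reject": 0.7}}

def joint(t, a, e):
    return p_truth[t] * p_arg[t][a] * p_auth[a][e]

def p_true_given(arg=None, auth=None):
    """Posterior P(Truth | whichever evidence we observed)."""
    def mass(t):
        return sum(joint(t, a, e)
                   for a in ("good", "weak") if arg in (None, a)
                   for e in ("endorse", "reject") if auth in (None, e))
    return mass(True) / (mass(True) + mass(False))

print(p_true_given(auth="endorse"))              # ~0.684: authority alone is evidence
print(p_true_given(arg="good"))                  # ~0.889
print(p_true_given(arg="good", auth="endorse"))  # ~0.889: authority adds nothing...
print(p_true_given(arg="good", auth="reject"))   # ~0.889: ...once the argument is known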

The rest of the post is a technical Bayesian explanation of this asymmetry. Hug the Query continues:

In the art of rationality there is a discipline of closeness-to-the-issue – trying to observe evidence that is as near to the original question as possible, so that it screens off as many other arguments as possible.

The Wright Brothers say, “My plane will fly.”  If you look at their authority (bicycle mechanics who happen to be excellent amateur physicists) then you will compare their authority to, say, Lord Kelvin, and you will find that Lord Kelvin is the greater authority.

If you demand to see the Wright Brothers’ calculations, and you can follow them, and you demand to see Lord Kelvin’s calculations (he probably doesn’t have any apart from his own incredulity), then authority becomes much less relevant.

If you actually watch the plane fly, the calculations themselves become moot for many purposes, and Kelvin’s authority not even worth considering.

The more directly your arguments bear on a question, without intermediate inferences – the closer the observed nodes are to the queried node, in the Great Web of Causality – the more powerful the evidence.  It’s a theorem of these causal graphs that you can never get more information from distant nodes, than from strictly closer nodes that screen off the distant ones.

…Whenever you can, dance as near to the original question as possible – press yourself up against it – get close enough to hug the query!
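
That theorem is the data-processing inequality from information theory: in a chain Query → Near → Far, the far node can never carry more mutual information about the query than the near node that screens it off. A small sketch, again with illustrative numbers of my own:

```python
from itertools import product
from math import log2

# Chain: Query -> Near -> Far. All numbers are illustrative assumptions.
p_q = {1: 0.5, 0: 0.5}
p_near = {1: {1: 0.9, 0: 0.1}, 0: {1: 0.2, 0: 0.8}}  # P(near | query)
p_far = {1: {1: 0.7, 0: 0.3}, 0: {1: 0.4, 0: 0.6}}   # P(far | near)

def joint(q, n, f):
    return p_q[q] * p_near[q][n] * p_far[n][f]

def mutual_info(project):
    """I(X; Y) where (x, y) = project(q, n, f), computed over the full joint."""
    pxy, px, py = {}, {}, {}
    for q, n, f in product((0, 1), repeat=3):
        x, y = project(q, n, f)
        p = joint(q, n, f)
        pxy[x, y] = pxy.get((x, y), 0.0) + p
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return sum(p * log2(p / (px[x] * py[y]))
               for (x, y), p in pxy.items() if p > 0)

print(mutual_info(lambda q, n, f: (q, n)))  # I(Query; Near)
print(mutual_info(lambda q, n, f: (q, f)))  # I(Query; Far) -- always <= the above
```

Watching the plane fly is conditioning directly on the nearest node; Kelvin’s incredulity sits several noisy links away.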

Guardians of the Truth opens:

The criticism is sometimes leveled against rationalists:  “The Inquisition thought they had the truth!  Clearly this ‘truth’ business is dangerous.”

There are many obvious responses, such as “If you think that possessing the truth would license you to torture and kill, you’re making a mistake that has nothing to do with epistemology.”  Or, “So that historical statement you just made about the Inquisition – is it true?”

…And yet… I think the Inquisition’s attitude toward truth played a role.  The Inquisition believed that there was such a thing as truth, and that it was important; well, likewise Richard Feynman.  But the Inquisitors were not Truth-Seekers.  They were Truth-Guardians.

Why don’t scientists torture those who reject atomic theory?

Questions like that don’t have neat single-factor answers.  But I would argue that one of the factors has to do with assuming a defensive posture toward the truth, versus a productive posture toward the truth.

Guardians of the Gene Pool continues:

Like any educated denizen of the 21st century, you may have heard of World War II.  You may remember that Hitler and the Nazis planned to carry forward a romanticized process of evolution, to breed a new master race, supermen, stronger and smarter than anything that had existed before.

Actually this is a common misconception.  Hitler believed that the Aryan superman had previously existed – the Nordic stereotype, the blond blue-eyed beast of prey – but had been polluted by mingling with impure races.  There had been a racial Fall from Grace.

It says something about the degree to which the concept of progress permeates Western civilization, that the one is told about Nazi eugenics and hears “They tried to breed a superhuman.”  You, dear reader – if you failed hard enough to endorse coercive eugenics, you would try to create a superhuman.  Because you locate your ideals in your future, not in your past.  Because you are creative. The thought of breeding back to some Nordic archetype from a thousand years earlier would not even occur to you as a possibility – what, just the Vikings? That’s all? If you failed hard enough to kill, you would damn well try to reach heights never before reached, or what a waste it would all be, eh?  Well, that’s one reason you’re not a Nazi, dear reader.

Guardians of Ayn Rand applies the thought to Objectivism:

Ayn Rand’s novels glorify technology, capitalism, individual defiance of the System, limited government, private property, selfishness. Her ultimate fictional hero, John Galt, was a scientist who invented a new form of cheap renewable energy; but then refused to give it to the world, since the profits would only be stolen to prop up corrupt governments.

And then – somehow – it all turned into a moral and philosophical “closed system” with Ayn Rand at the center.  The term “closed system” is not my own accusation; it’s the term the Ayn Rand Institute uses to describe Objectivism.  Objectivism is defined by the works of Ayn Rand.  Now that Rand is dead, Objectivism is closed.  If you disagree with Rand’s works in any respect, you cannot be an Objectivist.

He also makes an interesting point about the opportunity to be a rationalist:

Rand wrote about “rationality”, yet failed to familiarize herself with the modern research in heuristics and biases.  How can anyone claim to be a master rationalist, yet know nothing of such elementary subjects?

“Wait a minute,” objects the reader, “that’s not quite fair!  Atlas Shrugged was published in 1957!  Practically nobody knew about Bayes back then.”  Bah.  Next you’ll tell me that Ayn Rand died in 1982, and had no chance to read Judgment Under Uncertainty: Heuristics and Biases, which was published that same year.

Science isn’t fair.  That’s sorta the point.  An aspiring rationalist in 2007 starts with a huge advantage over an aspiring rationalist in 1957.  It’s how we know that progress has occurred.

The Litany Against Gurus is a short poem about gurus. Politics and Awful Art argues against, among other things, atheistic hymns. Two Cult Koans tells two very short stories about avoiding cultishness.

False Laughter offers a warning:

There’s this thing called “derisive laughter” or “mean-spirited laughter”, which follows from seeing the Hated Enemy get a kick in the pants.  It doesn’t have to be an unexpected kick in the pants, or a kick followed up with a custard pie.  It suffices that the Hated Enemy gets hurt.  It’s like humor, only without the humor…

If you find yourself in a group of people who tell consistently unfunny jokes about the Hated Enemy, it may be a good idea to head for the hills, before you start to laugh as well…

Another application:  You and I should be allowed not to laugh at certain jokes – even jokes that target our own favorite causes – on the grounds that the joke is too predictable to be funny.  We should be able to do this without being accused of being humorless, “unable to take a joke”, or protecting sacred cows.  If labeled-Godzilla-stomps-a-labeled-building isn’t funny about “Bush” and “Science”, then it also isn’t funny about “libertarian economists” and “American national competitiveness”, etc.

The most scathing accusation I ever heard against Objectivism is that hardcore Objectivists have no sense of humor; but no one could prove this by showing an Objectivist a cartoon of Godzilla-“Rand” stomping on building-“humor” and demanding that he laugh.

Requiring someone to laugh in order to prove their non-cultishness – well, like most kinds of obligatory laughter, it doesn’t quite work.  Laughter, of all things, has to come naturally.  The most you can do is get fear and insecurity out of its way.

Effortless Technique argues that rationality need not be hard:

In the Hollywood version of rationality – or even the Traditional rationality that was passed down from supervisor to grad student in ancient days before Bayesianism – rationality is a great strain, a great effort, a continuous battle to coerce your mind into a desired shape.  Spock, the archetype of Hollywood’s concept of rationality, represses all his emotions.

And this great effort, they conceive, is virtue unto a rationalist.  The more effort you expend on forcing yourself into the mold, the better the rationalist you must be.  It’s like working extra hard at your job, as demanded by the Protestant work-ethic.  If the one works long hours – sweating, getting ulcers – surely the one must be worthy of praise?

This, I think, is an instance of a Lost Purpose.  People see that successful folk must sometimes make an effort, and so they conclude that effort of itself is virtuous whether or not it succeeds.

Still, being a rationalist is probably hard. You can’t just “go with the flow”:

According to authentic Taoism, you can exert no effort at all while accomplishing all worthwhile things.  This seems to me around as plausible as an agent that achieves its utility function using zero computing power and is therefore maximally intelligent.  The only way you could do it is if the agent assigns constant utility to all outcomes, or if the utility function’s maximum is set by sleight of hand to wherever the universe goes anyway.  This may be why I am not a Taoist:  “A maximally intelligent agent with zero computing power and no utility function” sounds like a good metaphor for the Tao.  I object to a metric of intelligence that makes me dumber than a rock.

Zen and the Art of Rationality examines (and rebuts) some comparisons between Bayesian rationality and Eastern traditional wisdom.

The Amazing Virgin Pregnancy proposes a new theory for the origins of Christianity.

Comments:

Chris Hallquist April 3, 2011 at 8:41 am

Re: PZ’s blog. I’ve heard others say similar things, but I haven’t any sense of whether they’re really true or not, because I pay next to no attention to blog comment sections. What’s the best evidence you’ve seen of this claim?

MarkD April 3, 2011 at 5:55 pm

In the same sense that every thermal differential wants to equalize itself, and every computer program wants to become a collection of ad-hoc patches, every Cause wants to be a cult. It’s a high-entropy state into which the system trends, an attractor in human psychology.

This kind of thing turns me off to Less Wrong. These claims have several dimensions of wrongness and are reminiscent of Marxism and Freudianism in presuming invisible forces that are only weakly justifiable.

drj April 3, 2011 at 6:15 pm

Re: PZ’s blog. I’ve heard others say similar things, but I haven’t any sense of whether they’re really true or not, because I pay next to no attention to blog comment sections. What’s the best evidence you’ve seen of this claim?

The comments on PZ’s blog, in my experience, are pretty bad, and the characterization is pretty accurate. They are mostly unreasonable, angry, militant, and ill-informed.

Chris Hallquist April 3, 2011 at 7:41 pm

drj,

I’m almost certain your statement is false. Militant? Like, advocating getting things done with guns and bombs? No, I’m betting you’re just going after them with whatever negative adjective is handy.

The “unreasonable, angry, and ill-informed” claim is more plausible, but you don’t provide any evidence of it, so I have little reason to believe it, given the above and given the fact that I often see atheists accused of those things for saying perfectly sensible stuff.

Garren April 3, 2011 at 8:42 pm

Another example of ‘evaporative cooling’ might be the high concentration of Jesus mythicism at FRDB.

Or, for those of us who believe there was a historical Jesus much as portrayed in the Gospels, in the early Jesus movement itself. Even the Gospels talk about people leaving the movement. It’s a decent bet these were the more moderate, critical followers.

antiplastic April 3, 2011 at 11:47 pm

Good thing the Singularity Movement in no way, shape or form resembles a cult, and that none of their predictions have ever been crushed by counterevidence!

And I loved the suggestion that the last decade would have been much better with another 10 years of Al Qaeda having a state sponsor. Pure comedy gold.

drj April 4, 2011 at 2:27 pm

drj,

I’m almost certain your statement is false. Militant? Like, advocating getting things done with guns and bombs? No, I’m betting you’re just going after them with whatever negative adjective is handy.

The “unreasonable, angry, and ill-informed” claim is more plausible, but you don’t provide any evidence of it, so I have little reason to believe it, given the above and given the fact that I often see atheists accused of those things for saying perfectly sensible stuff.

Well, no – they aren’t suicide bombers or anything… and come to think of it, I generally don’t have much of a problem with anyone being militant – but combined with the other aforementioned adjectives, it gets to be a problem. :) Maybe ‘strident’ is a better word.

In any case, I don’t know what kind of evidence you would want… just surf the comments there for a while, and you’ll eventually either agree, or you won’t.

PS – I’m atheist too…
