Reading Yudkowsky, part 26

by Luke Muehlhauser on April 5, 2011 in Eliezer Yudkowsky,Resources,Reviews

AI researcher Eliezer Yudkowsky is something of an expert at human rationality, and at teaching it to others. His hundreds of posts at Less Wrong are a treasure trove for those who want to improve their own rationality. As such, I’m reading all of them, chronologically.

I suspect some of my readers want to “level up” their rationality, too. So I’m keeping a diary of my Yudkowsky reading. Feel free to follow along.

His 203rd post describes Asch’s Conformity Experiment. Some conclusions are given in On Expressing Your Concerns:

The scary thing about Asch’s conformity experiments is that you can get many people to say black is white, if you put them in a room full of other people saying the same thing.  The hopeful thing about Asch’s conformity experiments is that a single dissenter tremendously drove down the rate of conformity, even if the dissenter was only giving a different wrong answer.  And the wearisome thing is that dissent was not learned over the course of the experiment – when the single dissenter started siding with the group, rates of conformity rose back up.

Being a voice of dissent can bring real benefits to the group.  But it also (famously) has a cost.  And then you have to keep it up.  Plus you could be wrong.

I think the most important lesson to take away from Asch’s experiments is to distinguish “expressing concern” from “disagreement”.  Raising a point that others haven’t voiced is not a promise to disagree with the group at the end of its discussion.

The ideal Bayesian’s process of convergence involves sharing evidence that is unpredictable to the listener.  The Aumann agreement result holds only for common knowledge, where you know, I know, you know I know, etc.  Hanson’s post or paper on “We Can’t Foresee to Disagree” provides a picture of how strange it would look to watch ideal rationalists converging on a probability estimate; it doesn’t look anything like two bargainers in a marketplace converging on a price.

Unfortunately, there’s not much difference socially between “expressing concerns” and “disagreement”.
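The "sharing evidence that is unpredictable to the listener" point can be made concrete with a toy example. Below is a minimal sketch (my own illustration, not from the post): two agents share a prior over a coin's bias, each privately observes some flips, and once they pool their raw evidence they compute identical posteriors. The hypothesis space, counts, and function name are all assumptions chosen for the demo.

```python
def posterior(heads, tails, biases=(0.3, 0.7), prior=(0.5, 0.5)):
    """Posterior over candidate coin biases given observed flip counts."""
    likelihoods = [b ** heads * (1 - b) ** tails for b in biases]
    joint = [l * p for l, p in zip(likelihoods, prior)]
    total = sum(joint)
    return [j / total for j in joint]

# Agent A privately saw 7 heads in 10 flips; agent B saw 4 heads in 10.
# Before sharing, their beliefs point in opposite directions:
p_a = posterior(7, 3)   # favors the 0.7-bias hypothesis
p_b = posterior(4, 6)   # favors the 0.3-bias hypothesis

# After exchanging their raw counts, both condition on the pooled
# evidence (11 heads, 9 tails) and arrive at the same posterior:
pooled = posterior(7 + 4, 3 + 6)
```

The convergence here is driven entirely by evidence the listener could not have predicted in advance; neither agent "bargains" toward the other's number, which is the contrast the quoted passage draws with haggling over a price.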

Lonely Dissent adds:

If there’s one thing I can’t stand, it’s fakeness – you may have noticed this if you’ve been reading Overcoming Bias for a while.  Well, lonely dissent has got to be one of the most commonly, most ostentatiously faked characteristics around.  Everyone wants to be an iconoclast.

I don’t mean to degrade the act of joining a rebellion.  There are rebellions worth joining.  It does take courage to brave the disapproval of your peer group, or perhaps even worse, their shrugs.  Needless to say, going to a rock concert is not rebellion.  But, for example, vegetarianism is.  I’m not a vegetarian myself, but I respect people who are, because I expect it takes a noticeable amount of quiet courage to tell people that hamburgers won’t work for dinner.  (Albeit that in the Bay Area, people ask as a matter of routine.)

To Lead, You Must Stand Up tells a story which reminds us that you must stand out (sometimes like a clown) to lead.

Cultish Countercultishness remarks:

Point one:  “Cults” and “non-cults” aren’t separated natural kinds like dogs and cats.  If you look at any list of cult characteristics, you’ll see items that could easily describe political parties and corporations – “group members encouraged to distrust outside criticism as having hidden motives”, “hierarchical authoritative structure”.  I’ve posted on group failure modes like group polarization, happy death spirals, uncriticality, and evaporative cooling, all of which seem to feed on each other.  When these failures swirl together and meet, they combine to form a Super-Failure stupider than any of the parts, like Voltron.  But this is not a cult essence; it is a cult attractor.

…If you are terribly nervous about cultishness, then you will want to deny any hint of any characteristic that resembles a cult.  But any group with a goal seen in a positive light, is at risk for the halo effect, and will have to pump against entropy to avoid an affective death spiral.  This is true even for ordinary institutions like political parties – people who think that “liberal values” or “conservative values” can cure cancer, etc.  It is true for Silicon Valley startups, both failed and successful.  It is true of Mac users and of Linux users.  The halo effect doesn’t become okay just because everyone does it; if everyone walks off a cliff, you wouldn’t too.  The error in reasoning is to be fought, not tolerated.  But if you’re too nervous about “Are you sure this isn’t a cult?” then you will be reluctant to see any sign of cultishness, because that would imply you’re in a cult, and It’s not a cult!! So you won’t see the current battlefields where the ordinary tendencies toward cultishness are creeping forward, or being pushed back.

But that’s only one tiny point in a long post worth reading.

My Strange Beliefs comments:

Just because this blog is called Overcoming Bias, it does not mean that any time any author says something you disagree with, you should comment “OMG!  How biased! I am sooo disappointed in you I thought you would do better.”  Part of the art of rationality is having extended discussions with people you disagree with.  “OMG U R BIASED!” does not present much basis for continuing discussion.

{ 4 comments… read them below or add one }

Jacopo April 5, 2011 at 5:51 am

More great stuff, Luke. These summary posts are becoming a real favourite of mine.

Has he found a publisher yet for his book? And what would be the best book(s) to read in the interim? Irrationality by Stuart Sutherland is one you’ve mentioned before, any others?


Zeb April 5, 2011 at 12:52 pm

I am enjoying these more and more, but they really just serve as a teaser – I need more detail if I am to evaluate and incorporate the lessons. Do you recommend reading straight through Yudkowsky’s posts, or is there a more digested and cohesive presentation?


Luke Muehlhauser April 5, 2011 at 4:57 pm

That’s what this is – a teaser so you can pick out which posts you want to read. Until Eliezer’s book comes out, there isn’t a more digested presentation available.


Brian April 6, 2011 at 7:13 am

I have a partially digested and regurgitated presentation I am making by quoting select bits of the articles and (mostly Yudkowsky’s) comments. A few posts are represented in full, most have one small paragraph or so in my agglomeration. It began as a sort of favorites list from random posts but then I went back and did it from the beginning in order. The purposes were to enable me to remind myself of what I had learned very quickly and enable my very intelligent brother to get a solid idea of the contents by reading it once.

Organization is minimal but content compression is high. Restated principles and miniature introductions to topics which are necessarily part of the blogging process, where any given post might be the reader’s first exposure, are omitted. This means that an idea important enough for Yudkowsky to have brought up in thirty posts might be in my list only once, hopefully in its snappiest form, and after other ideas that depend on it.

I didn’t even try to compress most of the physics stuff.

I also transcribed quotes and bits from twelve or so hours of audio and video.

Although I have read all the posts, I have not yet cut and pasted bits from all of them.
