Sending Soldiers Across the Field of Battle

by Luke Muehlhauser on November 7, 2011 in Rationality

Months ago, a group of people including Anna Salamon gave a battery of standard rationality tests (e.g. Frederick’s CRT) to attendees of a Less Wrong meetup group. When she scored the tests, she discovered the group had hit the ceiling on most of the questions. For example, their mean CRT score was 2.69 (n=32), which is much better than Harvard and MIT students do on these tests.
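For readers unfamiliar with the CRT: scoring it is trivial, which is part of why ceiling effects appear so quickly. Below is a minimal sketch of the scoring arithmetic — the dictionary keys and the sample batch are hypothetical, though the three correct answers (5 cents, 5 minutes, 47 days) are the standard published ones:

```python
# A sketch (not the survey's actual scoring code) of how Frederick's
# 3-item Cognitive Reflection Test is scored: each item has an
# intuitive-but-wrong answer, and the score is simply the number of
# correct answers (0-3).
CORRECT = {
    "bat_and_ball": 5,     # cents (not the intuitive 10)
    "widget_machines": 5,  # minutes (not 100)
    "lily_pads": 47,       # days (not 24)
}

def crt_score(responses):
    """Number of the three items answered correctly."""
    return sum(responses.get(item) == answer for item, answer in CORRECT.items())

# Scoring a hypothetical batch and taking the group mean:
batch = [
    {"bat_and_ball": 5, "widget_machines": 5, "lily_pads": 47},   # 3/3
    {"bat_and_ball": 10, "widget_machines": 5, "lily_pads": 47},  # 2/3
]
mean_score = sum(crt_score(r) for r in batch) / len(batch)
```

A group mean like 2.69 on a test whose maximum is 3 is what a ceiling effect looks like: most respondents score 3/3, so the test can no longer discriminate among them.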

Rationality is different from intelligence, and rarer (see Stanovich’s What Intelligence Tests Miss). Rationality also seems to be teachable, but almost nobody is teaching it. As a result, almost everybody is thoroughly irrational in dozens of ways (see Kahneman’s Thinking, Fast and Slow).

Including us atheists. Which is why I write so many posts about atheists being wrong. A debate between an atheist and a theist almost always looks like two angry generals sending their soldiers across the field of battle, showing all the classic signs of motivated cognition and cognitive override failure.

The difference in rationality between a top-tier atheist blogger and a top-tier Christian blogger is often negligible. But the difference between either one of them and Anna Salamon is immediately noticeable. Mostly, I suspect, because Anna tries harder. She has read thousands of pages of material on the cognitive science of rationality, practices rationality exercises every week, and surrounds herself with some of the most rationality-dedicated people in the world.

Rationality is a real thing. That you can learn. With it, you can improve your life and avoid stupid mistakes, like dying of a curable disease even though you’re a billionaire because you wasted time on alternative medicine.


{ 31 comments… read them below or add one }

Leomar November 7, 2011 at 11:13 am

“practices rationality exercises every week”

Have links about those too?


gwern November 7, 2011 at 11:15 am

The problem with the CRT and others is that they are basically presenting questions or stories based on specific named biases; does reading about cognitive biases eliminate their validity? My standard example: IQ is great and important, but it’s easy to destroy the validity of subtests – study GRE vocab words with SRS! etc. – and even the matrix tests are trainable, without increasing ‘actual’ IQ.


kennethos November 7, 2011 at 11:39 am

“and avoid stupid mistakes, like dying of a curable disease even though you’re a billionaire because you wasted time on alternative medicine.”

I suppose, rationally, that’s one way to look at it. The article states that Jobs didn’t want to be “opened up” and “violated” via surgery. You may disagree with him on this, and have an alternate take. But Jobs’ thoughts and attitudes on his most slothful days were still miles above most of ours (and I suspect yours as well). Perhaps he wasted time at the end on particular methodologies in alternative medicine that didn’t pan out (alternately known by some as exploring the scientific method). His concerns about physical integrity, while not universally shared, are worthy of respect, at least. As well, what killed him may have been curable, but there is no guarantee that earlier treatment would have saved his life.


Bradm November 7, 2011 at 11:54 am

Off topic: I was watching a PBS special on Julia Robinson, and they interviewed a student who had won the Julia Robinson Prize at her high school. The student’s name was Anna Salamon. Do you know if the Anna Salamon at Less Wrong is the same as the one in the documentary? Just curious.


Reginald Selkirk November 7, 2011 at 2:07 pm

kennethos: (alternately know by some as exploring the scientific method).

I didn’t understand that. Could you try rephrasing it? Thanks.


Andy November 7, 2011 at 5:32 pm

Just curious, but how was it shown that LRers actually were more rational rather than simply being better practiced at the specific rationality tests?


MarkD November 7, 2011 at 10:02 pm

Just caught Daniel Kahneman on KQED Forum this morning.


Luke Muehlhauser November 8, 2011 at 12:46 am

Andy,

They hadn’t taken rationality tests before. However, no causal inference could be made from the way the test was done.


muto November 8, 2011 at 7:25 am

Is Anna Salamon related to the German mathematician Dietmar Salamon? They look similar and share the name.


Gregg November 8, 2011 at 10:41 am

+1, Leomar. What specific “rationality exercises” are being practiced?


Reginald Selkirk November 8, 2011 at 10:51 am

If you’re going to send soldiers across the field of battle, see that they are equipped with the latest in high tech weaponry


Michael Vassar November 8, 2011 at 11:39 am

Let’s not deify fallen heroes, Kennethos. There are a LOT of things that matter in life other than the reliability of a person’s explicit thinking, but I’d be simply shocked by any strong evidence that Jobs’ ‘thoughts and attitudes’ were consistently ‘above’ those of, well, in this case, someone who makes a living sharing his thoughts and attitudes via an intellectual blog.

BTW, this blog needs some way to create comments threads.


kennethos November 8, 2011 at 11:56 am

I’m not attempting to deify Steve Jobs; I have severe disagreements with how he did things. I view him as a fallen human, like us all. His mistakes were very publicly visible (unlike the rest of us). That said, it is hard to deny that his “thoughts and attitudes” drove the development of technology that makes it easy and simple to run such an intellectual blog. That is what I mean. Luke was using him as a negative example (i.e., don’t do what he did, he died when he needn’t have, as an example of irrational thinking). I found that to be in poor taste (IMHO). (But that’s getting into a mudfight, so…) I wanted to respect Jobs’ attitudes regarding his body (though I don’t necessarily agree with them), which are worthy of not dissing. Everything else, we’ll have to disagree on, I expect.


PDH November 8, 2011 at 1:25 pm

I don’t think it’s distasteful to criticise Jobs’ decision. He himself came to regard it as a mistake, if I remember accurately. And it’s a fairly important issue. The general public, especially in Europe and even in Britain where I live, is deeply confused about health issues like these and it does cost lives and increase suffering. And I think that people would be healthier and happier if they learned to be able to sort quackery from the real thing more reliably. It’s something that the average person would benefit from significantly and it relates to basic and important issues like what science is and how it works. It’s low hanging fruit.

People, understandably given the way that the media treats these issues, have no idea who to trust. They are constantly bombarded with false information and subjected to a general kind of epistemic erosion – by which I mean the insidious anti-epistemology of statements like, ‘it’s all just opinion, isn’t it?’ – that causes a lot of harm.

Things like this are a good opportunity to point this out. That a failure to grasp the importance of such issues can sometimes lead to tragic deaths can only be more reason to discuss them, not less.


Stephen R. Diamond November 10, 2011 at 4:25 pm

Then on what do you base this conclusion:

” Rationality is a real thing. That you can learn. With it, you can improve your life and avoid stupid mistakes, like dying of a curable disease even though you’re a billionaire because you wasted time on alternative medicine.”

It doesn’t seem you’ve been very rational in appraising the value of direct tuition in becoming more rational. Let’s look at how the “Less Wrong” types have conducted their lives after devoting themselves to rationality. Was it rational to have been beguiled by some psychopath, who embezzled some huge amount of their hard-contributed cash? Was it rational, when one poster (a regular) questioned Yudkowsky’s cushy job at their Machine Intelligence Research Institute, concertedly to vote the poster down? (Is it even rational for “rationalists” to denote their system of reward by a mystical term—“karma”?) Is it rational that no one at LW brings up the conflict of interest when Yudkowsky pretends authority about the fruitfulness of “singularity research”? Is it rational that when _you_ try to apply rationality to dating, the group gets embroiled in a discussion about whether people should speak their mind on “politically incorrect” subjects—without a glimmer that they are admitting group approval is so important to them that LW might as well be a cult?

However, no causal inference could be made from the way the test was done.


gwern November 10, 2011 at 4:31 pm

(Is it even rational for “rationalists” to denote their system of reward by a mystical term—”karma”?)

Seriously? Hey, here’s a rhetorical question for you too: is it rational to spend even a second going through the Reddit codebase stripping out every use of ‘karma’, replacing it with some neologism no one else understands as opposed to the standard Internet term dating back at least 14 years, for no benefit, and open us up to even more mockery?

A word of advice, throwing in every argument you can think of, even the stupidest ones, only works when people aren’t really paying attention and will be impressed by sheer volume.


Stephen R Diamond November 10, 2011 at 9:08 pm

One final (one can hope) rhetorical question: Is it intellectually honest to attack a position by attacking the admittedly weakest argument alone? (The one in parentheses.)

I suppose my criticism of “karma” is what you mean by my throwing in every argument, but here your rejoinder is really nothing more than a vehicle for avoiding argument by expressing a vapid condescension (your “advice”). When you are in an arena like LW, where dissenters are “voted down,” you forget what argument even means; you certainly forget how to engage in it. There is no real argument on LW. It is group-based self-deception with rationalist pretenses.

But “karma” is worth discussing: the point is small but telling, and I don’t think one truly committed to rationality would want anything to do with a concept like “karma.” Is it harmful? Indeed it is. You must think a little to see how. By using a *meaningless* term like karma, you avoid a term that points the user to the precise meaning intended. The term leaves open how the penalty or reward—termed so unnaturally as to be inherently unrelated to reality—is to be applied. So, all make their own choices on applying it, which quickly race to the lowest common denominator. You issue negative karma, that is to say, when you fundamentally disagree with the comment. It matters not in the least how incisive the point is. In fact, incisiveness is a disadvantage where a comment expresses fundamental disagreement. (To be fundamental disagreement for LW, it need not be deep. I’m not talking about some crazy theist. Strong criticism of singularity thinking, for example, will suffice.)

Has anyone even stopped to think rationally about *what* precisely you want readers to respond to with karma? Should you issue karma based on your own independent evaluation of the comment or “at the margin,” that is, based on whether you think the current rating is high, low, or about right? Giving your currency a precise name (rather than a mystical and mystifying one) would force you to think through its function. But I don’t think it’s really a matter of carelessness. It is *best* left vague, when the real function is disreputable–to bludgeon. And what better route to vagueness than supernaturalist vocabulary.

(One caveat: LW is the only place I’ve seen anything like “karma.” But, I’ll take your word about its widespread use.)



Polymeron November 11, 2011 at 11:45 am

@Luke,
What worries me is that there isn’t strong evidence, as far as I could tell, that supports the notion that increased rationality indeed makes people more successful and/or happy. It certainly makes intuitive sense, but as a rationalist, you should reject that criterion and focus on the data.
I recall reading a good LW post expressing alarm at this.

In that sense, BTW, I think the focus on the singularity may be a bad strategy, rationally speaking. If Yudkowsky et al. can become successful and well-known over the next few years by using rationality tricks, that could well boost later efforts so that the investment has been worth it. Focusing only on your final goal may be, ironically enough, pretty short-sighted.

@Stephen R Diamond,
“Karma” is indeed in wide usage. I believe it’s included in default SMF forums, as well as other places.

I haven’t witnessed any mass down-voting except for very incoherent or irrelevant posts on LW; but I’m not exactly a regular there, so it is possible that this happens and I missed it.


Stephen R. Diamond November 11, 2011 at 1:21 pm

I find there’s no rational justification for omitting an example:

“Johnicholas, 26 April 2011 03:11:26 PM, -7 points:

I’m not happy about the justifying the high payouts to EY as “that’s what a programmer might make”. Instead, put him (and any other MIRI full-time employees, possibly just Michael Vassar) on half pay (and half time), and suggest that he work in the “real world” (something not MIRI/futurism related) the rest of the time. This means that his presumed skills are tested and exercised with actual short-term tasks, and also gives an approximate market price for his skills.

Currently, his market-equivalence to a programmer is decoupled from reality.”

The poster received -8 karma on this before I recently voted him up.

(I’m not claiming -8 is “mass.” But it’s an unusually high number for a regular to receive, when truly idiotic posts average about +3.)

I haven’t witnessed any mass down-voting except for very incoherent or irrelevant posts on LW; but I’m not exactly a regular there, so it is possible that this happens and I missed it.


MarkD November 11, 2011 at 7:48 pm

I remain unclear as to the goals of MIRI vis a vis LW. We can be fairly certain that, given what we know now, that any path to human-grade behavioral plasticity and complexity must replicate at least aspects of the “less rational” components of cognition (that is, most of them). For instance, from what we know about semantic association, the connectivity and activation patterns are not optimal for any specific task and are influenced by a wide spectrum of genetic components (e.g., dopamine titres influence magical thinking quotients which reflect association patterns that might be argued are counterfactual; slightly lower levels result in poetic association). Yet they might be optimal for all possible adaptive landscapes (even scoping that to an evolutionary psych argument that those landscapes were hunter-gatherer ones).

Given what we know, then, and what we can anticipate to be likely, the hard naturalism is associated with discovering how this mystery of flawed and brilliant cognition arises and even allows us to try to think like good statisticians while being promiscuously creative at the same time.

Or, perhaps, I’m just much more interested in immanentizing the eschaton…or, sorry, the robot apocalypse, than in leveling up enough that I can defeat those evil robots with Yoshimi at my side ;-)


Stephen R. Diamond November 11, 2011 at 9:34 pm

I remain unclear as to the goals of MIRI vis a vis LW.

I think the idea is that if you’re really rational, you’d see that the Singularity is the super-important event that we should devote all our energies to preparing for. I don’t think the argument is that studying rationality has any special relevance to doing AI.

Yudkowsky’s main vehicle for shoehorning life into “rationality” is the concept of instrumental rationality. I’d go further than Polymeron: a person successfully striving to become more rational will be apt to be _less_ happy than the typically opportunistically irrational one. I take the personal stance Trivers sets out in his new book: there are probably advantages to practicing some degree of self-deception, but this needn’t deter anyone from striving for rationality as a personal ideal. (But I also doubt that direct tuition in rationality does much for one’s rational competence; Kahneman once was optimistic about studying rationality to become more rational, but I understand he’s tempered that view considerably—I haven’t yet read his latest book.)

Yudkowsky bypasses all this, and he urges becoming more *instrumentally* rational to make money, get chicks, and live the good life (which he personally seems to have succeeded in doing—ironically, by, indeed, practicing “rationality”—by leveraging his “rationalist” followers). The benefit of “instrumental rationality,” of course, is tautological, since it merely means the proclivity of one’s means to further one’s success. But usually when we speak of someone’s rationality, we mean “epistemic rationality,” which isn’t what Yudkowsky claims he’s fundamentally dedicated to. He’s more than willing to sacrifice epistemic rationality for “instrumental rationality.” He’s a slippery one!


Paul Rimmer November 12, 2011 at 5:41 am

Interesting how an advocate of rationality ends his post with such an irrational statement. Whether or not Jobs regrets using alternative medicine, there is no good reason to think that he would have survived significantly longer had he not used alternative medicine, or that alternative medicine contributed in any way to his death. The article you link entertains a hypothetical situation. This is indicated by phrases “most often reported and speculated” and “of the type Jobs is believed NOT to have had” and contains no direct medical information about Steve Jobs and no direct statement by Steve Jobs.

I suppose your last statement indicates that even very rational people are irrational sometimes. I suppose I am also not convinced that being irrational at times is such a bad thing.


cl November 12, 2011 at 10:31 am

Golly gee willakers y’all are so rational you can’t even get along! It’s like that NOFX album. Just switch “rational” for “punk.”


Stephen R Diamond November 12, 2011 at 12:09 pm

Whether or not Jobs regrets using alternative medicine, there is no good reason to think that he would have survived significantly longer had he not used alternative medicine, or that alternative medicine contributed in any way to his death.

So what? Luke’s point really isn’t about Steve Jobs. It’s about situations many of us know from observation, where patients delay medical treatment and, instead, seek out known forms of quackery. There’s certainly a lot of room for irrationality there. (This is one of my many gripes about the LW set [not to accuse Paul]: they often miss the point and take off on some pseudo-rational tangent.)

The important point isn’t about Jobs but about whether rationality training would help someone like Luke’s _hypothetical_ Jobs. I doubt it. (If rationality is supposed to be about “winning,” Jobs, overall, can hardly be described as a loser.) Intelligent and worldly people _like_ Jobs seek alternatives to scientific medical treatment because they’d rather avoid facing some rather depressing facts; even a successful cancer treatment is no joy ride.

The source of the obstacle to seeking scientific medical treatment is affective, not cognitive. Luke has indirectly addressed affective bias (self-deception). (See http://tinyurl.com/3plrxj3) He tries to anchor his remedy in the situationist trend in social psychology, but when you get to his specific recommendation, I think it is worse than useless–it’s affirmatively harmful. Luke urges us to surround ourselves with rational people. But this just compounds one source of truly mass irrationality today: everybody, more and more, is surrounding himself with others who agree. Groups composed only of extremely intelligent people have been shown to produce crazy outcomes. I have little doubt that the same holds true when you surround yourself with highly “rational” others–particularly since most of these rationalists, whatever their many flaws, are also very intelligent.

Luke once surrounded himself with religious irrationalists. It’s often hard to draw exactly the correct conclusion from one’s mistakes. He should be seeking diversity of outlooks, not a group of self-deluded pseudo-rationalists, a new self-limiting clique to belong to. I intend this “advice” as benevolent.


Polymeron November 13, 2011 at 1:55 am

@Stephen R Diamond

Groups composed only of extremely intelligent people have been shown to produce crazy outcomes.

I’d be very interested in as many examples as you can provide. This sounds interesting, and I myself can’t really think of a case I’ve heard where only very intelligent people were bunched together, excluding maybe the space programs.

(I agree with the gist of your post)


Stephen R Diamond November 13, 2011 at 7:26 pm

I’d be very interested in as many examples as you can provide.

I can’t find the study I had in mind. A few suggestive speculations and findings:

1. The very intelligent lack common sense. (See “Clever sillies: Why high IQ people tend to be deficient in common sense,” Medical Hypotheses, Volume 73, Issue 6, December 2009, Pages 867-870. There’s a summary in the archive of the website “Barking up the wrong tree.”)

When I mentioned this study in an LW comment, my comment drew the reply that such speculations expressed sour grapes. Not necessarily. Leaving aside valid reasons for agreeing, this position can also be used as a rationalization by the intelligent for their failures. There’s a study showing that high IQ is associated with a greater propensity for bankruptcy.

I realize you’re interested in something directly relating to _groups_ of the very intelligent. The most conspicuous examples are the high IQ societies, some of which are demanding. (Top 1 in 1,000 and higher.) They’ve produced quite a bit of comedy, but there’s a lot of confounding of causes in the mix.

I’ll let you know if I find the article I had in mind.


GradStudent November 13, 2011 at 11:44 pm

Luke has decided to stop trying to think clearly about philosophical issues. Whether Jobs’ decision was irrational or not is an interesting question. Answering that question will surely involve knowing how he arrived at his decision, and answering the question of whether or not the way in which he arrived at his decision was rational will involve all sorts of other interesting philosophical questions.

Suppose Jobs was justified in believing that a certain kind of medication or medicinal help would benefit him. If he was so justified, then surely his belief that this kind of help would benefit him was rational. So was Jobs justified in this belief? Luke implies that he wasn’t so justified. But Luke offers no good reason to believe that. This is typical of Luke’s recent posts. Luke is now in the business of asserting stuff rather than providing arguments for the stuff he asserts.

Worse, Luke has regressed to assimilating rational belief with the sort of beliefs Luke himself holds. It’s time for Luke to start thinking harder about what he means by “rational belief”. Thinking hard, however, is not something Luke seems up for anymore. It’s a shame this blog has devolved as far as it has.


Stephen R Diamond November 15, 2011 at 4:59 pm

Luke is now in the business of asserting stuff rather than providing arguments for the stuff he asserts.

Or (more likely), he thinks the opinions he asserts are beyond debate and that they’re best promoted by “rhetoric.” Rhetoric is legitimately only a technique to keep a reader’s attention, not—as Luke now thinks—properly an independent persuasive factor.

But rational deliberation and action also requires that deliberators distinguish between their opinions, based on independent analysis, and their beliefs, based on all evidence, including the evidence provided by others’ mere disagreement. (See series beginning at http://tinyurl.com/3lxp2eh) Maintaining the distinction between opinion and belief is hard because it’s opposed to the immediate translation of attitude into action.


srdiamond November 22, 2011 at 3:37 pm

Test

What do Luke Muehlhauser and Robin Hanson have in common–besides each, significantly, being the son of a preacher? They both engage in the silent practice of locking discussants out of their blog–although Luke’s defenses are permeable, as you can see. No supporter of rational discussion could lightly contemplate “disappearing” discussants behind the backs of other participants.


J. Quinton November 23, 2011 at 5:50 am

I can’t find the study I had in mind. A few suggestive speculations and findings:

1. The very intelligent lack common sense. (See “Clever sillies: Why high IQ people tend to be deficient in common sense,” Medical Hypotheses, Volume 73, Issue 6, December 2009, Pages 867-870. There’s a summary in the archive of the website “Barking up the wrong tree.”)

When I mentioned this study in an LW comment, my comment drew the reply that such speculations expressed sour grapes. Not necessarily. Leaving aside valid reasons for agreeing, this position can also be used as a rationalization by the intelligent for their failures. There’s a study showing that high IQ is associated with a greater propensity for bankruptcy.

I realize you’re interested in something directly relating to _groups_ of the very intelligent. The most conspicuous examples are the high IQ societies, some of which are demanding. (Top 1 in 1,000 and higher.) They’ve produced quite a bit of comedy, but there’s a lot of confounding of causes in the mix.

I’ll let you know if I find the article I had in mind.

Isn’t the entire point behind Luke’s original post here that high IQ doesn’t necessarily mean rational?


Stephen R. Diamond November 23, 2011 at 6:36 am

Isn’t the entire point behind Luke’s original post here that high IQ doesn’t necessarily mean rational?

Read the previous posts, which will fill you in.

