Some Morals from the Study of Human Irrationality

by Luke Muehlhauser on January 18, 2011 in How-To

Whenever somebody says (or shows) that their own intuitions play a central role in how they see the world, I start by assuming they are not very familiar with the last 40 years of psychological research.

If you want to catch up with this research, one of the quickest ways may be to read Stuart Sutherland’s Irrationality.

But instead of surveying the research, let me (for now) merely list a selection of the “morals” he gives at the end of his chapters. Sutherland advises:

  • Never base a judgment or decision on a single case, no matter how striking.
  • In forming an impression of a person (or object) try to break your judgment down into his (or its) separate qualities without letting any strikingly good or bad qualities influence your opinion about the remainder.
  • When exposed to connected material, suspend judgment until the end: try to give as much weight to the last item as the first.
  • Try to avoid obtaining information that would bias you: for example, in judging whether an article or a book should be published, remain ignorant of the author’s name until you have formed your opinion of the work.
  • Think before obeying; ask whether the command is justified.
  • Think carefully before you announce a decision publicly: you will find it harder to change.
  • Ask yourself whether you are doing something merely because others do and if so, ask whether it really furthers your own ends.
  • Don’t be impressed by advice from someone you admire unless he is an expert on the topic in question – and even if he is, remember that experts are often wrong.
  • Whether you are a member of a committee or a golf club, be careful not to be carried away by the prevailing views. Consider and express counterarguments.
  • No matter how much time, effort or money you have invested in a project, cut your losses if investing more will not be beneficial.
  • If you are persuaded to do something distasteful, try not to minimise its unpleasantness in order to justify yourself.
  • If you want someone to value a task and perform well, do not offer material rewards.
  • If you want to stop children… from doing something, try to persuade rather than threatening them with punishment.
  • Don’t take important decisions when under stress or strong emotion.
  • Remember that every time you subdue an impulse it becomes easier to do it again.
  • Search for evidence against your own beliefs.
  • Try to entertain hypotheses that are antagonistic to one another.
  • Don’t distort new evidence: consider carefully whether it could be interpreted as disconfirming your beliefs rather than supporting them.
  • Be wary of your memory: you are likely to recall whatever fits with your current views.
  • Remember that changing your mind in the light of new evidence is a sign of strength not weakness.
  • Be careful not to associate things together [just] because of your expectations or because they are unusual.
  • If you are a doctor or a patient, learn some elementary probability theory.
  • Consider whether an event could have causes other than the one you first think of.
  • Remember that in most circumstances it is as reasonable to reason from effect to cause as from cause to effect.
  • Don’t assume that others are like yourself.
  • Do not judge solely by appearances. If something looks more like an X than a Y, it may nevertheless be more likely to be a Y if there are many more Ys than Xs. [Bayes' Theorem.]
  • Remember that a statement containing two or more pieces of information is always equally likely or (most often) less likely to be true than one containing only one of the pieces.
  • Guard against believing a statement is true because you know that part of it is true.
  • Always work out the expected value of a gamble before accepting it.
  • Remember that whether you save [$10] on the cost of a house or on the cost of a radio, the saving is equally valuable to you.
  • If you are making a numerical estimate and have a given starting value, remember that the correct estimate is likely to be further away from the starting value than you may at first think.
  • Be wary of stockbrokers (or anyone else) who claim to predict the future.
  • To avoid disappointment, try to control your own overconfidence: think of evidence or arguments that are contrary to your beliefs.
  • Remember that insidious dangers may kill more people than dramatic dangers.
  • Remember that when anything extreme happens… the next happening of the same kind is likely to be much less extreme for purely statistical reasons: it reverts to the mean.
  • Suspect anyone who claims to have good intuition.
  • If you are in a profession, don’t hesitate to take decisions by using a mathematical model if it has been shown to be better than human judgment.
  • When the importance of a decision merits the expenditure of time, use utility theory or a watered-down version of it.
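The expected-value moral above lends itself to a quick numeric sketch. The following is a minimal illustration with a made-up gamble, not anything from Sutherland's book:

```python
# Expected value of a gamble: the probability-weighted average payoff.
def expected_value(outcomes):
    """outcomes: iterable of (probability, payoff) pairs; probabilities sum to 1."""
    return sum(p * payoff for p, payoff in outcomes)

# Hypothetical bet: a 1-in-6 chance to win $10, otherwise lose $3.
ev = expected_value([(1 / 6, 10), (5 / 6, -3)])
# ev = 10/6 - 15/6 = -5/6, i.e. the gamble loses about $0.83 on average.
```

A negative expected value is a reason to decline the bet, whatever a single lucky win might suggest.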

If you want to know his reasons for giving all this advice, read the book.

Meanwhile, it is probably worth checking this list against your judgments and decisions at the end of each day. That would be superb rationality training.


{ 38 comments… read them below or add one }

Scott January 18, 2011 at 5:59 am

Are you saying in your opening paragraph that intuitions don’t make up people’s worldviews? I thought that people used intuitions all the time, which is why they/we are so often wrong.


Tshepang Lekhonkhobe January 18, 2011 at 6:22 am

The only point I find controversial (of those I actually understand) is “If you want someone to value a task and perform well, do not offer material rewards.” and I’m not buying the book just to read the justification of that part. Anyone?


Dept. of Corrections January 18, 2011 at 6:32 am

I think you mean “base” instead of “bad” in the first of Sutherland’s morals.
Thanks for this post! Sutherland is going on my shopping list.


Steven J January 18, 2011 at 6:43 am

What a joke having Dawkins endorse it. Let’s be generous and presume he read it. Then it must have had the exact effect that everything else outside the narrow parameters of his fanatic vision has: nil. More probably he didn’t read it (does he/has he ever bothered to read anything outside the field of evolutionary biology?). So he is endorsing a critique of himself: that is, has there ever been a more blinkered, biased, one-dimensional, overweening charlatan presuming to spew forth opinion on subjects he knows nothing about? (Well, of course, given today’s zeitgeist, but then he is an exhibit class A.)


Bill Snedden January 18, 2011 at 7:20 am

The only point I find controversial (of those I actually understand) is “If you want someone to value a task and perform well, do not offer material rewards.” and I’m not buying the book just to read the justification of that part. Anyone?

I wondered about that one as well. It’s possible that by “value a task”, he means “value a task for itself.” If that’s the case, then a material reward certainly won’t ensure that end.


ildi January 18, 2011 at 8:15 am

“If you want someone to value a task and perform well, do not offer material rewards.”

If I remember correctly, the idea is that you resolve cognitive dissonance about working hard on a task with no material gain by convincing yourself you’re doing it because it’s intrinsically valuable to you. Therefore, you’re more likely to repeat the task and work hard at it (and enjoy it).


MauricXe January 18, 2011 at 8:22 am

If you want to stop children… from doing something, try to persuade rather than threatening them with punishment.

Can’t say I agree with that. Maybe a little bit of both is in order, but I certainly think punishment works and threatening to punish works.


Adito January 18, 2011 at 10:07 am

Remember that when anything extreme happens… the next happening of the same kind is likely to be much less extreme for purely statistical reasons: it reverts to the mean.  

This is misleading. The next happening is just as likely to be extreme as the first one is. Observing one unusual happening has absolutely no effect on the following happenings.


Rick M January 18, 2011 at 10:26 am

Remember that a statement containing two or more pieces of information is always less likely to be true than one containing only one of the pieces.

Guard against believing a statement is true because you know that part of it is true.

These two morals seem to be at least somewhat contradictory.


Aaron Brown January 18, 2011 at 10:34 am

Thanks for putting this summary together.

Typo:

s/before announce/before announcing/


Zak January 18, 2011 at 10:59 am

MauricXe,

When I was an undergrad (psychology), I remember learning multiple times that if you want to change someone’s behavior, the worst way to go about it is via punishment. Positive and negative reinforcement are much, much more effective methods.


Kevin January 18, 2011 at 11:08 am

“Remember that a statement containing two or more pieces of information is always less likely to be true than one containing only one of the pieces.”

Nitpicking here, it should always be equal to or less than, not always less than.
Let A = “2 + 2 = 4” and B = “2 + 3 = 5”. Then P(A & B) = P(A), not P(A & B) < P(A).


bossmanham January 18, 2011 at 11:18 am

Whenever somebody says (or shows) that they don’t think intuition is a good source of knowledge at all based on the last 40 years of psychological research, I start by assuming they are not very bright, since one has to rely on one’s intuition that the last 40 years of research is reliable and can lead to knowledge.


Garren January 18, 2011 at 11:26 am

Does the book define ‘rationality’ and ‘irrationality’?

For now I agree with the way Philippa Foot put it before she decided morality is an essential ingredient:

“Irrational actions are those in which a man in some way defeats his own purposes, doing what is calculated to be disadvantageous or to frustrate his ends.”

Would anyone here define it differently?


Jesus In A Tutu January 18, 2011 at 12:23 pm

Regarding bossmanham,

Luke’s claim is about intuition playing a “central role” in how they “see the world.”

1.) bossmanham claims that Luke said intuition is not a good source of knowledge “at all.” Luke never says this but instead is taking issue with people who say that intuition plays a “central role” in how they “see the world.” The statement does not deal with throwing away intuition but with not making intuition the central role in how one sees things.

2.) bossmanham claims that “one has to rely on their intuition that the last 40 years of research is reliable…”
One does not have to rely upon their intuition to see this. They can instead read and research the findings published.


Luke Muehlhauser January 18, 2011 at 12:38 pm

Aaron Brown,

Thanks.


Luke Muehlhauser January 18, 2011 at 12:40 pm

Kevin,

Yup, thanks.


Luke Muehlhauser January 18, 2011 at 12:41 pm

bossmanham,

You’re using ‘intuition’ in a different sense than I am. Your use of ‘intuition’ seems to include all human knowledge, which makes the word not very useful at all.


Mjthackray January 18, 2011 at 2:09 pm

“If you want someone to value a task and perform well, do not offer material rewards.”

I remember seeing a TED talk about this one. The idea is that a task is changed when you add the reward element. The task for the person then becomes getting a reward; the quality becomes inconsequential beyond that rule. If a task is performed in and of itself, it becomes a tool for expression, for measure, for creativity, for distinguishing oneself. All that jazz.

Apparently this isn’t true if the task is monotonous, though, e.g. reward money per piece of rubbish collected off the side of the road. I guess because it’s difficult to turn a task like that into something intrinsically valuable, unless you’re keen to show off your rubbish-collection skills, or you’re big on the environment.


Michael January 18, 2011 at 2:29 pm

Tshepang Lekhonkhobe:

The only point I find controversial (of those I actually understand) is “If you want someone to value a task and perform well, do not offer material rewards.” and I’m not buying the book just to read the justification of that part. Anyone?  

I have not read this book, but I HAVE read ’59 Seconds’ which makes precisely the same point, so I will answer your question.

Simply put, if you tell someone that you’ll give them a reward for performing a certain action, they subconsciously assume that that action must be something they don’t like doing. This is because if there is an action that they do like doing, then they don’t need any other rewards to make them do it. For an action they dislike, though, they will have to be compensated in some other way to make up for it.

If you set people an activity they enjoy and reward them for doing it, the reward reduces the enjoyment and demotivates them. You transform play into work.

NOT ONLY THIS, the same is true of activities people DON’T enjoy, ie transforming work into play!!
This is because of the following logic that people subconsciously process:
1) People usually pay me to do things I don’t enjoy/wouldn’t otherwise do (ie work).
2) I was paid a large amount.
3) Therefore I must dislike this activity.
Time and time again studies have replicated this outcome. Excessive rewards may have a detrimental effect on the attitude of the people doing the tasks. Sometimes you do get a short-run boost in performance, but in the long run these large rewards destroy the very behaviour they are designed to encourage.

So, what can you use instead to incentivise people to do a certain activity?

Well, present them with the occasional small SURPRISE reward after they complete the activity, or praise them for their labour, ie boosting their self-esteem (on a side note, research has suggested that compliments and self-esteem boosts are valued more highly in the average young adult than money, sex or food in equal quantities).

When it’s an activity people don’t enjoy, then you should give a realistic, but not excessive, reward at the start, followed by feel-good comments and occasional pleasant surprises.

If you want to know the references to the scientific data etc you’ll have to buy the book, ’59 Seconds’ by Richard Wiseman (suitable surname).
I put it on my amazon wishlist upon Luke’s recommendation, have now read it and would THOROUGHLY recommend it to anyone too.
Hey, even my Mum just finished reading it! (She bought it for me this Christmas actually!)


David January 18, 2011 at 2:54 pm

I am quite amused of the number of responses of the form “X is wrong, because my intuition says differently.” To these people, I say: OF COURSE YOU THINK THAT. It’s on the list because you think that. Your intuition or hunch or interpretation of your recollection of your experience *is not new evidence*. Anything on this list may be wrong, but your thoughts about it are irrelevant without studies to back them up.


MauricXe January 18, 2011 at 3:12 pm

Hey Zak,

I am sure that is true. However, my personal experience tells me that sometimes kids need a swift “smack on the bottom”. Of course, I think talking to your child is great and all, but since I prefer not being told “because mommy said so”, it doesn’t have to be the only method.


JS Allen January 18, 2011 at 4:59 pm

Considering that much “rationality research” used by these popular self-help authors is based on the same sorts of p-values that “proved” psi, I would be careful.

These self-help lists remind me of the superstitious eaters who are eating from a big list of “healthy foods” based on lists of “scientific studies”. It’s the sort of superstitious thinking that results from reductionist thinking — you take a random witch’s brew of a hundred little things, all of which have been tenuously argued to be associated with a positive effect, and then throw them all together with the hope that there will be a net positive effect.


David January 18, 2011 at 5:18 pm

It’s the sort of superstitious thinking that results from reductionist thinking — you take a random witch’s brew of a hundred little things, all of which have been tenuously argued to be associated with a positive effect, and then throw them all together with the hope that there will be a net positive effect.

I don’t see that as “superstitious” at all. Weaker than a study considering multiple variables, sure, but I think that a “witches brew” cobbled together of things that have a positive effect is statistically more likely to have a net positive effect than a similar “witches brew” cobbled together of things of unknown (or known negative) effect. That said, care should certainly be taken with large changes (in diet or life or anything).


Jeff H January 18, 2011 at 7:32 pm

The point about not offering material rewards to get someone to value something is talking about intrinsic vs. extrinsic motivation. Research has shown, for example, that if you take a child who enjoys drawing pictures, and then you pay him/her money to draw pictures, the drawing of pictures now becomes focused on earning money rather than the intrinsic enjoyment of drawing pictures. However, this isn’t true in all cases and there have been various refinements to the theory. It’s only true under certain conditions.

Adito,
“This is misleading. The next happening is just as likely to be extreme as the first one is. Observing one unusual happening has absolutely no effect on the following happenings.”

If the situation follows a normal distribution, then no: regression to the mean is likely to occur. It’s precisely because the two situations are independent that one extreme situation (which is an improbable event) is not likely to be followed by another extreme situation. The second situation is more likely to regress closer to the mean, because less extreme situations are more probable. If the first situation influenced the second one, then this might not be the case. But if you roll a die ten times and get ten 6s, you’re not likely to repeat such an event the next time you roll it ten times. You’re more likely to roll about one or two 6s.
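The die example can be checked with a short simulation; this is just an illustrative sketch of the statistics, not anything from the book or the thread:

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

def count_sixes(rolls=10):
    """Number of 6s in a run of fair die rolls."""
    return sum(1 for _ in range(rolls) if random.randint(1, 6) == 6)

# Simulate many 10-roll runs. The average count sits near the mean of
# 10/6 ≈ 1.67, and a run of ten 6s (probability 6**-10 per run) essentially
# never occurs, so an extreme run is almost always followed by a typical one.
trials = [count_sixes() for _ in range(100_000)]
avg = sum(trials) / len(trials)
```

No memory of the previous run is needed: extreme counts are simply rare, so the run after an extreme one is overwhelmingly likely to be ordinary.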


bossmanham January 18, 2011 at 8:16 pm

In riposte to Jesus In A Tutu,

1) Fair enough. But the point I am making is that we rely on our intuition in thinking that induction and scientific study are good sources of knowledge. There’s no scientific test you can do to see if scientific tests are a reliable means of knowing anything.

2) They can instead read and research the findings published

To which you must then appeal to your intuition to tell you that researching the published findings is leading you to some firm conclusion.

To Luke,

You’re using ‘intuition’ in a different sense than I am. Your use of ‘intuition’ seems to include all human knowledge, which makes the word not very useful at all.

No, my definition of intuition is a sense of something that is not known by deduction or induction. It is a base knowledge that you accept before you can perform other epistemological tasks. You must, by intuition, assume that scientific tests are a good source of knowledge, along with at least half a dozen others, including the intuition that the mind independent external world exists, that you can actually know things about this external world, the laws of logic, that truth exists, that our senses and cognitive faculties are reliably relaying information to our minds, that language is adequate at describing and explaining the external world, the existence of the values used in science, the uniformity of nature, etc etc etc…

All of these things we know by intuition. All of these things have to be true for your Science® to even get off the ground.


ildi January 18, 2011 at 8:39 pm

However, my personal experience tells me that sometimes kids need a swift “smack on the bottom”

Punishment does work, but it has to immediately follow the unwanted behavior, and you need to replace it with a wanted behavior.


Soumendra January 18, 2011 at 9:40 pm

@Adito
You are right. But what the advice in the list means is that a bunch of extreme events are very unlikely to happen in quick succession. “Regression to the mean” is just a fancy name for it.

If we already know that X and Y are extremely unlikely to happen, look at the space of all pairs of consecutive events. What are the chances of occurrence of (X, Y) or (Y, X)?

The question being asked is not “what are the chances of an extreme event given one has already happened?” It is “what are the chances of two extreme events happening consecutively?”

@ Jeff
Your conclusions are more or less right, but your reasons are more or less wrong. Individual statements are correct here or there, but what you seem to be saying collectively is mostly wrong. It seems you don’t understand “independence.” In fact, you don’t even seem to understand what “regression to the mean” is.

Adito,
“This is misleading. The next happening is just as likely to be extreme as the first one is. Observing one unusual happening has absolutely no effect on the following happenings.”

I see no reason for you to disagree with this statement if you are assuming independence, because it is correct (Adito implicitly assumes independence).

Jeff
“It’s precisely because the two situations are independent that one extreme situation (which is an improbable event) is not likely to be followed by another extreme situation.”

Independence has nothing to do with your conclusion! The second extreme situation is unlikely because it is unlikely by definition (low probability of occurrence). You are contradicting yourself by saying that the occurrence of one extreme event “decreases” the likelihood (you said “not likely”, intending to mean “makes it less likely”, which is the same as “decreases” the likelihood) of the next event being extreme “because” the two events are “independent.” Are you listening to yourself?

Even if every event is dependent on the previous event, depending on the underlying distribution, the fact that the previous event was an extreme one may increase OR decrease the chances of the next event being extreme. I see no justification for assuming either direction without knowing the distribution, except that you are the product of the mass culture of the normal distribution.

Seriously, I see no reason for dragging the normal distribution into this discussion. What was the purpose of it? What explanation, insight, or justification does the normal distribution offer here? In this theoretical context, it doesn’t even have any empirical evidence or justification to offer!

In fact, in spite of your pretence of probability-backed rationality, you have just demonstrated that you don’t understand the very first, and in my opinion the most crucial, piece of advice in the list given above! You violate several of the other items on the list as well, but I’ll leave the task of figuring them out to you.


Leon January 18, 2011 at 11:54 pm

Whenever somebody says (or shows) that their own intuitions play a central role in how they see the world, I start by assuming they are not very familiar with the last 40 years of psychological research.

Whenever somebody says (or shows) that how they see the world is well-thought-out and thoroughly rational, and that their own intuitions play only a peripheral role in how they see the world, I start by assuming they are not very familiar with the last 40 years of psychological research.


Vlastimil Vohánka January 19, 2011 at 12:44 am

Raymond Nickerson’s book Cognition and Chance: The Psychology of Probabilistic Reasoning is a great survey of the empirical literature. Also great on probability, statistics, and their history in general. The book explicates how the evidence for quite popular claims about human probabilistic and statistical irrationality is often far from being conclusive, though sufficient in some specific cases. Similarly B. Koslowski’s book Theory and Evidence.


MauricXe January 19, 2011 at 8:10 am

Agreed @ildi


David January 19, 2011 at 8:35 am

@ildi – right, which… makes it “negative reinforcement”, no?


Tshepang Lekhonkhobe January 20, 2011 at 3:10 am

Thanks for the thorough response. Makes sense… sort of.


Michael January 20, 2011 at 9:17 am

Thanks for the thorough response. Makes sense… sort of.  

Lol no worries.
You should seriously consider getting ’59 Seconds’ though, it’s really short and direct!


CharlesP January 20, 2011 at 12:55 pm

Has anybody read this AND Dan Ariely’s Predictably Irrational to know how much overlap there is? A lot of this seems to have been covered in Dan’s book (he’s done a TED talk or two on the subject as well… if I recall correctly). “How we decide” by Jonah Lehrer is another similar one (well, at least similar to Predictably Irrational anyway), and was also good. 59 Seconds is on my desk right now… at some point (soon?) I will be reaching critical mass on the “pop neuroscience books one can read and gain anything new” scale.


ildi January 20, 2011 at 8:41 pm

David: no, negative reinforcement reinforces existing behavior to avoid a stimulus. Positive and negative reinforcement are ways to increase an existing behavior; with punishment, you want to stop an existing behavior.


David January 20, 2011 at 9:14 pm

Ah, right. My bad.


laura Cook September 16, 2011 at 11:23 am

Don’t be impressed by advice from someone you admire unless he is an expert on the topic in question – and even if he is, remember that experts are often wrong.

Then who the heck are you meant to take advice from? This implies you don’t take advice from friends either.

