Today I interview economist Robin Hanson. Among other things, we discuss:
- How to overcome bias
- Prediction markets
Download CPBD episode 067 with Robin Hanson. Total time is 41:19.
- Robin Hanson at George Mason University
- Robin Hanson on Wikipedia
- Hanson’s blog, Overcoming Bias
- Hanson’s interview with The Marketplace of Ideas
Links for things we discussed:
- Cognitive biases: a visual guide
- Prediction markets
- Less Wrong
- Example prediction markets: Betfair, Intrade, Iowa Electronic Markets, TradeSports
- Prediction markets using virtual money: simExchange, NewsFutures, Hollywood Stock Exchange
- Epistemology of disagreement
- Evolutionary psychology
LUKE: Robin Hanson is an associate professor of economics at George Mason University, and also a research associate at the Future of Humanity Institute at Oxford University. Robin, welcome to the show.
ROBIN: Thank you. Thanks for having me.
LUKE: Robin, you also run a popular blog called “Overcoming Bias”, and when I invited you on the show, I said I wanted to talk to you about how we can overcome our biases. You said that you might not have much to say on that topic because you’re not sure that we can overcome our biases very well. What did you mean by that?
ROBIN: Well, I think what people like to have is a set of arrows in their quiver that let them sort of beat other people over the head with how much better they are. I think often when people are looking to overcome bias, they’re really looking for ways to tell themselves, “I’m less biased than those people, and therefore I’m right and they’re wrong.”
So when people look for things like that, they look for little rules of thumb, or a list of fallacies, or something like that, that maybe they can invoke in some argument. Say, “Aha, you’re falling prey to bias number 37, or fallacy number 22. Therefore I am right, and you are wrong.”
So people have a fair bit of motivation to do that. They like the idea of collecting these ways, in which other people could be wrong. Maybe noticing them in themselves sometimes, but only enough to give themselves the credentials for saying, “Aha, see I’m trying to find these things, and you’re not. Therefore, you’re wrong.”
Collecting examples of particular biases, and particular things that people do that are off, often goes down that path. It seems fun, it seems interesting. And there are times when you notice a bias, you use it, and you feel proud of yourself. And then you can feel like you’re right and the other people are wrong.
The problem is we are just built as these mental machines that are quite prone to want to be right and have other people be wrong. We’re not really built to be objective or neutral or non-partisan, in a way that would let us just try to overcome biases and get it right.
I mean, if it were just a matter of rare mistakes, where every once in a while we goofed up and usually we got things right, then it might make more sense to collect a little list of things that can go wrong and watch out for them. Like if you were a bank teller and there were just a few kinds of mistakes that you could make, you might want to have a little checklist of things to watch out for.
LUKE: If you were a bank teller who was trying to give money away, that wouldn’t help you very much, right?
ROBIN: If you were a corrupt bank teller who was just looking for any excuse to give money to a certain sort of people, then a little checklist for yourself of what it is that would be an awkward, bad way of doing things wouldn’t be that useful. Maybe your boss would want a checklist of things to watch out for in you that would be telltale signs. But you yourself might not be very trustworthy.
The essential problem is that this bias problem is just a lot worse than you might initially think.
LUKE: Well, don’t you think there’s any hope though of maybe developing kind of like an exercise routine, where I tell myself, “OK, each day when I wake up in the morning, I’m going to check my biases against this, this and that.” That kind of thing where we decide to continuously check ourselves, and not so much other people, against these lists, or whatever else we can put together that might help?
ROBIN: There might be something in that direction that would be sufficiently strong that it would work. The question is how many people are willing to go that far. Honestly, one of the very simplest things you could do that would actually reduce bias, and I’ve known this for a long time, is to simply collect a track record of what you say, when, and what you think later, and compare the two. And this is almost never done.
So even in companies where they have arguments all the time around the conference table about what they should do: whether sales are going to go up, or whether they should move, or hire this guy. Such organizations almost never systematically go back and say, “Well, who said what, when? Who was right, who was wrong? Tally it up.”
LUKE: This would help us because it would demonstrate to us how often we’re wrong, or how often we change our mind? And our memories are doing a poor job of that, but keeping exact records would do a better job?
ROBIN: Well first of all, it would just tell us who is right more often than who, because we’re really interested in that.
ROBIN: Once we anticipate that the records will be kept, it makes us much more cautious about what we say. We become concerned that this is going to look bad on our record.
ROBIN: And then it allows us to look for biases in our judgments. Are we consistently overly optimistic, consistently pessimistic, consistently self-centered, etc.? Those sorts of things become easier to see if you have a track record. It allows you to be better calibrated, especially about whether you have a bias toward bouncing back and forth, or toward following too much of a trend, or things like that.
So once you have a track record, it becomes easy to look for lots of these sorts of things. And there are a variety of ways to make track records. Some organizations actually get in the habit of writing down people’s forecasts: weather forecasters, for example. Some kinds of business forecasters actually do consistently write down their forecasts, and then have people check them later.
And that creates incentives and institutions that get people to pay attention enough to reduce their biases. But that’s really a lot. The vast majority of workplaces and organizations out there just don’t do that, and it takes quite a lot to get them to switch over.
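The track-record habit Hanson describes can be sketched in a few lines. This is a minimal illustration, not anything from the interview itself: the claims, probabilities, and outcomes below are invented, and the scoring rules (Brier score, average-forecast-minus-hit-rate) are one standard way to check calibration.

```python
# A hypothetical personal forecast log: (claim, stated probability,
# outcome where 1 = it happened). All entries are invented examples.
records = [
    ("Sales will rise this quarter", 0.9, 0),
    ("The new hire will stay a year", 0.8, 1),
    ("The project ships on time",     0.7, 0),
    ("I will enjoy the conference",   0.9, 1),
]

def brier_score(records):
    """Mean squared error of forecasts: 0 is perfect, 0.25 matches coin-flipping."""
    return sum((p - outcome) ** 2 for _, p, outcome in records) / len(records)

def overconfidence(records):
    """Average stated probability minus actual hit rate (positive = overconfident)."""
    avg_forecast = sum(p for _, p, _ in records) / len(records)
    hit_rate = sum(outcome for _, _, outcome in records) / len(records)
    return avg_forecast - hit_rate

print(f"Brier score: {brier_score(records):.4f}")      # 0.3375 here: worse than chance
print(f"Overconfidence: {overconfidence(records):+.3f}")  # +0.325: claims 82.5%, hits 50%
```

Even a log this small makes the patterns Robin mentions (consistent optimism, overconfidence) visible in a way memory alone never does.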
LUKE: Now in case some of the people listening to this interview are the type of person who would want to track their own biases and correct for them, how would that play out, not so much in a forecasting arena or a business arena, or something like that, but just throughout a normal person’s life? How would you pull that off?
ROBIN: Actually most of the things that people think about through their normal lives can be cast in the forecasting frame. Now if you talk about sort of abstract academics or philosophy, then it gets harder. Even there you can track sort of how opinions change with time, and at least track the opinion changes.
LUKE: So how can, say, non-philosophical thoughts end up being cast as predictions that we can then check later?
ROBIN: Well, for example, if you’re in a store looking at a shirt or a dress, and you’re deciding whether you want to buy it. Well, implicitly you’re making forecasts: how often will I wear this thing? How many people will compliment me on it? Will it get me that job, or whatever?
Most of our concrete actions have some sort of concrete consequences later.
LUKE: Sure. And that’s why we’re doing them.
ROBIN: Right. So it’s a fair bit of work to get people to be explicit about those forecasts and then check them later.
LUKE: But if somebody was really dedicated, they could choose to try to make their predictions explicit in their mind, and then write them down, and then check the list from that particular day a couple of months later, or something. Is that what you have in mind?
ROBIN: Right. Right. Or just get into the habit of consistently scoring these things. Even as simple as, how happy will I be today? And then at the end of the day, how happy was I? At the beginning of the day, I thought I would wear the red shirt. I thought the red shirt would help. And then you look later, and that’s your judgment.
Some days maybe you should wear a random shirt, and some days the red shirt, and see whether it makes any difference whether you picked the shirt or wore a random one. Things like that. But I mean, the main point here is that this is a lot of work, and it’s probably more work than most people want to go through.
So that makes us kind of stuck. So I think there’s a sense in which, if we want to try to overcome our biases, we have to focus attention somewhere; we have to choose our battles. We can’t immediately overcome our biases in all areas of our life. But we can pick a few important things and try to make sure we get it right there.
LUKE: Yeah. Well, it would seem to me that if you’re investing time in a project like that, then you’re operating at a fairly high level in, say, a hierarchy of needs, Maslow’s hierarchy of needs, or something like that. For most people, there’s no way they are going to have time to invest in developing this part of themselves.
ROBIN: Well, everybody has some free time. I don’t think it is so much a matter of how high the need is, it’s just focusing on a few more important things. So, even if you’re just worried about who you want to date, and who you should ask out, you can still bother to write down who your candidates were, and what your guesses were about each one, and if they would say yes, etc., something like that.
LUKE: Well, I do imagine that if we did this, we would be quite humbled by the actual success rate of our predictions and all of that.
ROBIN: Yeah. Well, of course, that’s part of the point: we would in fact overcome some of our biases. Now, there’s a separate issue, which is that many of our biases are there for a reason, and we would be thwarting that reason. And we have to come to terms with whether we want to thwart the reason our biases are there.
Overconfidence, for example. So people tend to like confident people, say in dating where we like confident men. So if a man actually collected a track record of who he was going to ask out and what the chances were, looked at that record and got more calibrated about what his chances were, that might reduce his confidence.
LUKE: And lower his dating prospects.
ROBIN: For example, right. It’s one of many examples. Similarly, sales people are often overconfident, and that works for them in the sense that their overconfidence helps them sell more. So another thing to realize is our biases are there for many reasons. They might not be socially laudable reasons, but they’re often these personally beneficial reasons.
So again, we come back to, I think, the issue of choosing your battles. Rather than deciding, “I’m going to be honest about everything,” or “I’m going to overcome my biases in everything,” ask: What’s important? Where would this be worth the bother? Where would I rather know the truth than get some benefit from overconfidence or self-flattery, or things like that?

And so one way of thinking about this is maybe the bigger questions: politics, society, business. Some bigger questions that we all have to continue to think about. That’s where this concept of prediction markets fits in: basically betting markets on important questions, or institutions that collect track records on what people say about important questions.
The idea there is that this creates better incentives for people to pay attention to them, and also the ability to self-select, so people can speak up about the topics they know the most about.
LUKE: Yeah. So how does this work? It’s a prediction market around ideas and money’s involved?
ROBIN: Money doesn’t have to be involved, although it could be. But something as simple as betting markets on who is going to be the next politician in office, or who is going to win that election.
You could also have markets on, say, whether certain bills will pass Congress. You can also have more interesting things: what the consequences will be if a bill passes Congress, or what the consequences will be if a candidate is elected.
You could say, if a Democrat wins the next presidential election cycle in the U.S., what will the unemployment rate be three years later, or oil prices, or troops abroad, or things like that.
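The conditional questions Robin mentions are typically implemented as "called-off" bets: a contract that only pays out, or only costs you, if the condition occurs. The joint price divided by the condition's price then reveals the market's conditional probability. A toy numerical illustration, with both prices invented:

```python
# Hypothetical market prices (both numbers invented for illustration).
prices = {
    "dem_wins": 0.50,                    # price of "a Democrat wins"
    "dem_wins_and_unemp_above_8": 0.30,  # price of the joint contract
}

# By the definition of conditional probability, if prices track
# probabilities, then P(unemployment > 8% | Dem wins) = P(joint) / P(condition).
p_cond = prices["dem_wins_and_unemp_above_8"] / prices["dem_wins"]
print(f"Market-implied P(unemployment > 8% | Dem wins) = {p_cond:.2f}")  # 0.60
```

The same ratio trick works for any policy question: price a "bill passes AND outcome" contract against a "bill passes" contract, and the quotient is the market's forecast of the bill's consequences.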
LUKE: Now my immediate reaction to this idea when I first stumbled upon your work on it was, well, this is a great way to keep politicians and pundits accountable. And basically say hey, if you really think you know what you’re talking about, put your money where your mouth is, or put something else of value where your mouth is. And we’ll see if you know what you’re talking about.
Because too often pundits and politicians are not held to account for the fact that they have no idea what they’re talking about, and they’re constantly wrong.
ROBIN: More often, they do have an idea. They’re just more willing to blow smoke, and people are willing to breathe it. Take, for example, the last presidential cycle. The wars in Iraq and Afghanistan were a big deal, and many people thought that by voting for Obama, they were voting for a reduction of the war. It didn’t turn out to be true.
Now, was that just an accident, and Obama really intended to, and things just got in his way? Or did he never really intend to, and people were just willing to say that because people were willing to hear it?
Now, clearly you could have had a betting market on, if Obama was elected, how many troops would be overseas at war three years later. Again, going with the idea of choosing your battles, I don’t want to initially presume a world interested in this, or that even politicians are interested.
I don’t think most of the world really wants to overcome their biases that much. If you think there’s a small fraction of people who do, and you’re part of it, then you want to think about how you could work together with the other people in this group.
And so these sorts of markets could be a way for you to talk to each other: you could bet on the markets, and then you could follow the markets and see the results. You don’t necessarily presume that everybody else cares. And that’s a problem, because now you need some legal permission, some legal infrastructure, to support these things.
At the moment, they’re illegal. That’s likely to continue for a while. That’s in part evidence that not very many people really do want to overcome these biases.
LUKE: Yeah. There’s been a lot of opposition to this idea. Obviously in the first case, that in many contexts, it’s illegal, but also there have been some attempts to implement this in certain ways, and to my knowledge, most of them have been struck down in one way or another. Is that right?
ROBIN: Well, there are ways this is legal within corporations. So now there is a smattering of corporations out there that are experimenting with this internally. It’s not an overwhelming trend or anything, so that suggests a large level of disinterest. But still, it’s legal inside companies, and a number of companies are doing it.
So that’s a nice, interesting trend, and hopefully that will build momentum and a track record that can be used to expand into other areas. But outside of small organizations like firms, it is mostly illegal. For example, in the last finance bill that just came before the U.S. Congress, there were explicit terms added at the insistence of movie companies to prohibit betting markets on movies.

Markets on how popular movies will be had just been approved by the CFTC, after going through the usual regulations and regulatory process. And the movie industry disliked that so much that they lobbied Congress to put language into the financial bill to prohibit it.
LUKE: Why would they be against that?
ROBIN: Well, as you may have noticed, the movie industry puts a lot of effort into hyping new movies. They feel they have this intricate, detailed process that they are in control of. Celebrities mention a movie on their talk shows, and TV ads and previews create hype all over.
They have a whole hype machine working out there to figure out how to hype their movies. And a betting market would really… If people start listening, it would really do an end run around that. So anyway, again we come back to how many people really want to overcome these biases?
And of course, that should make you wonder how special you really are, whether you’re really different.
LUKE: Is there something like these betting markets within the communities of people who do want to work on their biases, that is implemented now, but in some legal way?
ROBIN: There is the key question of: Are there any communities now who want to work on their biases? So part of the problem here is that it’s a standard story that almost all groups tell about themselves, that they are special in this regard. That they are more honest with themselves, or more reality-based than other groups.
This is standard across the political spectrum, and standard across religions. A lot of religions talk about how they are the truth. Almost all groups talk as if they think they really want to know what’s true and talk about what’s true, and those other people are not very honest and care more about politics, or themselves, or their own self-flattery, etc.
So everybody wants the mantle of low-bias, of having overcome bias, of being more reality-based. And so it’s hard for any group to actually take on that mantle without lots of other people challenging it. Or become slightly successful without other people trying to grab their authority.
For example, academic science, at least for a time in some ways took on itself the mantle of being more neutral and more authoritative, having overcome more biases. So, that produces a strong incentive for lots of people to try to grab science for themselves.
Governments want to fund science, so they enlist advisory panels that tell people to do what they say. People want to teach science in the classroom, and teach the science that says that we’re right, etc. So whenever you have something that’s perceived to be more accurate or authoritative, it becomes a focus of lobbying for people to sort of grab that and use it as an authority.
Same way for newspapers, of course. People believe what they read in the newspapers, and other people lobby to influence the newspapers to say what they want them to say. So people who claim to want to better understand what’s true have to develop some sort of internal institutions, some norms that put some teeth into that. Reporters try to have norms for good reporting. Scientists try to have norms of good scientific procedure, etc. And then the question is: that may help somewhat, but how far can it go, and how easily can it be corrupted?
LUKE: Well, when I think of communities that are working on their biases and trying to overcome their biases, I think of communities focused around certain websites, such as your own, “Overcoming Bias.” And for example, Eliezer Yudkowsky’s “Less Wrong” and places like that.
But certainly they fall under your description of being places where people will often say, “We’re trying really hard to overcome our biases and get at the truth. And other people generally aren’t.”
ROBIN: “Therefore, we’re right. They’re wrong.” I mean, I have a website and various kinds of bias topics come up often. And my website can be a resource for people who really wanted to know how to overcome biases, but that doesn’t mean I believe that most readers are actually trying very hard to do that.
ROBIN: So I think a community that together tries to overcome its biases has to have something stronger tying it together than reading the same web pages, or something. So for example, having a betting market that ties it together is a stronger institution, or having a tradition of collecting a track record and comparing it to reality.
LUKE: Where do we find these betting markets?
ROBIN: There are some public ones at, say, intrade.com. There are also ways you can create these markets inside your organization, on topics that your organization cares about: sales, project completion, things like that. Basically, this is a very minor activity compared to everything happening in the world.
You’ll have to choose your battles. You have to decide what’s important to you.
LUKE: That’s pretty interesting. Just now it’s occurring to me that one way to implement this, in a way that would both check one’s own biases and maybe encourage that practice among others, would be… A lot of people have blogs now and one could publish predictions on one’s own blog and then check them later.
Now, technically, one could go back and edit the posts, but you’d have to be very aware that you’re doing that, and it certainly would defeat the purpose.
ROBIN: Right. Once you started a habit of that, there might be people who would archive you.
LUKE: Exactly. Yeah.
ROBIN: But it does take some effort. And honestly, most people aren’t willing to bother with the effort. On most web pages that get a substantial number of readers, the authors are paying attention to what the readers are interested in. Why would readers want to read a blog whose author demonstrates how inaccurate he is?
LUKE: Yeah. [laughs]
ROBIN: What’s the point? What do they get out of that?
LUKE: They want to read somebody who is always right.
ROBIN: For example, yes. Or at least acknowledged to be wrong. So again, I do think one of the first things you have to ask is how sure you are that you actually are one of these exceptions that really wants to overcome biases. Because again, it’s the sort of thing that most everybody likes to think about themselves.
They like to think they are more honest, realistic, reality-based than other people, certainly at least than the people they disagree with. And yet we see, overall, relatively low interest in overcoming biases. We have to ask, “What makes you so sure you’re an exception?”
I think this is an especially striking contrast in the topic of disagreement. The fact that people disagree with each other so much seems pretty clear evidence that there’s a lot of biases and that people are not that eager to overcome them.
LUKE: Is the thinking there that if we were all just rational, objective, truth-seeking machines, then there would be a lot more agreement about issues?
ROBIN: Well, if we just try to understand what’s true, even if we have very limited information and capacity to analyze, etc., and we thought other people were as well, then we just wouldn’t disagree much. That is we wouldn’t knowingly disagree. We wouldn’t be able to anticipate how people’s opinions differ from our own.
If you were going to tell them your opinion and then try to guess what they would think after you told them your opinion, then theory suggests that your best guess about what their new opinion would be after you told them your opinion would just have to be your opinion.
You wouldn’t be able to anticipate systematic differences, which way they would differ. But of course many people can anticipate differences of opinion all of the time, quite consistently. That suggests that people don’t treat other people as being equally truth-seeking as themselves.
People tend to think that they are better than others. That other people are biased and dishonest, and they are less so.
LUKE: This is getting into an area called the epistemology of disagreement. What’s that about?
ROBIN: Well, epistemology is largely about what beliefs are reasonable to have. The epistemology of disagreement is that issue in the context of a disagreement: i.e., when somebody disagrees with you, what is a reasonable belief to have? So some say that the fact that other people disagree with you says very little about what reasonable beliefs are; essentially, disagreement has very little bite in epistemology. Other people are at the other extreme, saying that you’re pretty much never justified in disagreeing with people.
LUKE: Now what would it mean to say that you’re never justified, or almost never justified, in disagreeing with someone?
ROBIN: Well, usually it’s posed in terms of: imagine there was somebody who was a peer of yours, a similar mind, perhaps similar experience. Now you find that they have a different opinion from you on some subject. The strong point of view on this topic is that once you’ve identified a peer and a disagreement, then you should eliminate the disagreement by moving to some middle position, or both of you should.
In the philosophy literature, that’s sometimes called the “equal weight view.” And there is a longstanding philosophy literature on this rationality of disagreement, going back probably 15 to 20 years. My position is that if there are, say, two people, or a set of people, who think of each other as similarly truth-seeking: they don’t have to have similar amounts of information; they don’t have to have similar ability to analyze; they can make mistakes, they can be ignorant; they can have a variety of problems. But if they are similarly truth-seeking, in the sense that they take their information and their analysis as best they can, including what other people say and think, and they combine that together as best they can, then they just will not knowingly disagree.
LUKE: Now why on earth would that follow? I mean why would you think that if you consider another person who you think is similarly truth seeking, you shouldn’t knowingly disagree?
ROBIN: Well first of all, the opinion is based on a set of formal analyses done using formal models of information and belief. I can try to explain an intuition, which would make it sensible.
ROBIN: But the basis for the belief is the fact that there is this formal analysis that produces this theory. The key intuition is that when people are truth seeking, and even if they have limited information and analysis, other people’s opinions weigh heavily on them.
So if I think you know a lot about the subject, I may not know exactly why you’ve formed the opinion you have, or how. I, of course, have to assume that you might have made some mistakes, but I still take it very seriously. I weigh it heavily. If I think that you hardly know anything about the subject, I may weigh your opinion very little.
But on the other hand, you may weigh yourself very little too. So you may take my opinion very seriously. But as long as we’re both trying to guess what’s true, based on what we know, somebody’s going to be taken seriously. That will limit the degree to which our opinions knowingly differ.
So if I’m very ignorant, for example, and I’m about to tell you my ignorant opinion, and you’re a very wise, knowledgeable person who’s going to combine all of your extra information into making an estimate: I might not be very good at estimating your future opinion, but still, my best guess about your future opinion would have to be my current one.

Because if I had just heard you hint a moment ago at your opinion, if you had just spoken up about it, well then I already would have taken that very seriously, and moved a lot toward whatever that was.
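The property Robin describes, that your best guess of a truth-seeker's post-conversation opinion must equal your own current opinion, follows from the law of iterated expectations. Here is a toy numerical check; the prior and signal accuracies are invented, and the model (a binary event with two conditionally independent 80%-accurate signals) is just one simple setting where the identity can be verified directly.

```python
# Toy model: binary event w, each party sees one noisy signal.
P_W = {1: 0.5, 0: 0.5}   # common prior over the event (hypothetical)

def p_signal(s, w):
    """P(signal = s | event = w): signals match the event 80% of the time."""
    return 0.8 if s == w else 0.2

def posterior(my_sig, your_sig=None):
    """P(event = 1 | observed signals), via Bayes' rule."""
    weights = {}
    for w in (0, 1):
        p = P_W[w] * p_signal(my_sig, w)
        if your_sig is not None:
            p *= p_signal(your_sig, w)
        weights[w] = p
    return weights[1] / (weights[0] + weights[1])

a = 1                 # my signal
mine = posterior(a)   # my current opinion

# My expectation of YOUR opinion after my statement reveals my signal:
# average P(event | a, b) over my beliefs P(b | a) about your signal.
p_b = {b: sum(P_W[w] * p_signal(a, w) * p_signal(b, w) for w in (0, 1))
       for b in (0, 1)}
norm = sum(p_b.values())
expected_yours = sum((p_b[b] / norm) * posterior(a, b) for b in (0, 1))

print(mine, expected_yours)   # both 0.8: no anticipated disagreement
```

The two numbers coincide exactly: even though I cannot predict which way your signal will push you, on average your updated opinion must sit at my current one, which is why knowingly anticipating the direction of a disagreement signals something other than pure truth-seeking.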
LUKE: It seems like no matter how good the theorem is, hasn’t it been radically falsified by real experience?
ROBIN: Oh, sure. The question is which of the assumptions of the theory to blame it on. The basic theory started off pretty fragile, so that gave you a lot of assumptions as plausible things you could blame. But with time, the theory has become more robust. The assumptions are weaker, mostly.
So I think we’re at a point where the assumption to blame is the assumption that we’re truth seeking.
LUKE: [laughs] Right.
ROBIN: So most people must not think most people are truth-seeking, or else they would agree a lot more. Well, now of course you could say, “Well yeah, most people aren’t truth-seeking. I’m the exception; that’s why it makes sense for me to disagree with them.” Now we have to wonder, “Well, OK, but what was your evidence for this exceptional nature of yourself?”
When it comes to their own exceptionality, people seem to be willing to accept remarkably weak evidence. Their attitude often is, “Well, you can’t disprove that I’m special, or better. If you can’t disprove it, then I get to keep thinking it, right?”
LUKE: Now Robin, getting back to how we might overcome our biases. Another thing that might help is to begin to understand why we have all these biases, and why we tend to make the same mistakes again and again, even when we try not to. So I wonder first of all, what are some of the mistakes that we make again and again, and then maybe later we can talk about why do we make all of those errors?
ROBIN: Well, the cognitive psychology literature is full of descriptions of the many kinds of cognitive biases that we have. In some sense, it has too many biases listed; sometimes they’re contradictory and cover all possible cases. But it has done a decent job, at least, of collecting lots of data suggesting lots of kinds of biases.
So clearly, for example, as I’ve mentioned, we’re overconfident, we tend to favor ourselves, we tend to favor things that we can understand. Then there’s just this long list of very particular ways in which we go wrong. Having that long list by itself isn’t very helpful, because it’s hard to remember it all and hard to apply.
A lot of the conditions for these things are not very clear: when, in fact, do they apply? But I think it does help to have a sense for where it all comes from. The simplest, earliest sort of theory about these biases is just that “the world is complicated, and we make mistakes.” And if that’s the way it was, then you would expect a lot of particular biases, but you wouldn’t expect them to point in any particular direction on average.
You would just expect the errors to pile up, and we would try to look for the errors and fix them, and we would do all right. So, for example, that theory wouldn’t predict that we would disagree. It wouldn’t even predict that we would systematically favor ourselves in these biases.
It’s just that sometimes we’d kick ourselves, sometimes we’d hurt ourselves; it would just be a big random mess. Certainly the world is complicated, and certainly we do make errors. Certainly we don’t get it exactly right. But I think the evidence is pretty overwhelming that these aren’t just random mistakes.
An awful lot of our mistakes are motivated; that is, they’re there for reasons, in that they help us. If our mistakes are motivated, then we expect a lot more of them. We also expect it would be harder to get rid of them; there would be a cost to getting rid of them. And we might try to pretend to get rid of them, but not actually really do so.
ROBIN: There’s a whole field of, as you probably know, evolutionary psychology focused on the kinds of ways in which we would expect minds to evolve. And part of that literature is about ways we expect certain biases to evolve. And one of the first things to just notice is that there are a lot of topics that we talk and have opinions about, that don’t seem to impact our personal lives very much.
Obviously, if we’re driving a car and have to turn right or left, we have to know whether the stoplight is red; or if we’re skiing, we have to decide if that’s a tree. There are a lot of very concrete decisions we make.
These are topics on which it seems like evolution would have prepared us to be relatively accurate, or at least to be looking for accuracy and attending to accuracy. Not that we’d never make a mistake, but at least we’d be trying, and that would be a useful thing; we’d roughly get it right.
But humans seem to also have this inclination to have opinions about and talk about things that are relatively far from their personal experience. Talk about cosmology or foreign aid or art, just a wide range of things where it’s not clear that having accurate opinions on the topics would actually be the most helpful thing.
ROBIN: So that’s a big red flag right there. Well then, you shouldn’t necessarily expect having some opinion on the subjects to be unbiased, or to be trying to target the truth. The truth is a narrow target. It’s hard to hit, you have to be trying.
LUKE: Another idea here is that, and I’ll anthropomorphize here, genes don’t care so much whether you get at the truth; they care whether or not they reproduce themselves.
ROBIN: Right. In addition you can look for deviations between incentives to have true beliefs and incentives to have other beliefs. So many social beliefs, for example, you can see we have incentives away from the truth or toward other things. “Do I look fat in this dress,” is one of those standard prototypical cases, where clearly the social incentives there may not be toward brutal honesty.
LUKE: So it sounds like where we’re coming from here is that, anthropomorphizing again, our genes have significant incentive to get certain things fairly close to the truth most of the time, like “Is that a tiger chasing me?” Or something like that. Or, “Are these edible berries in front of me?”
Whereas when we start talking about cosmology or social institutions or grand moral theories or grand metaphysical theories, the truth tracking incentives are probably going to be overwhelmed by some other kinds of incentives that are more directly related to reproduction.
ROBIN: Well, there’s certainly a wide range of topics where you suspect that. And once you have the suspicion, the question is, well, what do you do about it? It could, of course, be that there are larger social processes that will make those social incentives move you toward the truth.
You can’t rule that out merely because they are social incentives. But it certainly raises a suspicion. If your biases are there exactly to gain other sorts of social rewards, you may decide that you want those social rewards, so let’s not mess with them. You might pause and ask, “Well, when do I ever want to be extra suspicious of my beliefs and the processes that produce them?”
LUKE: Well Robin, you’re so pessimistic about this project of overcoming our biases!
ROBIN: Am I? [laughter]
Think about it: the optimistic part of this is to say, well, we’re at a point where you can sort of see the whole problem all at once in this short, hour-long interview we’ve had. We’ve actually framed the entire puzzle, understood the basics of it, and have a basic idea of how to deal with it.
So if you do in fact want to overcome some of your biases, we are at a stage where you can. Pessimism might just be a caution to not assume too much or overreach. This is, I guess, the old man’s idealism, or the older person’s idealism. In some sense it’s usually, relative to youth, considered a little anemic, right?
The young person wants to change the world and the universe, all of it right now. The older person says, “If you work hard for 20 years, you might be able to do this one thing, and that would be really worth it.” Sometimes that seems to the younger person like, “What, you want me to work for 20 years and maybe just one lake will get cleaned? That’s it? I want to clean all the lakes in the whole world this year. Let’s get started!”
And the old guy says, “Sorry, that’s just not doable. If you want to do anything in the world, here’s the range of things you can do, here’s the thing you personally could do. And if you work on it this long you could actually do this and that would be good.” And to him, that’s optimism.
That says, “Hey, you can do something. Here it is. Let me tell you how. Get started.” And it only sounds like pessimism if you thought you could do a lot, lot more.
LUKE: And the good news that you’ve been sharing is that, hey, we do understand a lot of these processes now. We understand a lot about evolution, a lot about cognitive science. And we even understand a few things about what can be done to overcome bias, such as tracking records and betting markets and that kind of thing.
So on the flip side of the coin, that’s optimism. It’s just not youthful, change the world in a day optimism.
LUKE: Well, Robin, it has been a pleasure speaking with you. Thanks for coming on the show.
ROBIN: Thanks for having me.