Reading Yudkowsky, part 17

by Luke Muehlhauser on February 27, 2011 in Eliezer Yudkowsky, Resources, Reviews

AI researcher Eliezer Yudkowsky is something of an expert at human rationality, and at teaching it to others. His hundreds of posts at Overcoming Bias (now moved to Less Wrong) are a treasure trove for those who want to improve their own rationality. As such, I’m reading all of them, chronologically.

I suspect some of my readers want to improve their rationality, too. So I’m keeping a diary of my Yudkowsky reading. Feel free to follow along.

His 113th post is How to Convince Me That 2 + 2 = 3. Earlier, Eliezer wrote that “a belief is only really worthwhile if you could, in principle, be persuaded to believe otherwise.”

But then, what situation would convince Eliezer that 2 + 2 = 3? Eliezer offers the following:

I admit, I cannot conceive of a “situation” that would make 2 + 2 = 4 false.  (There are redefinitions, but those are not “situations”, and then you’re no longer talking about 2, 4, =, or +.)  But that doesn’t make my belief unconditional.  I find it quite easy to imagine a situation which would convince me that 2 + 2 = 3.

Suppose I got up one morning, and took out two earplugs, and set them down next to two other earplugs on my nighttable, and noticed that there were now three earplugs, without any earplugs having appeared or disappeared – in contrast to my stored memory that 2 + 2 was supposed to equal 4.  Moreover, when I visualized the process in my own mind, it seemed that making XX and XX come out to XXXX required an extra X to appear from nowhere, and was, moreover, inconsistent with other arithmetic I visualized, since subtracting XX from XXX left XX, but subtracting XX from XXXX left XXX.  This would conflict with my stored memory that 3 – 2 = 1, but memory would be absurd in the face of physical and mental confirmation that XXX – XX = XX.

I would also check a pocket calculator, Google, and perhaps my copy of 1984 where Winston writes that “Freedom is the freedom to say two plus two equals three.”  All of these would naturally show that the rest of the world agreed with my current visualization, and disagreed with my memory, that 2 + 2 = 3.

How could I possibly have ever been so deluded as to believe that 2 + 2 = 4?  Two explanations would come to mind:  First, a neurological fault (possibly caused by a sneeze) had made all the additive sums in my stored memory go up by one.  Second, someone was messing with me, by hypnosis or by my being a computer simulation.  In the second case, I would think it more likely that they had messed with my arithmetic recall than that 2 + 2 actually equalled 4.  Neither of these plausible-sounding explanations would prevent me from noticing that I was very, very, very confused.

What would convince me that 2 + 2 = 3, in other words, is exactly the same kind of evidence that currently convinces me that 2 + 2 = 4:  The evidential crossfire of physical observation, mental visualization, and social agreement.
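
To see what this “evidential crossfire” amounts to quantitatively, here is a minimal Bayesian sketch. It is my illustration, not Yudkowsky’s, and every number in it (the prior, the per-channel reliabilities, the independence assumption) is invented for the example:

    # Toy model of the "evidential crossfire" -- all numbers are
    # illustrative assumptions, not anything from Yudkowsky's post.
    #
    # Hypothesis H: my stored memory of the sum is wrong, and 2 + 2
    # really does equal 3. Three roughly independent channels report
    # the sum: physical counting, mental visualization, and social
    # agreement (calculator, Google). Assume each channel reports the
    # true sum with probability 0.99.

    prior_h = 1e-6           # extreme prior trust in my own memory
    reliability = 0.99       # assumed per-channel accuracy
    reports_of_three = 3     # all three channels say the sum is 3

    odds = prior_h / (1 - prior_h)
    for _ in range(reports_of_three):
        # Likelihood ratio per channel:
        # P(channel says "3" | H) / P(channel says "3" | not H) = 0.99 / 0.01
        odds *= reliability / (1 - reliability)

    posterior_h = odds / (1 + odds)
    print(f"P(memory is wrong | three '3' reports) = {posterior_h:.2f}")
    # ~0.49 even from a one-in-a-million prior; a fourth independent
    # confirmation would push it past 0.98.

Three agreeing channels multiply the odds by 99^3, nearly a million to one, which is why the same crossfire that convinces him of 2 + 2 = 4 today could, on the imagined morning, convince him of 2 + 2 = 3.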

The Bottom Line offers a complicated way of saying that your arguments about reality do not change the reality about which you are arguing.

What Evidence Filtered Evidence? concerns the dilemma you face when a clever arguer, who must say only true things, reveals only the pieces of evidence that support the false conclusion he wants you to believe:

According to Jaynes, a Bayesian must always condition on all known evidence, on pain of paradox.  But then the clever arguer can make you believe anything he chooses, if there is a sufficient variety of signs to selectively report.  That doesn’t sound right.

I’m not sure of Yudkowsky’s answer, but here’s what he writes:

Most legal processes work on the theory that every case has exactly two opposed sides and that it is easier to find two biased humans than one unbiased one.  Between the prosecution and the defense, someone has a motive to present any given piece of evidence, so the court will see all the evidence; that is the theory.  If there are two clever arguers in the box dilemma, it is not quite as good as one curious inquirer, but it is almost as good.  But that is with two boxes.  Reality often has many-sided problems, and deep problems, and nonobvious answers, which are not readily found by Blues and Greens screaming at each other.

Beware lest you abuse the notion of evidence-filtering as a Fully General Counterargument to exclude all evidence you don’t like:  “That argument was filtered, therefore I can ignore it.”  If you’re ticked off by a contrary argument, then you are familiar with the case, and care enough to take sides.  You probably already know your own side’s strongest arguments.  You have no reason to infer, from a contrary argument, the existence of new favorable signs and portents which you have not yet seen.  So you are left with the uncomfortable facts themselves; a blue stamp on box B is still evidence.

But if you are hearing an argument for the first time, and you are only hearing one side of the argument, then indeed you should beware!  In a way, no one can really trust the theory of natural selection until after they have listened to creationists for five minutes; and then they know it’s solid.
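
Yudkowsky’s two-box scenario can be turned into a small worked example of why a naive update on filtered evidence goes wrong. The model below is my own toy version (the sign counts and probabilities are invented for illustration): a box carries six binary signs, a clever arguer reports every sign favoring “diamond” and stays silent about the rest, and the listener either updates naively on the reported signs or models the filtering:

    from math import comb

    # Toy model of "What Evidence Filtered Evidence?" -- the numbers are
    # illustrative assumptions, not from the post. A box contains a
    # diamond with prior probability 0.5 and carries 6 binary signs.
    # Each sign points "diamond" with probability 0.7 if the diamond is
    # there, and 0.3 if it is not.

    P_PRO_IF_DIAMOND = 0.7
    P_PRO_IF_EMPTY = 0.3
    N_SIGNS = 6
    PRIOR_ODDS = 1.0  # 50/50 prior

    def naive_odds(k):
        """Update on the k pro-diamond signs the arguer mentions, as if
        they were a random sample of the box's evidence."""
        return PRIOR_ODDS * (P_PRO_IF_DIAMOND / P_PRO_IF_EMPTY) ** k

    def filtered_odds(k):
        """Model the filter: the arguer reports *every* pro-diamond sign,
        so hearing k signs means exactly k of the 6 point to a diamond."""
        like_d = comb(N_SIGNS, k) * P_PRO_IF_DIAMOND**k * (1 - P_PRO_IF_DIAMOND) ** (N_SIGNS - k)
        like_e = comb(N_SIGNS, k) * P_PRO_IF_EMPTY**k * (1 - P_PRO_IF_EMPTY) ** (N_SIGNS - k)
        return PRIOR_ODDS * like_d / like_e

    # An empty box still shows about 6 * 0.3 = 2 pro-diamond signs.
    k = 2
    print(f"naive:    {naive_odds(k):.2f} : 1 for the diamond")    # ~5.44 : 1
    print(f"filtered: {filtered_odds(k):.2f} : 1 for the diamond") # ~0.18 : 1

Hearing two true pro-diamond signs sounds like better than 5-to-1 evidence for the diamond, but once you account for the four signs the arguer chose not to mention, the same report is better than 5-to-1 evidence against it. This is also why the blue stamp in Yudkowsky’s answer remains evidence: the sign itself never stops carrying its likelihood ratio; what changes is what the arguer’s silence tells you.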

This all leads to Rationalization:

“Rationalization.”  What a curious term.  I would call it a wrong word. You cannot “rationalize” what is not already rational.  It is as if “lying” were called “truthization”.

…What fool devised such confusingly similar words, “rationality” and “rationalization”, to describe such extraordinarily different mental processes?  I would prefer terms that made the algorithmic difference obvious, like “rationality” versus “giant sucking cognitive black hole”.

Now you might be thinking, “All this Bayesian Judo is pretty cool and pretty impressive and looks really useful, but how the heck do I learn it? Is it going to take me half a lifetime to learn it if I’m not a prodigy who was reading Feynman at age 9 like Yudkowsky?”

Never fear, dear reader, for the next post is Recommended Rationalist Reading, featuring Yudkowsky’s own recommendations and those of many others. My own attempt to narrow down the recommendations on that page looks like this:

Jaynes’ Probability Theory: The Logic of Science is the “Bible” of this kind of Bayesianism, but it’s very difficult, and certainly not a starting point! An excellent starting point is Hastie and Dawes’ Rational Choice in an Uncertain World or, even easier, Sutherland’s Irrationality.

antiplastic February 27, 2011 at 1:32 pm

Is this guy for real? Ugh, what a cock-up of an argument.

Physical “addition” is not the same operation as numeric addition.

If I *physically* add a litre of milk to a pot with two litres of milk, I have no idea what the *numeric* total is until I perform the completely different operation of numeric addition.

Whereas I can numerically add 2 litres to 1 litre without even touching them, because physical “addition” is not the same operation as numeric addition. There is not even a requirement that the two things being added even be in the same galaxy. There is not even a requirement that the numbers being added represent any physical objects at all!

You *physically* added 2 pairs of earplugs, and later on there were only 3, and your first thought is that mathematics itself is fundamentally flawed? And not, I don’t know, maybe that one of them fell off, or you miscounted? Sheesh.

EY has no idea what he means when he says he can clearly imagine 2+2 being 3. What does that entail for 2+1+2? Are commutativity and associativity also annihilated by his misplaced earplug?

I’m sure his material on computer science and decision theory is brilliant, but I’ve not seen anything even remotely related to philosophy that wasn’t either a hash or a truism.

Alex February 27, 2011 at 1:52 pm

You *physically* added 2 pairs of earplugs, and later on there were only 3, and your first thought is that mathematics itself is fundamentally flawed? And not, I don’t know, maybe that one of them fell off, or you miscounted?

You missed the crucial part of his thought experiment: “Suppose I got up one morning, and took out two earplugs, and set them down next to two other earplugs on my nighttable, and noticed that there were now three earplugs, *without any earplugs having appeared or disappeared*”

Alex February 27, 2011 at 1:57 pm

EY has no idea what he means when he says he can clearly imagine 2+2 being 3.

He specifically states that he can’t conceive 2+2 equalling 3. He says that he can conceive a scenario in which he was *convinced* that 2+2 equals 3.

Alex February 27, 2011 at 2:23 pm

He specifically states that he can’t conceive 2+2 equalling 3. He says that he can conceive a scenario in which he was *convinced* that 2+2 equals 3.  

Eh, maybe I’m wrong about this. He says he couldn’t conceive a scenario that would make 2+2=3.

antiplastic February 27, 2011 at 4:17 pm

p.s. I just got eaten again.

Steven R. February 28, 2011 at 11:01 am

You missed the crucial part of his thought experiment: “Suppose I got up one morning, and took out two earplugs, and set them down next to two other earplugs on my nighttable, and noticed that there were now three earplugs, *without any earplugs having appeared or disappeared*”

Yep, I think you’re in the right.

antiplastic February 28, 2011 at 12:07 pm

OK, it looks like my eaten post is not coming back (of course, the one time I don’t keep a backup…)

You missed the crucial part of his thought experiment: “Suppose I got up one morning, and took out two earplugs, and set them down next to two other earplugs on my nighttable, and noticed that there were now three earplugs, *without any earplugs having appeared or disappeared*”

1) If there are only 3, then by definition one of them has disappeared! This is an example of when someone claims they can conceive of something, when in fact they only *imagine* that they can conceive of something. “Of course I can imagine some complete silences being louder than others. I listen to two absolute silences in a row and the second one is much louder!”

2) The temporal wrinkle is irrelevant to my devastating and obvious point that physical addition is not the same operation as arithmetic addition. I just added my pile of laundry to my roommate’s pile of laundry. OMG! Only one pile! 1+1=1!!!

His example simply isn’t an example of what he claims it is.

3) Mathematical facts are not isolated but are rather radically entangled with all other mathematical facts. EY has no idea what it is he’s being convinced of. If 2 and 2 are 3, can he still believe that 2 and 1 are 3, and therefore that 2=1? What about associativity and commutativity? If I count items as they are being “added”, 2… then 2 more, and you count those same items 3… and then one more, do we get different answers?

Of course not, because physical addition is not the same operation as arithmetic addition.

Rufus March 1, 2011 at 12:55 pm

Is this guy for real? Ugh, what a cock-up of an argument.

Physical “addition” is not the same operation as numeric addition.

If I *physically* add a litre of milk to a pot with two litres of milk, I have no idea what the *numeric* total is until I perform the completely different operation of numeric addition.

Whereas I can numerically add 2 litres to 1 litre without even touching them, because physical “addition” is not the same operation as numeric addition. There is not even a requirement that the two things being added even be in the same galaxy. There is not even a requirement that the numbers being added represent any physical objects at all!

You *physically* added 2 pairs of earplugs, and later on there were only 3, and your first thought is that mathematics itself is fundamentally flawed? And not, I don’t know, maybe that one of them fell off, or you miscounted? Sheesh.

EY has no idea what he means when he says he can clearly imagine 2+2 being 3. What does that entail for 2+1+2? Are commutativity and associativity also annihilated by his misplaced earplug?

I’m sure his material on computer science and decision theory is brilliant, but I’ve not seen anything even remotely related to philosophy that wasn’t either a hash or a truism.

I would like to second this comment. Frege, I think, presents some powerful reasons to reject an empirical/inductive foundation for mathematics. Beaney summarizes the point nicely:

“That 2 unit volumes of liquid added to 5 unit volumes of liquid make 7 unit volumes of liquid only holds if the volume does not change as a result, say, of some chemical reaction; and ‘+’, for example, does not mean a process of heaping up, since it can be applied in quite different situations… Induction itself, if understood as involving judgments of probability, presupposes arithmetic” (Beaney, M., 1997, “The Foundations of Arithmetic,” in The Frege Reader, Malden, MA: Blackwell, pp. 94–95).

This is not to say that Frege had the solution to this problem. His logicism ultimately failed. Nonetheless, there seems to be a circularity problem if one bases mathematical truths on a Bayesian analysis. No?
