Reading Yudkowsky, part 49

by Luke Muehlhauser on June 26, 2011 in Eliezer Yudkowsky, Resources, Reviews

AI researcher Eliezer Yudkowsky is something of an expert at human rationality, and at teaching it to others. His hundreds of posts at Less Wrong are a treasure trove for those who want to improve their own rationality. As such, I’m reading all of them, chronologically.

I suspect some of my readers want to “level up” their rationality, too. So I’m keeping a diary of my Yudkowsky reading. Feel free to follow along.

His 411th post is The Fear of Common Knowledge, about lying and knowing that another knows. My Kind of Reflection attempts to sum up Eliezer’s philosophical method, which is massively influenced by his work in AI.

Next is a post on The Genetic Fallacy:

In lists of logical fallacies, you will find included “the genetic fallacy” – the fallacy of attacking a belief, based on someone’s causes for believing it.

This is, at first sight, a very strange idea – if the causes of a belief do not determine its systematic reliability, what does?  If Deep Blue advises us of a chess move, we trust it based on our understanding of the code that searches the game tree, being unable to evaluate the actual game tree ourselves.  What could license any probability assignment as “rational”, except that it was produced by some systematically reliable process?

The genetic fallacy is formally a fallacy, because the original cause of a belief is not the same as its current justificational status, the sum of all the support and antisupport currently known.

Fundamental Doubts is a nice examination of the phenomenon of doubting. Next is Rebelling Within Nature:

you can’t fight Nature from beyond Nature, only from within it.  There is no acausal fulcrum on which to stand outside reality and move it.  There is no ghost of perfect emptiness by which you can judge your brain from outside your brain.  You can fight the cosmic process, but only by recruiting other abilities that evolution originally gave to you.

And if you fight one emotion within yourself – looking upon your own nature, and judging yourself less than you think you should be – saying perhaps, “I should not want to kill my enemies” – then you make that judgment, by…

How exactly does one go about rebelling against one’s own goal system?

From within it, naturally.

Eliezer tells the story of when he discovered his parents’ guidebook to parenting:

It described the horrible confusion of the teenage years – teenagers experimenting with alcohol, with drugs, with unsafe sex, with reckless driving, the hormones taking over their minds, the overwhelming importance of peer pressure, the tearful accusations of “You don’t love me!” and “I hate you!”

I took one look at that description, at the tender age of nine, and said to myself in quiet revulsion, I’m not going to do that.

And I didn’t.

My teenage years were not untroubled.  But I didn’t do any of the things that the Guide to Parents warned me against.  I didn’t drink, drive, drug, lose control to hormones, pay any attention to peer pressure, or ever once think that my parents didn’t love me.

In a safer world, I would have wished for my parents to have hidden that book better.

But in this world, which needs me as I am, I don’t regret finding it.

Probability is Subjectively Objective addresses a common dispute among Bayesians:

E. T. Jaynes, master of the art, once described himself as a subjective-objective Bayesian. Jaynes certainly believed very firmly that probability was in the mind; Jaynes was the one who coined the term Mind Projection Fallacy. But Jaynes also didn’t think that this implied a license to make up whatever priors you liked. There was only one correct prior distribution to use, given your state of partial information at the start of the problem.

How can something be in the mind, yet still be objective?

After a long discussion of how minds and calculators compute, Eliezer returns to the question:

Is probability, then, subjective or objective?

Well, probability is computed within human brains or other calculators.  A probability is a state of partial information that is possessed by you; if you flip a coin and press it to your arm, the coin is showing heads or tails, but you assign the probability 1/2 until you reveal it.  A friend, who got a tiny but not fully informative peek, might assign a probability of 0.6.

When you think about the ontological nature of probability, and perform reductionism on it – when you try to explain how “probability” fits into a universe in which states of mind do not exist fundamentally – then you find that probability is computed within a brain; and you find that other possible minds could perform mostly-analogous operations with different priors and arrive at different answers.

But, when you consider probability as probability, think about the referent instead of the thought process – which thinking you will do in your own thoughts, which are physical processes – then you will conclude that the vast majority of possible priors are probably wrong.  (You will also be able to conceive of priors which are, in fact, better than yours, because they assign more probability to the actual outcome; you just won’t know in advance which alternative prior is the truly better one.)

If you again swap your goggles to think about how probability is implemented in the brain, the seeming objectivity of probability is the way the probability algorithm feels from inside; so it’s no mystery that, considering probability as probability, you feel that it’s not subject to your whims.  That’s just what the probability-computation would be expected to say, since the computation doesn’t represent any dependency on your whims.

But when you swap out those goggles and go back to thinking about probabilities, then, by golly, your algorithm seems to be right in computing that probability is not subject to your whims.  You can’t win the lottery just by changing your beliefs about it.  And if that is the way you would be expected to feel, then so what?  The feeling has been explained, not explained away; it is not a mere feeling.  Just because a calculation is implemented in your brain, doesn’t mean it’s wrong, after all.

Your “probability that the ten trillionth decimal digit of pi is 4”, is an attribute of yourself, and exists in your mind; the real digit is either 4 or not. And if you could change your belief about the probability by editing your brain, you wouldn’t expect that to change the probability.

Therefore I say of probability that it is “subjectively objective”.
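The coin-and-peek example in that passage can be made concrete with a small simulation. The sketch below is mine, not Eliezer’s: it assumes the friend’s peek is a report that matches the true face 60% of the time, and it shows two observers of the very same (perfectly definite) coin ending up calibrated at different probabilities, 1/2 and 0.6, purely because they hold different partial information. The function name and the reliability parameter are illustrative assumptions.

```python
import random

# A toy illustration of "probability is in the mind": each coin's face is a
# definite fact, but observers with different partial information assign
# different, and differently calibrated, probabilities to that same fact.

def simulate(n_trials=100_000, peek_reliability=0.6, seed=0):
    rng = random.Random(seed)
    heads_total = 0
    peek_says_heads = 0
    heads_given_peek_heads = 0

    for _ in range(n_trials):
        coin_is_heads = rng.random() < 0.5   # the fact of the matter
        heads_total += coin_is_heads

        # The friend's "tiny but not fully informative peek": it reports the
        # true face with probability peek_reliability, else the wrong face.
        peek_heads = coin_is_heads if rng.random() < peek_reliability else not coin_is_heads

        if peek_heads:
            peek_says_heads += 1
            heads_given_peek_heads += coin_is_heads

    # You, with no peek, can do no better than the base rate: about 1/2.
    print("P(heads | no peek)         ~", heads_total / n_trials)
    # Your friend, conditioning on a heads-peek, is calibrated near 0.6.
    print("P(heads | peek says heads) ~", heads_given_peek_heads / peek_says_heads)

simulate()
```

Expected output is roughly 0.5 for the first line and 0.6 for the second; neither number lives in the coin itself.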

Next, Eliezer recommends Lawrence Watt-Evans’ Fiction and makes a news post.


MarkD June 27, 2011 at 8:48 pm

Despite my previous observations of why I generally don’t read LW, I did get excited by encounters with Marcus Hutter’s AIXI efforts seeded through LW and other sources. There is a certain obviousness to the notion that universal induction can be combined with optimality of agent responses to environmental signals, but Hutter’s work formalizes it in a nontrivial way. He also works it into a Theory of Everything that is worth reading: Hutter’s TOE. The critiques of NFL are particularly interesting.


TretiaK June 28, 2011 at 8:46 pm

These days this blog is becoming nothing but a mouthpiece for Less Wrong. You may as well just redirect your domain name to their blog, or Eliezer’s posts in particular.


cl June 30, 2011 at 11:04 pm

I don’t mind the Yudkowsky posts, really, but I do think the guy is overrated. It usually doesn’t take long to find an example of a Yudkowsky statement that fails. Sure, he says some things I agree with, but:

Once an idea gets into our heads, it’s not always easy for evidence to root it out. Consider all the people out there who grew up believing in the Bible; later came to reject (on a deliberate level) the idea that the Bible was written by the hand of God; and who nonetheless think that the Bible contains indispensable ethical wisdom. They have failed to clear their minds; they could do significantly better by doubting anything the Bible said because the Bible said it.

Hilarious. And that in an article promoting vigilance against fallacious reasoning! Sheesh. It seems to be an example of precisely that which he warns against in the next paragraph… go look if you don’t believe me.


TK July 1, 2011 at 5:03 am

cl, with due respect, you have not understood yudkowsky’s point, which is not that one should simply reject anything the bible says, but merely that to do so (shitty though it may be) would be better than hanging on to the idea that it has ethical wisdom, in light of the fact that it is so catastrophically wrong elsewhere.


cl July 1, 2011 at 9:29 am

…you have not understood yudkowsky’s point, which is not that one should simply reject anything the bible says…

Oh please. I didn’t think that was his point. How could I, when I referenced the following paragraph where he clearly states that is NOT his point?

…but merely that to do so (shitty though it may be) would be better than hanging on to the idea that it has ethical wisdom…

Yes, exactly. That’s precisely the ironic, fallacious nonsense I’m talking about. Does he give any empirical evidence to support his truth claim? No. He just asserts it, apparently as a result of the same conceptual analysis decried elsewhere, then implies that “they” — notice the complete lack of nuance there — have “failed to clear their minds.” Is that cautious, calculated, rational reasoning, in your honest opinion? Because it’s a bunch of self-superior, pseudo-rationalist crap in mine.


Rufus July 1, 2011 at 1:35 pm

cl,

It usually doesn’t take long to find an example of a Yudkowsky statement that fails.

This is true. Initially I was excited to see Luke post these synopses. I was not familiar with EY and, as a student and teacher of rational thinking, I thought I might benefit from his writing and “level-up” along with Luke. It was not long before I found serious logical issues with EY’s statements, metaphysical assertions, linguistic obfuscations, proof surrogates, and arrogance. I continue to read it for the occasional pearls and to occasionally discuss issues with which I disagree. I find those discussions fruitful.

But, if you really want to level-up your rationality, start with Plato’s Theaetetus and Aristotle’s Organon, go to Frege’s Über Sinn und Bedeutung, then take on some Russell, Wittgenstein, C.I. Lewis, David Lewis, and Saul Kripke. I’m sure others could add to my meager list here. In fact, that would make for an interesting thread, “Which philosophers have helped you ‘level-up’ your rationality?”

I’ve noticed that you, cl, often ask why it seems so many issues are considered settled by this fantastic blogger. I suspect it may be due to the fact that he is an avid reader of textbooks. While textbooks offer great introductions to an area of study, they often present material as dead and uncontroversially settled rather than living, fruitful, and still debated. Perhaps this is why those biases tend to creep up on him. One should not assume that an issue is settled simply because some survey indicates that the majority of professional scholars tend to think a certain way, or because a textbook presents an issue as settled. Dig into the history of the arguments yourself and you will find brilliant responses on all sides of these issues. Dedicated study of these areas does not take months or years, but a lifetime (or more). I’ve been at it for a little over a decade and I am always surprised to discover how naive my understanding of an issue or argument can be. Humility is a mark of the wise person, and I think it would serve EY and his followers well to practice this virtue. It is to have the beautiful soul of Socrates, who knew that he did not know.

Finally, cl, I would agree that EY has committed a fallacy here. There is no need to come up with our own fancy neologism for this fallacy, as EY often does; it is simply a prime example of the genetic fallacy. Whether the ethical claims of the Bible are true stands independent of whether it is the inerrant word of God. Though I am not a Muslim, I would not presume to throw out all ethical claims in the Qur’an simply because I do not think it is inspired scripture. Rather, I would examine each claim in order to determine whether or not it contains good moral wisdom.

TK offers a methodological interpretation, i.e. that we should throw out all claims in the Bible since the accuracy of certain claims has been undermined. But this is precisely why the genetic fallacy is so dangerous. If we universalize this claim as good methodology, then we couldn’t trust any authority, since nearly all authority is fallible. Good luck reinventing the wheel. Perhaps that is why they are so busy reinventing wheels over at Less-Wrong. But consider this, I have posted at least a half dozen times regarding logical errors in EY’s claims. So if I am right, then EY has been undermined, and so by his own reasoning here, it would be methodologically better for Less-Wrongians to throw out everything he says simply because some of what he says is wrong. But even I would not presume to throw out EY’s babies with his bathwater – and I am obviously not a fan.

Pax,

Rufus



cl July 1, 2011 at 5:34 pm

Rufus,

You rock. I learn from every single comment you post. They are all measured and humble; I seriously aspire to where you’re at. For whatever reason, I just can’t purge the gadfly in me. I just get so sick of the same old fallaciousness masquerading as reason that I can’t NOT react the way I do. Oh well.

One should not assume that an issue is settled simply because some survey indicates that the majority of professional scholars tend to think a certain way, or because a textbook presents an issue as settled.

Yeah, I agree. That’s why I mock Luke in this regard. It’s a respectful mocking, don’t get me wrong, but I mean… let’s be real. Remember the Marcel Brass CPD? What are we to think when Luke — the ostensible champion of cautious rationalism — throws it all to the wind to score points for materialism? Marcel Brass specifically warned against taking too much liberty with the Libet experiments. Did Luke listen? Nope. Just like any other dogmatist, he latched mouth wide open onto the tit of “that which itching ears want to hear.” Anyways, enough of that.

Finally, cl, I would agree that EY has committed a fallacy here. There is no need to come up with our own fancy neologism for this fallacy, as EY often does; it is simply a prime example of the genetic fallacy.

That’s exactly it. I mean, what irony! This in a post about the genetic fallacy! This is why arrogance is so freakin’ dangerous. It doesn’t take much to puff a man up, and boy… watch out once it happens! This is seriously why I think most atheists I encounter are at a disadvantage when discussing religion, and especially deconverts. They get trapped into this idea that they’re all of a sudden more rational, and that their newfound opinions are supported by rationalism whereas theism is wholly supported by irrationalism. Oh! What a dangerous place to be! Brains are brains regardless of beliefs.

But consider this, I have posted at least a half dozen times regarding logical errors in EY’s claims. So if I am right, then EY has been undermined, and so by his own reasoning here, it would be methodologically better for Less-Wrongians to throw out everything he says simply because some of what he says is wrong.

But of course, neither EY nor anyone else with a rational brain in their head actually endorses this strategy, so we know that’s not going to happen.

Enjoy the weekend.


danso July 2, 2011 at 12:52 pm

But, if you really want to level-up your rationality, start with Plato’s Theaetetus and Aristotle’s Organon, go to Frege’s Über Sinn und Bedeutung, then take on some Russell, Wittgenstein, C.I. Lewis, David Lewis, and Saul Kripke.

Thank you, Rufus! The LW cult often loves to sneer at “older” philosophy as being irrelevant and being able to teach us little. Definite chronological snobbery.

Yet, how many of them really engage with Kripke, C.I. Lewis or Plato? Barely any of them and often with contempt. It makes one wonder if they understand what they dismiss.


Multiheaded July 6, 2011 at 5:09 am

Thank you, Rufus! The LW cult often loves to sneer at “older” philosophy as being irrelevant and being able to teach us little. Definite chronological snobbery.

Yet, how many of them really engage with Kripke, C.I. Lewis or Plato? Barely any of them and often with contempt. It makes one wonder if they understand what they dismiss.

What a remarkably measured, rational and fallacy-free comment.

