Reading Yudkowsky, part 63

by Luke Muehlhauser on August 9, 2011 in Eliezer Yudkowsky, Resources, Reviews

AI researcher Eliezer Yudkowsky is something of an expert at human rationality, and at teaching it to others. His hundreds of posts at Less Wrong are a treasure trove for those who want to improve their own rationality. As such, I’m reading all of them, chronologically.

I suspect some of my readers want to “level up” their rationality, too. So I’m keeping a diary of my Yudkowsky reading. Feel free to follow along.

His 700th post is Wrong Tomorrow, which points to a site (that no longer exists) for tracking how often pundits are wrong. Next, Eliezer offers some thoughts on Selecting Rationalist Groups.

Eliezer then points out that Rationality is Systematized Winning, followed by Incremental Progress and the Valley, Extenuating Circumstances, Whining-Based Communities, and Mandatory Secret Identities.

Eliezer considers the controversy over whether or not it helps to send aid to Africa. Other short posts include Real-Life Anthropic Weirdness, Newcomb’s Problem Standard Positions, Of Lies and Black Swan Blowups, Rationality Quotes April 2009, Great Books of Failure, This Didn’t Have to Happen, Special Status Needs Special Support, Willpower Hax #487: Execute by Default, Open-Mindedness: the video, and What is Wrong With Our Thoughts.

There are also some news posts in here: 1, 2, 3, 4, 5.

Beware Other-Optimizing warns:

I’ve noticed a serious problem in which aspiring rationalists vastly overestimate their ability to optimize other people’s lives.

That Crisis Thing Seems Pretty Useful revisits how to create a crisis of faith.

The Unfinished Mystery of the Shangri-La Diet applies rationality to dieting, continued in Akrasia and Shangri-La.

Bystander Apathy and Collective Apathy and the Internet consider the well-known bystander effect.

Considering the community again: Bayesians vs. Barbarians.

Of Gender and Rationality tackles the problem: Why are rationalists overwhelmingly male?

In My Way, Eliezer explains that people are different, and the path to rationality cannot be the same for everyone.

In The Sin of Underconfidence, Eliezer encourages rationalists to be more confident.

Well-Kept Gardens Die By Pacifism argues in favor of certain kinds of censorship in internet communities. More on community: Go Forth and Create the Art!

Practical Advice Backed by Deep Theories admonishes:

But practical advice really, really does become a lot more powerful when it’s backed up by concrete experimental results, causal accounts that are actually true, and math validly interpreted.

The Craft and the Community summarizes that sequence. And then: The End of Sequences and Final Words.

But that doesn’t mean Yudkowsky stopped posting to Less Wrong. Next up is This Failing Earth, which actually ends on a note of hope:

It may be that in the fractiles of the human Everett branches, we live in a failing Earth – but it’s not failed until someone messes up the first AI.  I find that a highly motivating thought.  Your mileage may vary.


7 comments

orthonormal August 9, 2011 at 8:50 am

Typo: the post is “Whining-Based Communities”, not “Winning-Based Communities”.


soupsayer August 10, 2011 at 2:53 am

Having read through most of the links in this summary, I’m convinced that Yud is a sasquatchy version of Ayn Rand.

Goals can be part of the value equation, but is their mere achievement (winning) the primary use or benefit of rationality? If so, Yud and Charlie Sheen can have all the “winning”, or at least my share.

Yud wrote:
“I have behaved virtuously, I have been so reasonable, it’s just this awful unfair universe that doesn’t give me what I deserve.”

“Rationalists should win!” Not whine, win. If you keep on losing, perhaps you are doing something wrong. Do not console yourself about how you were so wonderfully rational in the course of losing. That is not how things are supposed to go.

This strikes me as a simplistic, arrogant, and childish way to approach life. Hey, why behave virtuously if it gets in the way of what you want?

If along your path to “winning”, you exercise the opportunity for compassion or sacrifice, you must just be an irrational loser. No wonder you didn’t get what you wanted, you failed to crush your competitors when you had the chance.

It is somewhat of a clichéd notion, but the more that I achieve, the more I believe that the journey really is the destination. The notion of winning that scary Ayn Rand types like Yud endorse makes for a frightening community. And Yud is the guy who is going to code friendly AI?

Yud wrote:
Said Miyamoto Musashi: “The primary thing when you take a sword in your hands is your intention to cut the enemy, whatever the means. Whenever you parry, hit, spring, strike or touch the enemy’s cutting sword, you must cut the enemy in the same movement. It is essential to attain this. If you think only of hitting, springing, striking or touching the enemy, you will not be able actually to cut him.”

Said I: “If you fail to achieve a correct answer, it is futile to protest that you acted with propriety.”

This is the distinction I had hoped to convey by saying, “Rationalists should win!”


Furcas August 11, 2011 at 6:54 pm

soupsayer,

Suffice it to say that you’ve completely misunderstood what Yudkowsky wrote. I mean, you couldn’t be more wrong if you were actively trying to misrepresent his words.


Skeptic August 11, 2011 at 10:20 pm

^ that. You have also made bold claims w/ underlying metaphysical assumptions that you didn’t even attempt to back up.

An error theorist could very well disagree with your claims that the end doesn’t justify the means (though he wouldn’t have to, necessarily). A utilitarian would also vehemently disagree with you. Seems to me like you’re putting too much stock in your initial intuitions.

Ayn Rand’s egoism is also quite different from Yud’s beliefs. Perhaps you should review your basics.


soupsayer August 12, 2011 at 3:11 pm

Furcas & Skeptic

I have submitted a response comment; however, it must have contained some content which caused it to be filtered, moderated, deleted, or unable to be posted for some other reason.


Jephalopod August 14, 2011 at 6:49 pm

Shameless off-topic spam: When is Conversations from the Pale Blue Dot coming back???


Luke Muehlhauser August 14, 2011 at 10:37 pm

Keeps getting delayed. Hopefully by the end of the year.

