FacingTheSingularity.com

by Luke Muehlhauser on November 25, 2011 in News

I’ve moved Facing the Intelligence Explosion to its own website: FacingTheSingularity.com.

I’m using the Melville theme for WordPress, but I need to modify it. Anybody wanna help?

Here are the changes I’d like to make to that theme:

  1. This header image above the blog title.
  2. Larger, darker body text, but the same side-margin width.
  3. No date, comments, or category mention at the bottom of posts.
  4. No newer/older posts links at the bottom.
  5. No italics for the blog title or the links in the footer.
  6. ‘Next chapter’ link at the bottom of each post, which will lead to the next most recent post (see the sketch below this list).
  7. Home page shows the oldest post (‘Preface’).
  8. ‘Contents’ page (archives page, renamed) auto-generates a numbered list of all posts, in descending order of age (i.e., oldest post first).
  9. ‘RSS’ link in the footer (which links to the RSS feed, not to a page).
  10. This elaborate dinkus below the blog title.
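
For item 6, something along these lines in a child theme’s functions.php would probably do the trick. This is only a rough sketch, not necessarily how it ended up being implemented: the ftts_next_chapter_link function name and the next-chapter class are made up for illustration, and the markup would need tweaking to match Melville’s styles.

    <?php
    // Rough sketch: append a "Next chapter" link (the next post in
    // chronological order) to the bottom of each single post, without
    // editing the theme's single.php template.
    function ftts_next_chapter_link( $content ) {
        if ( is_single() ) {
            $next = get_next_post(); // next post by publish date, if any
            if ( $next ) {
                $content .= '<p class="next-chapter"><a href="'
                    . esc_url( get_permalink( $next ) ) . '">Next chapter: '
                    . esc_html( get_the_title( $next ) ) . '</a></p>';
            }
        }
        return $content;
    }
    add_filter( 'the_content', 'ftts_next_chapter_link' );

Hanging the link off the the_content filter keeps the change in a child theme rather than in edited copies of Melville’s template files, so theme updates won’t wipe it out.
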
Update: Huon Wilson jumped in and has already implemented the changes! Wow!

{ 13 comments }

Zeb November 25, 2011 at 1:53 pm

Why no comments? That will be much less interesting.

Luke Muehlhauser November 25, 2011 at 1:58 pm

I’ll make an external page for comments; I don’t want the clutter of comments on that site.

dbaupp November 25, 2011 at 7:39 pm

I can have a go at this.

(I’ve done two mock-ups very quickly: http://oi42.tinypic.com/2w6rxi0.jpg and http://oi43.tinypic.com/hwjn80.jpg)

Dante November 26, 2011 at 6:33 am

The header image is 200 KB, instead of a few tens of KB, when converted to JPG :(

nope November 26, 2011 at 9:35 am

“I don’t want the clutter of comments on that site.”

In other words, you don’t want people to shoot down your bizarre hypocritical nerd fantasy world of the singularity. Allow me to be blunt, because you don’t seem to respond to the criticisms other people make of your ideas, which makes me think you are a bit of a fake when it comes to rationality.

You are committed to rationalism for some ideas, while committing yourself to philosophizing about techno-utopian ideologies in others (notice I said philosophizing, because really, when it comes down to it, you and some of the more vocal LessWrong types are philosophers, not engineers. All talk and argument, little action. BTW, what has Eliezer or yourself produced in the world of AI?).

If you were as committed to rationality as you say you are, Luke, you’d apply it to yourself and the LessWrong Singularity crowd with a little more frequency. Somehow, I don’t think you will (especially with your latest comments over at LW about wanting to ‘convert’ others to whatever flavor of the week psychology you’ve been reading).

Kellyn November 26, 2011 at 8:03 pm

@nope,

I guess if you want to shoot down Luke’s “bizarre hypocritical nerd fantasy world of the singularity,” you’ll have to do it on the external comments page, or here, or on LessWrong. That hardly seems like censorship.

In what sense do you consider belief in the Singularity irrational? Or hypocritical, for that matter?

As for what Luke has accomplished in the field of AI, well, he’s been working at SingInst for all of four months, right? If he were the world’s most competent AI researcher and completely right about everything, what would you expect him to have accomplished?

As for what Eliezer has accomplished, that’s a very reasonable question, since he’s had 10 years. It seems to me that developing the entire concept of Friendly AI, founding SingInst, writing the papers available there, especially Timeless Decision Theory, and founding the LessWrong community with two years of groundbreaking philosophy are fairly impressive accomplishments. I am aware that lots of people disagree with this assessment, and I am withholding judgement on the issue until SI publishes in some academic journals and demonstrates that their contributions are valuable to their field.

What specific criticism do you think Luke has not responded to?

mpg November 27, 2011 at 12:29 am

I have no idea on this earth whether this Singularity business is credible or not. I would need a more precise definition of intelligence (I guess I should investigate). But, if the probability is significant, I guess I feel more comfortable knowing that there are people out there taking the possibility seriously. I wish Luke and his community well.

Beelzebub November 27, 2011 at 1:47 am

@mpg

A great place to sit back and listen to the founders and experts attempt to convince you is the page Luke linked to a few posts back:

http://lesswrong.com/r/discussion/lw/8hl/comprehensive_list_of_all_singularity_summit/

Kurzweil himself is particularly adept at shooting down objections. The more I listen to him, the more I respect him. He’s taken more than his share of vicious attack, yet he’s still calm and reasoned. I guess that’s why he’s Ray Kurzweil.

I have to admit, I had never seen the Watson Jeopardy contests. They are truly amazing. They show that computational achievements will track the exponential curve of increasing power, and that the moment you become complacent in your view of what is and isn’t possible, you will be surprised.

Luke Muehlhauser November 27, 2011 at 5:14 pm

Note that Kurzweil’s singularity is different than the singularity I’m talking about – but I’ll get to that eventually.

MarkD November 27, 2011 at 11:26 pm

freddy November 29, 2011 at 5:48 pm

“Note that Kurzweil’s singularity is different than the singularity I’m talking about – but I’ll get to that eventually.”

How many singularities are there altogether?

Kellyn November 29, 2011 at 8:40 pm

@freddy: Three, at least according to Yudkowsky: http://yudkowsky.net/singularity/schools

Adito December 1, 2011 at 4:11 pm

You mention on that page that this will happen in the next 100 years. Didn’t you decide that this kind of optimism made AI people sound pretty kooky?
