Reading Yudkowsky, part 43

by Luke Muehlhauser on June 8, 2011 in Eliezer Yudkowsky, Resources, Reviews

AI researcher Eliezer Yudkowsky is something of an expert at human rationality, and at teaching it to others. His hundreds of posts at Less Wrong are a treasure trove for those who want to improve their own rationality. As such, I’m reading all of them, chronologically.

I suspect some of my readers want to “level up” their rationality, too. So I’m keeping a diary of my Yudkowsky reading. Feel free to follow along.

His 353rd post is No Safe Defense, Not Even Science:

Of the people I know who are reaching upward as rationalists, who volunteer information about their childhoods, there is a surprising tendency to hear things like:  “My family joined a cult and I had to break out,” or “One of my parents was clinically insane and I had to learn to filter out reality from their madness.”

My own experience with growing up in an Orthodox Jewish family seems tame by comparison… but it accomplished the same outcome:  It broke my core emotional trust in the sanity of the people around me.

Until this core emotional trust is broken, you don’t start growing as a rationalist.

But you must also be able to break from the emotional comfort you might now have with Science. Eliezer had to break with Science in favor of Bayes.

But in Changing the Definition of Science, Eliezer says he is not in favor of simply redefining Science to mean Bayesianism.

Next is a news post, and then Eliezer says Bayesianism is Faster Than Science, and illustrates this with a post about Einstein’s Speed, followed up by That Alien Message.

My Childhood Role Model tells a story of when Eliezer was disappointed by a childhood hero, and explains how reading science fiction can be helpful for futurism.

Mach’s Principle returns to quantum mechanics. A Broken Koan shares one of Eliezer’s own Zen Buddhist koans.

The next several posts build on the quantum mechanics sequence to argue that time does not “flow” at all, but is static.

Einstein’s Superpowers and Class Project encourage us:

What Einstein did isn’t magic, people!  If you all just looked at how he actually did it, instead of falling to your knees and worshiping him, maybe then you’d be able to do it too!

All these posts and sequences will eventually allow Eliezer to talk about his own area: Artificial Intelligence. But for now, he offers A Premature Word on AI:

I never expected AI to be easy. I went into the AI field because I thought it was world-crackingly important, and I was willing to work on it if it took the rest of my whole life, even though it looked incredibly difficult.

Let’s say that the problem of creating a general intelligence seems scary, because you have no idea how to do it. You could run away by working on chess-playing programs instead. Or you could run away by saying, “All past AI projects failed due to lack of computing power.” Then you don’t have to face the unpleasant prospect of staring at a blank piece of paper until drops of blood form on your forehead – the best description I’ve ever heard of the process of searching for core insight. You have avoided placing yourself into a condition where your daily work may consist of not knowing what to do next.

But “Computing power!” is a mysterious answer to a mysterious question.  Even after you believe that all past AI projects failed “due to lack of computing power”, it doesn’t make intelligence any less mysterious.

Eliezer goes on to explain what progress in AI looks like.

JS Allen June 8, 2011 at 10:42 am

OK, I can’t resist commenting on this one:

Of the people I know who are reaching upward as rationalists, who volunteer information about their childhoods, there is a surprising tendency to hear things like: “My family joined a cult and I had to break out,”

When you constantly say things like “reaching upward as rationalists”, is it any wonder that you attract people who are susceptible to cults? Am I the only one who finds the discussion boards at lesswrong.com to be downright creepy and cult-like?

Garren June 8, 2011 at 11:05 am

I’m still confused about people identifying themselves as ‘rationalists,’ given the inadequacy of reason that isn’t a peer to empiricism and philosophical presuppositions.

Reginald Selkirk June 8, 2011 at 12:49 pm

This description obviously does not apply to people who are already rationalists, and are thus not “reaching upward.”

antiplastic June 8, 2011 at 9:17 pm

No, JS Allen, you are not alone, but sadly I fear this ship may have already sailed.

One of the first things I noticed years and years ago about the half-dozen or so Fyfists running around the internet is the sort of proto-cult character of the discourse. When you argue against Kantians or utilitarians or error-theorists you never see the sort of eerie, robotic character in the prose you get from the endless repetitions of the same tired handful of empty slogans and tautologies and shibboleths (“desires are reasons for actions that exist”; “conceptual analysis is like arguing whether Pluto is a planet”; “people generally have reasons…”)

One place you do see this kind of thing is in the talk of the sycophants of Ayn Rand, the 20th century’s crackpot philosopher par excellence, or the deluded Scientologists’ talk of “suppressives” and “reactive minds”. It’s no coincidence that Luke has participated in and made some measured praise of the latter’s version of “rationality and awesomeness bootcamps”.

What I see in the move to Less Wrong is just a step up from the minors to the majors. In the new mythology, Fyfe is now a sort of John the Baptist figure whose metaethical “theory” heralds He Who Is To Come, the messiah who will deliver us from the Judgment of the Robo-pocalypse. Yudkowsky must be submerged in the Jordan River of Desire Utilitarianism to signify his anointing to his rightful place as the savior of mankind.

Yudkowsky’s manifest incompetence in technical philosophy, his utter lack of engagement with the professional computer science community, his petulant dismissal of even mild criticism from Luke, his endless invocation of the same handful of catchphrases repeated reflexively by his followers (“taboo your words!” “bayes!” “Bayes!!!” “BAYES!!!!”), the creepy interest these “bootcamps” seem to take in the moral and spiritual railroading of the participants — none of these things seems to register.

Just have a look at these Common Characteristics of Cranks, or the classic Baez Crackpot Index, and see how much of the Fyfist-Yudkowskist enterprise leaps out at you, like when everything comes together moments after Verbal Kint leaves the cop’s office.

Speaking as the most militant atheist you will ever meet, it would have been better if Luke had remained some sort of moderated Christian with a host of silly supernatural beliefs who had nonetheless managed to outgrow the cult-mentality of fundamentalism, than to lose the former while hanging on to the latter like we’ve seen over the last few months. It’s especially painful because when he first appeared on the scene he was one of the more articulate and passionate and promising advocates for “my” side, and now he’s gone all Anakin on us.

Hey, I had my share of silly post-deconversion beliefs “when I was that age” (sorry for the hoary rank-pulling cliche) that make me just cringe when people remind me of what I used to be so excited about. So it’s not like there aren’t years and years for this to work itself out. I rarely post here anymore, mostly because the cultish material which it’s pointless to try to argue with has more and more overshadowed the fun stuff, like debate reviews and creationism and stuff. I just have an “intuition” that Luke’s too fundamentally intelligent and good-willed for this not to all blow over in, say, five years or so. We’ll all be waiting.

Garren June 9, 2011 at 12:30 am

After reading more of Eliezer’s posts about ‘Science’ vs ‘Bayescraft’, it appears to me that he’s defining Science as only having to do with hypothesis testing…and only dull hypotheses at that. Meanwhile, he’s assigning inductive reasoning and originality to this other non-Science practice in which heroic non-mainstream thinkers see the truth about the world that dull scientists will take forever to finally notice.

Think I prefer Hugh Gauch’s approach of educating scientists on the general principles of scientific method, which do include everything Eliezer is lauding as better than science.

CharlesR June 9, 2011 at 7:00 am

Garren,

You’re not using the word ‘rationalist’ in the same way the community over at Less Wrong does. Here is how they define rationality.

Epistemic rationality: believing, and updating on evidence, so as to systematically improve the correspondence between your map and the territory. The art of obtaining beliefs that correspond to reality as closely as possible. This correspondence is commonly termed “truth” or “accuracy”, and we’re happy to call it that.

Instrumental rationality: achieving your values. Not necessarily “your values” in the sense of being selfish values or unshared values: “your values” means anything you care about. The art of choosing actions that steer the future toward outcomes ranked higher in your preferences. On LW we sometimes refer to this as “winning”.

The rest of the post is here.
http://lesswrong.com/lw/31/what_do_we_mean_by_rationality/

PDH June 9, 2011 at 7:15 am

Garren wrote,

I’m still confused about people identifying themselves as ‘rationalists,’ given the inadequacy of reason that isn’t a peer to empiricism and philosophical presuppositions.

Don’t you think that there’s an interesting sense in which Bayes unifies rationality and empiricism (as the terms were traditionally conceived)?

Garren June 9, 2011 at 9:31 am

PDH,

Rationalism is concerned with the knowledge we can have without needing to check with experience. It’s fairly common to consider math a domain that’s separate from experience, so I suppose any application of mathematical analysis to experience could count as combining this fruit of rationalism with empiricism. ‘Unifies’ would be going too far though.

It makes more sense to call one’s self a Bayesian than a rationalist, if experience plays a significant role. But beyond the internal-to-science debate about which inductive method is better, it makes more sense to identify with science than Bayesianism. (Where ‘making sense’ is a matter of communicating clearly with established terminology instead of being idiosyncratic.)

Maybe the problem is the lack of a positive word to identify people as supporting a scientific outlook who aren’t necessarily professional scientists. ‘Scientism’ would be a great term if it weren’t already used for, well, scientism.

CharlesR,

I was referring to ‘rationalism’ not ‘rationality.’ But maybe you meant to imply that Eliezer is appropriating the term ‘rationalism’ from its more specialized use in the history of Philosophy for any approach adhering to rationality. That would certainly explain why people would want to self-apply it who don’t approach learning about the natural world by pure reason.

‘Instrumental rationality’ is a mainstream view of rational action. I don’t think there needs to be a separate category for ‘epistemic rationality,’ since taking on the goal of truth-finding already covers that.

Adito June 9, 2011 at 3:45 pm

Antiplastic,

I think that’s a little too harsh. I agree there’s a sort of fellowship forming over at LW but it’s based entirely on avoiding systematic biases like group identification. It’s about getting to correct answers in spite of all our failings. So even if they do end up going overboard and become cultish (something Yudkowsky is explicitly trying to avoid) a self-righting mechanism is built in to correct for it.

The rest of your comments had several general criticisms, like Yudkowsky’s ignorance and Fyfe’s tautologies (the latter of which do seem to serve a purpose), but without more specific examples it’s hard to evaluate whether or not your arguments hold up. From my own reading, they appear to have interesting things to say and they manage to do it clearly, but perhaps I’m missing something that you are not.

Reidish June 9, 2011 at 5:06 pm

JS Allen: Am I the only one who finds the discussion boards at lesswrong.com to be downright creepy and cult-like?

No.

soupsayer June 9, 2011 at 6:06 pm

Adito wrote:
there’s a sort of fellowship forming over at LW but it’s based entirely on avoiding systematic biases like group identification.

I suppose group identification labels like “Less Wrongers” and “Less Wrongians” are benign enough if they are taken simply to refer to people who frequent the web site. Though it still seems that they were awfully quick to adopt these labels, and it is not so common a thing to see for most other web sites.

But there is little doubt, to me at least, that many “Less Wrongers” have elevated their own narrow values to a highest universal good. This self-exaltation, this elevation of “us” above others, is a kind of moral superiority that relegates other people to inferior levels.

From Epistle to the New York Less Wrongians:

“Stay on track toward what?” you ask, and my best shot at describing the vision is as follows:

“Through rationality we shall become awesome, and invent and test systematic methods for making people awesome, and plot to optimize everything in sight, and the more fun we have the more people will want to join us.”

The very particular monolithic value they seem to hold is that of “becoming a rationalist” or “growing as a rationalist”, and either of these will often be used interchangeably with the word “awesome”.

So where does that leave those who don’t consider rationality as the one path to awesomeness? Those who think that there is more to the human condition? Those who think there is more to wisdom? Those who don’t want to join “us” (them)?

JS Allen June 9, 2011 at 7:55 pm

@Adito – The sad thing is that you don’t even realize you sound like a cult member defending a fundamentalist leader. “Fellowship forming”, “self-righting mechanism”, “something the great leader is explicitly trying to avoid”. The whole thing is like a parody of itself — it’s an organization built around the cult of personality of a single leader, with its own set of shibboleths and hobby horses, distinctive language and culture, organized group activities, a collective name for themselves — and when we point this out, the defenders come out of the woodwork to tell us “it’s based entirely on avoiding systematic biases like group identification”.

It’s like you’ve rendered yourself incapable of seeing your group identification biases, simply by incanting the spell that “it’s based entirely on avoiding systematic biases like group identification”.

The message board is like Hogwarts Academy: the participants spend the entire time trying out various “rationality spells” on each other and back-patting one another for being “rational”. They spend so much time convincing themselves that they are superheroes of rationality that they literally become incapable of seeing their own biases. Whenever Eliezer does something dumbfoundingly clueless (like his response to Chalmers in the post on ‘Zombies’ that Luke recently linked), none of the Hogwarts seems to notice. “He couldn’t possibly be clueless. He taught me every spell I know! DON’T YOU KNOW HOW MANY TIMES HE HAS GENUFLECTED TO THE GOD OF RATIONALITY!?!”

jon milton June 9, 2011 at 8:40 pm

OK, I can’t resist commenting on this one:

When you constantly say things like “reaching upward as rationalists”, is it any wonder that you attract people who are susceptible to cults? Am I the only one who finds the discussion boards at lesswrong.com to be downright creepy and cult-like?

I do agree- although it is interesting that an ex-atheist Baptist is saying that! Seriously; how many books have you read on the OT, philosophy, etc. while you were an atheist? Although I don’t blame you for joining Christianity (Pascal’s wager is convincing, regardless of the evidence, which is pretty good), I am truly stunned by… well… BAPTIST?

CharlesR June 9, 2011 at 9:50 pm

I was referring to ‘rationalism’ not ‘rationality.’ But maybe you meant to imply that Eliezer is appropriating the term ‘rationalism’ from its more specialized use in the history of Philosophy for any approach adhering to rationality. That would certainly explain why people would want to self-apply it who don’t approach learning about the natural world by pure reason.

‘Instrumental rationality’ is a mainstream view of rational action. I don’t think there needs to be a separate category for ‘epistemic rationality,’ since taking on the goal of truth-finding already covers that.

Garren,

I wasn’t arguing one definition (or the other) is more right. The folks over at Less Wrong will be the first to tell you that such an argument is a total waste of time. I just wanted to point out that your definitions are different.

MarkD June 9, 2011 at 9:55 pm

@antiplastic:

Mostly in agreement except: “his utter lack of engagement with the professional computer science community.” I think there is some traction on that front:

http://www.springerlink.com/content/v4r27715g14w1581/
http://intelligence.org/upload/LOGI.html

I can certainly critique this kind of broad and rather vaguely formulated argument (I always try to have at least one empirical result in anything I do), but I think it is a contribution that merits a bit of consideration as an integrative review of where we are today. Idiosyncratic, yes, but not utterly unengaging ;-)

JS Allen June 10, 2011 at 12:00 am

@jon – I’ve been getting that a lot lately, and I don’t have a great answer. My move from atheism to Christianity was based on far more substantial reasoning than Pascal’s wager (and I thought Pascal’s wager was a terrible argument when I converted). But the affiliation with Baptist denomination was largely arbitrary. I guess I should think that through, huh?

Adito June 10, 2011 at 12:00 am

Soupsayer,

So where does that leave those who don’t consider rationality as the one path to awesomeness?

It leaves you simply as someone who disagrees. There are hundreds of posts over there about why they pursue goals in the way they do. You’re criticizing them for putting their reasoning into action without addressing the reasons themselves. This is unfair. It would make as much sense for me to criticize a Christian for the act of praying without mentioning why I think gods don’t exist.

Allen,

The sad thing is that you don’t even realize you sound like a cult member defending a fundamentalist leader

I’m registered over at LW but I’ve never even posted there so I don’t think that label applies. I simply admire their general goals.

it’s an organization built around the cult of personality of a single leader, with its own set of shibboleths and hobby horses, distinctive language and culture, organized group activities, a collective name for themselves

I don’t find anything objectionable about this list of characteristics except possibly the first (even that is fine so long as the personality encourages learning rather than blind obedience). In fact I think you’ll find the same characteristics in pretty much every group you can imagine.

They spend so much time convincing themselves that they are superheroes of rationality that they literally become incapable of seeing their own biases.

Can you point out some specific examples? Perhaps how Yudkowsky’s response to Chalmers fails and where people at LW refused to acknowledge it? If there’s a good reason to look at LW’s project with more skepticism then I’d like to hear it, but so far your argument hasn’t been specific enough in its criticisms to evaluate.

JS Allen June 10, 2011 at 12:25 am

I simply admire their general goals.

Then why didn’t you say that, instead of this:

There’s a sort of fellowship forming over at LW but it’s based entirely on avoiding systematic biases like group identification. It’s about getting to correct answers in spite of all our failings. So even if they do end up going overboard and become cultish (something Yudkowsky is explicitly trying to avoid) a self-righting mechanism is built in to correct for it.

That sounds like a lot more than simply admiring their general goals. I’ve rarely seen a more credulous endorsement from a committed cult member.

In fact I think you’ll find the same characteristics in pretty much every group you can imagine.

Wrong. Robin Hanson, and Katja Grace, and the old CSA have all been affiliated with LessWrong, but are completely different from LessWrong in the aspects I mentioned. In fact, almost every discussion forum I can imagine is completely different from LessWrong in those aspects.

There probably are some groups that share those undesirable characteristics with LessWrong, but I can’t think of any right now. Maybe you can help me.

Can you point out some specific examples? Perhaps how Yudkowskys response to Chalmers fails and where people at LW refused to acknowledge it?

Yeah, examples like that seem to occur in every second post. Sometimes I feel like responding, but then I realize that the room is filled with Stepford Wives and I figure that if he doesn’t even respond to a spanking from Chalmers, I would just be wasting my time. Apparently I’m not the only person who feels more comfortable posting criticisms of the LW cult over here on a now-defunct blog rather than getting overrun by Eliezer’s zombie army on LW.

Jon Milton June 10, 2011 at 1:23 am

@jon – I’ve been getting that a lot lately, and I don’t have a great answer. My move from atheism to Christianity was based on far more substantial reasoning than Pascal’s wager (and I thought Pascal’s wager was a terrible argument when I converted). But the affiliation with Baptist denomination was largely arbitrary. I guess I should think that through, huh?

Shit- I must have sounded like a total ass- sorry for being so rude! I think I may have drunk a bit too much before posting that message, ’cause I don’t remember it coming off as badly. Anyways, I noticed Luke’s interesting friends back on Less Wrong a while back, so I stopped going here. Too bad there aren’t a whole lot of good atheist blogs.

I do agree with you at least that the bible needs to be taken either all or nothing. There isn’t much of a middle ground. But then again, that’s why I’m an atheist.

BTW- I know Pascal’s wager is a stupid argument- but hey- eternal hellfire is not a pretty sight. I used to believe it- and I am still terrified by it. Sorry bro, and I’m happy for ya!

Garren June 10, 2011 at 9:16 am

CharlesR,

I wasn’t arguing one definition (or the other) is more right. The folks over at Less Wrong will be the first to tell you that such an argument is a total waste of time. I just wanted to point out that your definitions are different.

Most words are used to mean somewhat different things within an established range. I’m suggesting the use of ‘rationalism’ at LessWrong is outside that range. For the sake of clear communication, it’s worth making an effort not to do that.

I’m pretty sure the ‘waste of time’ response is meant as a way of accepting the fact of multiple, established meanings; not as an excuse to add to the confusion and dodge criticism for doing so.

“I don’t know what you mean by ‘glory,’ ” Alice said.

Humpty Dumpty smiled contemptuously. “Of course you don’t—till I tell you. I meant ‘there’s a nice knock-down argument for you!’ ”

“But ‘glory’ doesn’t mean ‘a nice knock-down argument’,” Alice objected.

“When I use a word,” Humpty Dumpty said, in a rather a scornful tone, “it means just what I choose it to mean—neither more nor less.”

JS Allen June 10, 2011 at 10:26 am

@jon – No, you were totally fair. It’s something I should think about more thoroughly. I’m not a Biblical literalist or strict inerrantist, and thankfully my church’s doctrine statement requires neither. We’re not like SBC-Baptist or Fred-Phelps Baptist. But I see things I like in denominations as different as Lutheranism and Eastern Orthodox. I need to think about my decision process there.

Garren June 10, 2011 at 12:28 pm

JS Allen,

May I recommend the Church of Christ sect, so you never have to listen to a worship band again?

Seriously, though, if you’ve written something up about your transition from atheist to Christian, I would like to read it. I didn’t see an obvious pointer on your blog.

Jon Milton June 10, 2011 at 9:17 pm

Garren: That would be great. You seem like a well read guy. Plus, I’d like to see the light- well, if it’s there anyways. I’ve done a lot of research into it- but I will admit that I don’t know everything. I’d like to see a new perspective.

Adito June 11, 2011 at 1:23 am

Allen,

In fact, almost every discussion forum I can imagine is completely different from LessWrong in those aspects.

You said “organization” the first time and “discussion forum” this time. Which one is more accurate? I would definitely agree that it’s a unique forum but not that it’s a unique organization. On that note…

There probably are some groups that share those undesirable characteristics with LessWrong, but I can’t think of any right now. Maybe you can help me.

Two things. First, could you explain why those characteristics are undesirable? And second, I don’t have to look any further than the nearest subculture to find a group with those characteristics. Examples off the top of my head are police officers, firemen, gamers, goths, the LGBT community, and philosophers.

Yeah, examples like that seem to occur in every second post

I need a clear example, ideally several, to make the sort of judgement you’re asking me to make. Pointing out that there are many examples does not help me.

I’ve read ~20 posts over there and found that I agree with most of what Yudkowsky is trying to get across. If there are general mistakes running through all or most of his posts that I haven’t noticed, then I’d like to know.

Garren June 11, 2011 at 10:50 am

Jon Milton,

Garren: That would be great. You seem like a well read guy. Plus, I’d like to see the light- well, if it’s there anyways.

Your comment seems a little confused. I was asking JS Allen if he had written up something about going from Atheist to Christian. I’ve only gone the other direction.

But I’m definitely with you on the attitude of always being open to hearing what ‘the other side’ has to say. Unless it’s Kantians.

Jon Milton June 11, 2011 at 3:31 pm

Jon Milton, Your comment seems a little confused. I was asking JS Allen if he had written up something about going from Atheist to Christian. I’ve only gone the other direction. But I’m definitely with you on the attitude of always being open to hearing what ‘the other side’ has to say. Unless it’s Kantians.

lol on the Kantians. No- you’re right- my post is written poorly. I was originally going to write it out like this:

Garren: That would be great
Allen: You seem like a well read guy. Plus, I’d like to see the light- well, if it’s there anyways. I’ve done a lot of research into it- but I will admit that I don’t know everything. I’d like to see a new perspective.

However, for some reason, I decided to drop the “Allen:” part. I should have identified him in this way instead. Sorry for the confusion.

JS Allen June 11, 2011 at 9:47 pm

@Garren, @Jon: I wrote a little bit about my thought process in this post, in response to ‘smiley’, who asked me to help him salvage his faith.

After caveating that post by saying that I was the last guy on earth to salvage anybody’s faith, I promptly proved the disclaimer by ignoring smiley’s response. I didn’t even notice his response until just now, since WordPress e-mail notification is flaky, and I think he misinterpreted a major point of my post. Oooops!

In that post, I explain the broad brushstrokes, but I don’t deal with specific arguments for/against atheism. I just nakedly assert that the arguments are inconclusive on either side, and go from there. So if you’re looking for an enumeration of arguments, you’ll be disappointed. There are multiple reasons for this omission:

1) I’m not trying to convince anybody. I’m not an apologist. I might become a troll if I think someone is being stupidly illogical, but I don’t think anyone can make a decision for/against atheism on logic alone. Whether it’s because of autism, sociopathy, or Calvinism, I figure that other people’s belief about theism is not my problem.
2) My opinion about certain arguments has changed over time, and I’m continually trying to understand counter-arguments. For example, when I was an atheist, I thought “problem of evil” was the most retarded argument for atheism ever. In the past few years, I’ve come to regard it as a much more potent argument. Conversely, I was 100% certain that compatibilism was the only rational position on free will. Now I am only 89.88% certain :-)
3) I mostly only blog about arguments that seem potent to me. So I’m not going to blog solid refutations of anything, when I feel that solid refutations are available. If someone wants to believe atheism for stupid reasons, that’s not my concern. This means you’ll see more posts toying with problem of evil or simulation (for example), whereas I just toss up my hands and say “think what you want” when it comes to compatibilism.
4) I cling to a perverse pride that most of the people I deconverted are still atheists. I had a number of arguments for atheism that I have never seen argued anywhere else, and which were stronger (IMO) than most current arguments. Until I see someone else making those same arguments, I don’t see a point in explaining why I abandoned them. They factor in my personal story, but not yours.

JS Allen June 11, 2011 at 10:22 pm

BTW, the above should be taken as a passive description of my thought process (since you asked), and not an arrogant presumption that my thought process is correct. My comments about “autism and sociopathy” should indicate that I’m not certain about the correctness of process.

The most interesting question to me is: What if someone can arrive at the right answer through the wrong process, or the wrong answer through the right process? Do you want to be the person who arrives at the right answer through the wrong process?

Garren June 12, 2011 at 10:57 am

JS Allen,

Thanks for the link. I didn’t ask so I could argue about it, but so I could read it, and I have now. Cool beans.

Do you want to be the person who arrives at the right answer through the wrong process?

It depends. Process is important to me, but I doubt all my values/desires/etc. are subordinate to it.

Jon Milton June 12, 2011 at 1:14 pm

JS ALLEN- Thanks for the story. It reminds me of… well… me, to be honest. I don’t mean that condescendingly- it’s just that I have pondered forever on what I think justified belief really is. Evidentially, I see no problem with either argument. But then I think to myself- and wonder if the most rational belief would be the one with the least consequences or not. I know it’s not really an argument- but I do deeply fear it. I don’t know why. I feel awful knowing that if Christianity is true (well, the versions that don’t teach some form of universalism), then the fate of so many I love is imminent. And there’s nothing I can do but watch the ignorant live out their lives saying ridiculous shit like “god just wants you to be a good person”. I don’t know how I’m supposed to cope with that.

That’s my current spiritual journey. There’s no knockdown argument or anything like that. Just mere pondering.
