Could you elaborate? How can it wreck one's thinking? I mean, maybe I'm wrong, but you're writing as if it is dangerous for the untrained to read.
I agree with the singularity being like a religion. I think the singularity is nonsense. On the other hand, I don't think the rest of your accusations are accurate.
First, I'm not sure I would name Yudkowsky the leader of singularity culture. Of LessWrong, certainly, but LessWrong is dead. Yudkowsky is about as much a leader of the singularity stuff as P.G. is of "hacker" culture: a big name with writings on the philosophy and the culture. Honestly, I would name Kurzweil as a far bigger figure w.r.t. singularity stuff. He is far more widely known.
Furthermore, nowhere does Yudkowsky say anything like "if you follow me you'll be saved". In fact, he probably believes the singularity will save everyone (if the AI is friendly).
This is where I expect you to bring up Roko's Basilisk. For those unaware, Roko's Basilisk is the idea that an AI may retroactively punish people who didn't assist in its development; therefore, if you don't want to suffer, you should donate money and time to creating AI.
First of all, this idea was created by a guy called Roko, not Yudkowsky. Yudkowsky doesn't believe it. Yudkowsky's response, among other things, was to say that it was not ethical of Roko to write about the basilisk if Roko really believed it to be true.
Kind of like how atheists will sometimes say that Christians shouldn't proselytize to people who have never heard of Christianity, because if they don't know about it, God can't very well send them to hell for their disbelief.
Anyway, I am interested to hear why you felt so strongly that you had to warn us off him. The hate Yudkowsky gets is something I don't understand but want to.
I was gullible enough once to eat up his philosophy, because he's not trivially wrong. He takes credit for developments that aren't his and presents good ideas mixed in with nonsense, and you just swallow it all naively unless you've previously thought the stuff through yourself. His casual dismissing of the mainstream can become your style as well, and you can feel you become part of something special, something that transcends most people's levels. It's really not unlike Scientology.
Sorry if I come across as warning too much. It probably depends on your personality type: if you are a very reflective and self-critical person, you can tie yourself up in his philosophy, I can tell you that much. Now, granted, the consequences of something don't tell us much about the truth of that thing (just as "God exists because if God exists then I feel safe and happy and my life has meaning" is a bad argument). Still, keeping in mind that we are discussing whether a novice should be advised to read Yudkowsky, I think there are better choices.
I absolutely think that if you're smarter than average but lack factual, lexical knowledge, then reading his stuff can be detrimental to your intellectual development. You're better off without it. Yes, you can make the case for going through such experiences, just as you can make the case that going through drug addiction and recovery can make you a better or stronger person in some sense, but as a first approximation it's better to stick with mainstream literature. That's the main point I'm trying to argue for.
And I have to add that it's weird and frustrating to argue about this, because obviously those defending his ideology are smart enough to come up with good counterarguments. It's nothing like debating with, say, Young Earth Creationists. These people (perhaps including you) are often intelligent, tech/math/CS-literate people. The point of disagreement is of a finer nature that is hard to even discuss, because that whole aspect is usually dismissed as "useless philosophy".
It's somewhat akin to https://xkcd.com/793/ where someone who masters one technical field feels an immense power or superiority and becomes a bit smug.
Feynman had the cred to be able to say "If you don't like it [the way nature works], go somewhere else; to another universe maybe", but talking in this kind of confident way requires a very solid background. I'm not talking about prizes, simply actual results. And I know it's difficult due to the nature of the topic (studying a hypothetical unknown-probability event that can wipe out everything), but this sort of cop-out is just not convincing.
> His casual dismissing of the mainstream can become your style as well, and you can feel you become part of something special, something that transcends most people's levels. It's really not unlike Scientology.
Um. You might want to read up on what Scientology is; they're not even remotely in the same league. For example, Yudkowsky has never been accused of war crimes. Also, his brainwashing technique seems to be a bunch of essays, rather than, say, a biofeedback device coupled with personalized verbal abuse and a systematic dismantling of people's personal relationships.
Becoming part of a special in-crowd who will save the world is the similarity, not the torture and crimes, of course. I've seen and read quite a lot about Scientology, and you are right that they are nowhere near the same league. Scientology is a massive and extremely rich organization that does some very bad things on purpose.
>It's somewhat akin to https://xkcd.com/793/ where someone who masters one technical field feels an immense power or superiority and becomes a bit smug.
That's what really put me off the LW community as a whole: that condescending attitude of superiority over anyone not in the in-group. There was a telling comment in one of the articles that went along the lines of:
"Oh yeah, LW is not for everyone. It takes a particular kind of person to discuss the intricacies of Newcomb's problem day in and day out."
That made me cringe so much when I read it. Like, jeez, just get over yourself, no one cares how "smart" you are.