> And after you explain that, explain how having something vaguely in common with religion automatically means it's wrong.
No one is saying it's wrong, only that the discussion isn't scientific.
It is not only unscientific and quasi-religious; there are also strong psychological forces at play that muddy the waters further. There are so many potentially catastrophic threats that adding "intelligence" to any of them seems superfluous. Numbers are far more dangerous than intelligence: the Nazis were more dangerous than Einstein; a billion zombies would obliterate humanity; a trillion superbugs are scarcely more dangerous for being intelligent or even super-intelligent. We intelligent humans are very successful for a mammal, but we are far from being the most successful species on Earth by any measure.
This fixation on intelligence looks very much like a power fantasy of intelligent people who want to believe that super-intelligence implies super-power. Maybe it does, but there are things more powerful -- and more dangerous -- than intelligence. The fantasy is palpable in internet forums discussing the dangers of AI, and it casts a strong irrational bias over the discussion, distracting us from less intelligent but possibly more dangerous threats. It is perhaps ironic, yet very predictable, that the people currently discussing the subject with the greatest fervor are the least qualified to do so objectively. It is not much different from poor Christians discussing how the meek shall inherit the earth. It is no coincidence that people believe future power will rest with forces resembling themselves; those of us who have studied the history of religions can easily identify the same phenomenon in the AI scare.