Well, okay...

There are at least two important facts to consider when contemplating a conscious entity's perception of reality as an individualized experience.

  1. A distinct entity's experience is owned with individuality as a singleton object, while the entity persists in a recognizable, and integral state. (the thing is the thing while it's still the "same" thing, and remembers that it's the thing)

  2. Consciousness is not a characteristic constrained in any way to biology or any other preferential form of implementation. ("things" aren't necessarily people, or even animals or plants)
I'll never deny the concept of Boltzmann Brains or the potential for artificially intelligent sentient beings born out of man-made (or any other form of) electronics.

I'll happily recognize that nuclear medicine has shown that the atomic-scale constituents of our corporeal forms turn over completely in less than a decade; that by the time we reach 20 years of age, we are effectively replaced by entirely different matter.

With all these ideas in mind, I don't take lightly the idea of jumping the gap from one state of existence to another.

At astronomical scales, life seems to be very rare. If life actually is rare, relatively speaking (considering the ratios of void to matter, of matter to life, and of life to self-aware, perceptive, communicative intelligence), then we must approach this whole scenario with the mindset that it is very difficult to bring living entities about, and very easy to render the living dead and inert.

I know that I was dead before I was alive, because there was a period of time pre-dating my existence. The period before my birth can be regarded as a period when I was dead. Given that fact, it's not that I fear death, or wring my hands about my "soul." I was once dead before, and so I shall be again. Once that truth is squared away, the only thing left to worry about is getting swindled out of this only life I'm capable of living and remembering at the moment.

I am the medium, and I care about the otherwise ambivalent information. Me.

There's a major reason why I think it's dangerous to casually bandy about the idea of swapping out neurons with electronics, however small and unobtrusive, however perfect a replacement. It's the same reason why we'd hesitate to firebomb a city of humans and replace it with a city of robots.

Economically, the robot city performs as well as, or better than, the human city it replaces, but it is likely to be fundamentally and profoundly different from the human settlement it has supplanted. The country and its global trading partners won't feel the difference, and human life goes on everywhere beyond the robot city without skipping a beat.

The broader scope of human civilization will survive, even if we trim a whole city from the herd. Thus, the decision to eliminate the city is obviously a simple one, isn't it? But then, there's also the ethical concept of robbing the human inhabitants of their own agency, simply because we can easily replace them with an equivalent. Chopping out our own brains, and substituting them with what might be little more than an elaborate textile mill is questionable behavior indeed.

We haven't gotten to the bottom of what life is yet, so I don't think we should be so trigger-happy about replacing it, even in exchange for curing all our mortal foibles.

At the moment, all I'm prepared to identify about life, and the life I lead, is that my sense of self rides on top of a collective of amoeba-like cellular animals. My sense of self is unaware of this community of cellular creatures (neurons) without the assistance of surgery, microscopes, and unwitting or unwilling sacrificial third-party predecessors as test subjects. Everything other than these curious neuron creatures seems to be optional to my sense of self (arms, legs, heart, lungs, liver...). After that, the rest of my existence is, pretty much, mysterious so far.

I think we have a lot of ground to cover before we start killing off our neurons. Even if they're bound to die off anyway.

There are likely to be other packs of real, live, human neurons still roaming about, after mine die off, and if I leave behind a heap of machines as a placeholder for what might no longer actually be me, how will I know for sure that I'm not bequeathing a huge, serious, weird problem for my normal human peers after I'm gone?
