
Never take the arguments of a side from its opponents' mouths.

The arguments I offered have nothing to do with any of the three he claims they all boil down to.

If you think I made one of these three, please tell me which one so I can clarify the argument.

Assuming it's a side effect of processing (known as an epiphenomenon) immediately commits you to answering the question: does a badly programmed computer have a form of consciousness? Does a thermostat have a primitive form? Is it specifically impossible to create an AI which emulates human thinking to the last detail but has no consciousness, i.e. really is just an empty machine with zero experience? Is that an impossible task which could not be achieved by anyone by any means?

Suppose I debate with someone who has a computer programmed to be conscious. Here's what I'm going to do. I'm going to very, very slightly change the programming so that whatever output it's producing which, my opponent claims, proves the computer is conscious starts to degrade.

I'm going to do that, then ask my opponent: still conscious? I'll guess my opponent will say "less so, perhaps", which would be his best reply.

Then I'm going to repeat until I get a "probably not" and then a "no" from him, which by his own hypothesis has to happen.

Then I'm going to diff the conscious program and the unconscious program and ask him if he really thinks those slightly altered lines of code are the difference between consciousness and a humdrum computer.

Because that's where this goes, this idea that a certain type of computation is consciousness.

It also leads to consciousness being granted to a machine like a Turing tape. You may not think that squishy biological matter should be bequeathed with a "magical" property which hosts consciousness, but tell me, how do you feel about a Turing tape?




> I'm going to very, very slightly change the programming so that whatever output it's producing which, my opponent claims, proves the computer is conscious starts to degrade.

[...]

> Then I'm going to diff the conscious program and the unconscious program and ask him if he really thinks those slightly altered lines of code are the difference between consciousness and a humdrum computer.

Is that not equivalent to giving a human being alcohol, observing that they become progressively less conscious, and asking if you really think that a few centiliters of alcohol is the key to consciousness?


It is somewhat, yes. Or I could cause gradual cell death in someone's brain. Same idea.


Doesn't that kind of reasoning lead you down a path towards panpsychism or panexperientialism?

Either there is a phase transition of consciousness or there is not. If there is, we have no idea where it lies, because we can't prove that another being has subjective experience the way we can for ourselves. If there isn't, then something roughly panexperientialist follows, and even, say, a gas cloud has (very occasionally and in a very limited way) experience. But which is it?

The problem for science itself seems to be that we can't make any comparison of the "what it is like to be" sense of experience. I experience things right now. I can't tell you what it's like with the kind of certainty that is usually associated with science. I can't even tell my future self with that certainty, because memory is a sense in itself, and when I recall something, I'm just experiencing something with the sense of memory.

If whatever experience is can't be "frozen", then science has nothing to work on, apart from trying to get at it from the objective side of things. But it seems like it's very easy to get sidetracked, hence the argument that Dennett just redefines consciousness as executive function and then proceeds to explain the latter in a materialist framework.


My replies to each of your comments, in order:

Panpsychism or panexperientialism can't be right because they're not weird enough, to paraphrase Bohr. Would it surprise anyone to find out that, in our exploration of the brain, we come across something as weird and as upsetting to standard theory as QM is to physics?

__________

If we do a gradual, brain-cell-by-brain-cell replacement of a living human's brain over a long time, that human's self-report is our best bet for getting around the impenetrability of the subjective experience of other minds. It is also the biggest challenge to people like me, and it could point strongly to consciousness as something supportable by machines.

__________

I agree that this is a problem that science, as it is right now, can't deal with. But that doesn't mean it's not real. The Big Scientific Inquiry, the spirit of science, seeks to explain and understand everything. Many really dramatic upheavals come out of corner cases in science: the things that are slightly off or not accounted for in an otherwise productive theory.

__________

It's not important to anyone's brain research, but it is important to society because making a mistake about what is and is not conscious has the potential for huge negative repercussions.

When Dennett dismisses the issue and effectively assumes the consequent of the argument he's supposedly engaged with, not only is he making an error, but the consequences of that error reach far into how we act towards one another.

What I am arguing, to the extent I am arguing for anything, is that people like Pat Churchland have a point, and it's not an "academic" one; it's substantive. We are making a mistake if we ignore it.



