Old 12-14-2009, 07:55 PM   #104
Xplo
 
Join Date: Apr 2006
Default Re: Is Transhuman Space a "silly" genre?

Quote:
Originally Posted by whswhs View Post
And it's not what I said. My continuity of existence is primarily my continuity of bodily existence. My consciousness is something that my body does. If a different body does it, it's a different consciousness, except under very specialized circumstances which do not occur in THS.
Funny, I don't remember you saying that before. But then, my memory is a sieve.

Nevertheless, you say here that "my continuity of existence is primarily my continuity of bodily existence", suggesting that it is secondarily... what? And I understand from your other posts that you would still consider yourself "you", and alive, if the majority of your body - say, everything from the neck down, leaving aside Loki's Wager - were replaced either with cybernetics or foreign tissue; is that correct?

Quote:
What I said was that the fact that memories can be ported from one AI to another very freely means that the personal identity of an AI cannot reside in its memories.
I don't think that anyone's identity resides solely in his or her memories. I think personality is also a factor. It is not enough that your ghost thinks it is Bill in order to assume Bill's identity; it must also think like Bill. For example, someone else talked about the possibility that their ghost could decide to "live it up" and waste all their money; to that I say, does that sound like something you'd do, and if so, why haven't you done it already?

To the extent that AIs are different from each other, they are individuals with individual identities. As long as they take their "personality" with them as well as their memories when they switch shells, I think they retain those identities. An AI is probably less likely to have a stubborn sentimental attachment to its initial shell, though.

Quote:
I don't see why you are tasking me with something that Ze' said.
Wasn't, really. I was commenting on mysticism when I happened to review the thread and see that Ze' was adding his own. Rather than go digging for the most representative bits to quote and reply to, I thought I would issue a general reply to the thread on the subject.

Quote:
Do you know the concept of qualia? They are thought by philosophers to be the essential hard point about objective accounts of consciousness: My sensation of yellow as I look at this screen, or your sensation of pain if someone hits your thumb with a hammer, are distinctively mine or yours. But I think that that has it backward! What makes my pain distinctively mine is not that it's hidden in the inner recesses of my mind, but that it's the pain of my body.
That's a quaint observation. The pain is yours because it forms part of your experience. It is one of the many things that shape and define you. If you were to cut off your arm and suffer phantom pain, or imagine pain in a dream, or suffer pain as the result of direct brain stimulation, it would still be your pain even though your body was taken out of the equation.

What is your body, anyway? Is it a doll-shape made of warm meat that senses temperature, pressure, and injury? If your personal identity relies on being able to feel pain with your own body, then replacing any part of you with cybernetics would be a fraction of murder. What if you were given an artificial leg; would you be 20% dead? Would you lose your old identity when you lost your old meaty perception?

Quote:
A disembodied intelligence might not have consciousness at all; if it did have consciousness, it might be consciousness without qualia. I'm not certain that a digital ghost would be conscious.
I assume that a ghost running in emulation, with such parameters as to render the ghost awake, would be capable of hearing itself think even if it were receiving no sensory input. Can you experience qualia in such a state? I don't know. If you experience doubt, or curiosity, or have an epiphany, are those qualia?

Quote:
I can know that other people are conscious; I can even perceive that other people are conscious. What I can't do is share their viewpoint.
Well, with the right hardware, you could. For instance, you could replace your vision with the vision of a person who's broadcasting their retinal data. Your viewpoint, visually at least, would be theirs. What then?

Quote:
Well, to start with, dogs don't have language, or anything that substitutes for language.
This is only true if you define "language" as requiring a sophisticated, standardized lexicon, conversational speech, oral tradition*, or some other feature that dog communication doesn't generally exhibit.

(*Or whatever form of communication replaces the role of speech.)

It's also rather irrelevant to the larger question of self-concept. We don't know if dogs have a self-concept, and we never really will know until we're able to read dog minds or one of them starts talking to us. What we do know is that they exhibit behavior appropriate to entities that do have a self-concept.

Quote:
There's also experimental research that seems to tap the idea of a self-concept. If you anesthetize a chimpanzee and paint his earlobes with red varnish, and after he wakes up you allow him to see a mirror, he'll look surprised and feel his ears; a baboon won't. That looks to me like a very rudimentary manifestation of the idea of "it's me."
Perhaps, or maybe it just demonstrates a difference in intelligence (one animal is able to recognize that an external image is a reflection of itself, and one isn't), or a difference in value systems. Maybe the baboon doesn't care if you paint his ears. We don't know.