Go Back   Steve Jackson Games Forums > Roleplaying > Transhuman Space

Old 12-14-2009, 05:01 PM   #101
vicky_molokh
GURPS FAQ Keeper
 
Join Date: Mar 2006
Location: Kyïv, Ukraine
Default Re: Ghosts and Mind Copies - The Identity Question

Quote:
Originally Posted by whswhs View Post
(a) There's a terminological confusion here, which is partly my fault: I don't think that a dog has a worldview, but I do think that a dog has a viewpoint. For that matter, I think a flatworm has a viewpoint. Viewpoint is a question of simple sensory awareness; worldview is conceptual and requires a self-concept.

(b) I don't agree that viewpoint can't be scientifically detected. Viewpoint exists, ultimately, because communication does not take place at infinite speed or with infinite bandwidth, and therefore you can have separate nexuses of information processing. If two of them are engaged in modeling the physical environment, they will build models that differ, if only in which parts are at the highest resolution. But if one nexus's model includes the existence of the other nexus, and the fact that the other nexus is engaged in information processing, then its own modeling can take into account the fact that the other has a different model; it can, for example, request more detailed information from the other on something on which the other is better informed, or act in such a way as to induce the other to form an inaccurate model.

In short, I can know that other people are conscious; I can even perceive that other people are conscious. What I can't do is share their viewpoint. My viewpoint is created by the information I process; your viewpoint is created by the information you process. And ultimately, it's important that the information is being processed in different bodies; if I throw vodka into your open eyes, I may know that you're in pain but I won't be in your pain.

Bill Stoddard
I'm sorry for the confusion between viewpoint and worldview. I suspect I'm too sleepy for a coherent answer, but in short: my point about the scientific unobservability of viewpoints is a bit more complicated than that; it is partly related to the question of 'at which point does it cease', and I'll try to make my point clearer in the morning.
__________________
Vicky 'Molokh', GURPS FAQ and uFAQ Keeper
Also, GURPS Discord is a nice place for (faster) Q&A and overall GURPS discussion.
Old 12-14-2009, 05:05 PM   #102
sir_pudding
Wielder of Smart Pants
 
Join Date: Aug 2004
Location: Ventura CA
Default Re: Ghosts and Mind Copies - The Identity Question

Quote:
Originally Posted by whswhs View Post
(a) There's a terminological confusion here, which is partly my fault: I don't think that a dog has a worldview, but I do think that a dog has a viewpoint. For that matter, I think a flatworm has a viewpoint. Viewpoint is a question of simple sensory awareness; worldview is conceptual and requires a self-concept.
I really don't see how a dog could not have a self-concept. They seem to have a very clear idea of how they personally relate to other individuals and to the environment. They carry out actions intended to cause specific responses. I had a dog that used to fake a leg injury when he didn't want to do something. I'm not sure how he could do that if he wasn't able to conceive of himself.

If dogs don't have a worldview, I'm fairly sure I don't either.
Old 12-14-2009, 05:43 PM   #103
whswhs
 
Join Date: Jun 2005
Default Re: Ghosts and Mind Copies - The Identity Question

Quote:
Originally Posted by sir_pudding View Post
I really don't see how a dog could not have a self-concept. They seem to have a very clear idea of how they personally relate to other individuals and to the environment. They carry out actions intended to cause specific responses. I had a dog that used to fake a leg injury when he didn't want to do something. I'm not sure how he could do that if he wasn't able to conceive of himself.
Well, to start with, dogs don't have language, or anything that substitutes for language. And without language, you can't have more than the barest rudiments of concepts of anything. If I say "dog" you can think not only of any dogs that happen to be present now, or that might be present if you looked, but of dogs you remember encountering, of dogs you may encounter in the distant future, and of dogs that existed before your birth or after your death; it's an open-ended series. But language is part of what makes it possible. If you don't have concepts then you can't have a concept of yourself.

There's also experimental research that seems to tap the idea of a self-concept. If you anesthetize a chimpanzee and paint his earlobes with red varnish, and after he wakes up you allow him to see a mirror, he'll look surprised and feel his ears; a baboon won't. That looks to me like a very rudimentary manifestation of the idea of "it's me."

As to the dog faking an injured leg, I don't think that requires a concept of "I'll appear to be injured." I think it's simple pragmatic learning: "When I moved like this I got rewarded, so I'll move like this again." Take a look at the case of Clever Hans for how far this sort of thing can go.

Bill Stoddard
Old 12-14-2009, 07:55 PM   #104
Xplo
 
Join Date: Apr 2006
Default Re: Is Transhuman Space a "silly" genre?

Quote:
Originally Posted by whswhs View Post
And it's not what I said. My continuity of existence is primarily my continuity of bodily existence. My consciousness is something that my body does. If a different body does it, it's a different consciousness, except under very specialized circumstances which do not occur in THS.
Funny, I don't remember you saying that before. But then, my memory is a sieve.

Nevertheless, you say here that "my continuity of existence is primarily my continuity of bodily existence", suggesting that it is secondarily... what? And I understand from your other posts that you would still consider yourself "you", and alive, if the majority of your body - say, everything from the neck down, leaving aside Loki's Wager - were replaced either with cybernetics or foreign tissue; is that correct?

Quote:
What I said was that the fact that memories can be ported from one AI to another very freely means that the personal identity of an AI cannot reside in its memories.
I don't think that anyone's identity resides solely in his or her memories. I think personality is also a factor. It is not enough that your ghost thinks it is Bill in order to assume Bill's identity; it must also think like Bill. For example, someone else talked about the possibility that their ghost could decide to "live it up" and waste all their money; to that I say, does that sound like something you'd do, and if so, why haven't you done it already?

To the extent that AIs are different from each other, they are individuals with individual identities. As long as they take their "personality" with them as well as their memories when they switch shells, I think they retain those identities. An AI is probably less likely to have a stubborn sentimental attachment to their initial shell, though.

Quote:
I don't see why you are tasking me with something that Ze' said.
Wasn't, really. I was commenting on mysticism when I happened to review the thread and see that Ze' was adding his own. Rather than go digging for the most representative bits to quote and reply to, I thought I would issue a general reply to the thread on the subject.

Quote:
Do you know the concept of qualia? They are thought by philosophers to be the essential hard point for objective accounts of consciousness: My sensation of yellow as I look at this screen, or your sensation of pain if someone hits your thumb with a hammer, are distinctively mine or yours. But I think that that has it backward! What makes my pain distinctively mine is not that it's hidden in the inner recesses of my mind, but that it's the pain of my body.
That's a quaint observation. The pain is yours because it forms part of your experience. It is one of the many things that shape and define you. If you were to cut off your arm and suffer phantom pain, or imagine pain in a dream, or suffer pain as the result of direct brain stimulation, it would still be your pain even though your body was taken out of the equation.

What is your body, anyway? Is it a doll-shape made of warm meat that senses temperature, pressure, and injury? If your personal identity relies on being able to feel pain with your own body, then replacing any part of you with cybernetics would be a fraction of murder. What if you were given an artificial leg; would you be 20% dead? Would you lose your old identity when you lost your old meaty perception?

Quote:
A disembodied intelligence might not have consciousness at all; if it did have consciousness, it might be consciousness without qualia. I'm not certain that a digital ghost would be conscious.
I assume that a ghost running in emulation, with such parameters as to render the ghost awake, would be capable of hearing itself think even if it were receiving no sensory input. Can you experience qualia in such a state? I don't know. If you experience doubt, or curiosity, or have an epiphany, are those qualia?

Quote:
I can know that other people are conscious; I can even perceive that other people are conscious. What I can't do is share their viewpoint.
Well, with the right hardware, you could. For instance, you could replace your vision with the vision of a person who's broadcasting their retinal data. Your viewpoint, visually at least, would be theirs. What then?

Quote:
Well, to start with, dogs don't have language, or anything that substitutes for language.
This is only true if you define "language" as requiring a sophisticated, standardized lexicon, conversational speech, oral tradition*, or some other feature that dog communication doesn't generally exhibit.

(*Or whatever form of communication replaces the role of speech.)

It's also rather irrelevant to the larger question of self-concept. We don't know if dogs have self-concept, and we never really will until we're able to read dog minds or one of them starts talking to us. What we do know is that they exhibit behavior appropriate to entities that do have self-concept.

Quote:
There's also experimental research that seems to tap the idea of a self-concept. If you anesthetize a chimpanzee and paint his earlobes with red varnish, and after he wakes up you allow him to see a mirror, he'll look surprised and feel his ears; a baboon won't. That looks to me like a very rudimentary manifestation of the idea of "it's me."
Perhaps, or maybe it just demonstrates a difference in intelligence, in that one animal is able to recognize that an external image is a reflection of itself and one isn't, or a different value system. Maybe the baboon doesn't care if you paint his ears. We don't know.
Old 12-14-2009, 08:21 PM   #105
whswhs
 
Join Date: Jun 2005
Default Re: Is Transhuman Space a "silly" genre?

Quote:
Originally Posted by Xplo View Post
This is only true if you define "language" as requiring a sophisticated, standardized lexicon, conversational speech, oral tradition*, or some other feature that dog communication doesn't generally exhibit.
There is a well understood definition of language in the field of linguistics, which quite clearly does not apply to anything that dogs are capable of.

Bill Stoddard
Old 12-14-2009, 09:17 PM   #106
combatmedic
Banned
 
Join Date: Oct 2006
Location: a crooked, creaky manse built on a blasted heath
Default Re: Is Transhuman Space a "silly" genre?

Quote:
Originally Posted by Ze'Manel Cunha View Post
I would counter that Technoworship and the mystical belief that you can create and upload a copy of "you" which then lives on after your body dies is not any different from the mystical belief in a "soul" which lives on after your body dies.

Uploaded sentient ghosts and souls are all equivalent mystical wishful thinking, though with the upload you also become legion instead of merely having an immortal soul.
As I see it:

Transhumanism = atheism + fear of death + a desire to replace a 'dead' God with 'Trans-humanity'

There is serious speculation behind the technological assumptions and predictions of transhumanism, but the ideological thrust of the 'movement' is definitely mystical and religious in nature.

YMMV, of course.

Last edited by combatmedic; 12-14-2009 at 11:45 PM. Reason: tone, clarity
Old 12-15-2009, 12:54 AM   #107
whswhs
 
Join Date: Jun 2005
Default Re: Is Transhuman Space a "silly" genre?

Quote:
Originally Posted by combatmedic View Post
There is serious speculation behind the technological assumptions and predictions of transhumanism, but the ideological thrust of the 'movement' is definitely mystical and religious in nature.
I would not quite call it "mystical." But it's true that science fiction came into being more or less in the first period in post-classical Western history when it was possible to deny religion and not be harshly punished, and when a nonsupernaturalistic worldview was gaining strength. So instead of imagining yourself as an eternal soul that would outlive the natural world, you had the option of seeing yourself as a brief spark against endless night. See the dying earth in The Time Machine, the dying universe in The Star Maker, or Lovecraft's nightmarish visions.

Science fiction then gives us imaginative visions of faster-than-light travel (giving us access to the entire physical universe), time travel (giving us access to the past and future), paratime travel (giving us access to what might have been), longevity (giving us access to the future by a different means), psi powers (expanding our minds to gain all those things), and so on. That is, it's a series of imaginative images of humanity being as big as the cosmos and perhaps more durable. Transhumanism is just taking the sfnal mythmaking and bringing it back into the real world as an ideological program.

On the other hand, I think of Robert Anton Wilson: when he was asked if he wanted to immanentize the eschaton, his answer was, "Yes, by Wednesday if possible." I really think that's a sensible answer.

But then, there's C. S. Lewis's counterpoint:

Far too long have sages vainly
Glossed great Nature's simple text.
He who runs may read it plainly:
"Goodness = what comes next."
By evolving, Life is solving
All the problems we perplexed.

On, then! Value means survival-
Value. If our progeny
Spreads and spawns and licks each rival,
That will prove its deity
(Far from pleasant, by our present
Standards, though it well may be).

Bill Stoddard
Old 12-15-2009, 12:58 AM   #108
Flyndaran
Untagged
 
Join Date: Oct 2004
Location: Forest Grove, Beaverton, Oregon
Default Re: Is Transhuman Space a "silly" genre?

Quote:
Originally Posted by whswhs View Post
There is a well understood definition of language in the field of linguistics, which quite clearly does not apply to anything that dogs are capable of.

Bill Stoddard
Yes, but who says that language is absolutely necessary for a sense of self?
I had a dog that created her own hard cider by knocking down and burying apples. She would wait until they fermented and then gorge herself until quite drunk. She would recognize herself in the mirror as well. She knew that just because someone had always been allowed inside when the owners were present, that didn't mean they were part of the pack and could enter at any other time.

We must judge based on observed behavior, and if no means at our disposal can tell the difference between an animal that has a "human" trait and one that does not, then I think we should assume it does.
Old 12-15-2009, 01:18 AM   #109
whswhs
 
Join Date: Jun 2005
Default Re: Is Transhuman Space a "silly" genre?

Quote:
Originally Posted by Xplo View Post
I don't think that anyone's identity resides solely in his or her memories. I think personality also plays a factor. It is not enough that your ghost thinks it is Bill in order to assume Bill's identity; it must also think like Bill. For example, someone else talked about the possibility that their ghost could decide to "live it up" and waste all their money; to that I say, does that sound like something you'd do, and if so, why haven't you done it already?
Neither of those is sufficient. If my ghost has all my memories, AND has my personality, that still doesn't make it me; it makes it someone else who thinks like me and remembers being me. Memory is no criterion of identity in a world where memories are fungible; and similarity of personality isn't either . . . my clone might have emotional patterns and cognitive style virtually identical to mine, but that wouldn't make him me.

Continuity of identity, for me, rests partly in continuity of conscious experience, but also partly in continuity of physical existence: When I go to sleep, I wake up with the same familiar body. My consciousness is an activity of my body. If a different body engages in a similar activity, that's ITS consciousness and not mine.

There is a possible handoff here. Suppose that you upload me, not by putting my brain into nanostasis, but by shaving off the top layer of my cortex, emulating it digitally, integrating the simulated neural input and output into the real input and output of the original brain, leaving the brain+simulation in control of the body. And suppose that you keep doing that, layer by layer, iteratively, until the digital simulation replaces the brain completely as controller of the body. And suppose then you acclimatize me to occupying a cybershell or bioshell or even a virtual environment instead of my physical body. If you maintained continuity of consciousness all the way through, I would say that my consciousness had been handed off to a new substrate.

An interesting converse question is, given that memory is fungible to digital entities, under what circumstances can we say that digital entity A at time 1 and digital entity B at time 2 are the same entity? It can't be a question of B remembering being A! What would you require to say that B was the same individual as A?

Bill Stoddard
Old 12-15-2009, 01:26 AM   #110
whswhs
 
Join Date: Jun 2005
Default Re: Is Transhuman Space a "silly" genre?

Quote:
Originally Posted by Flyndaran View Post
Yes, but who says that language is absolutely necessary for a sense of self?
I had a dog that created her own hard cider by knocking down and burying apples. She would wait until they fermented and then gorge herself until quite drunk. She would recognize herself in the mirror as well. She knew that just because someone had always been allowed inside when the owners were present, that didn't mean they were part of the pack and could enter at any other time.

We must judge based on observed behavior, and if no means at our disposal can tell the difference between an animal that has a "human" trait and one that does not, then I think we should assume it does.
That way lies Clever Hans.

I didn't say "sense of self"; I said "self-concept." Could you say to your dog, "Tell me how you figured out that trick with the apples?" and have her tell you the story? Could you ask her, "Do you think it's desirable to intoxicate yourself on fermented apples?" and have her discuss the tradeoffs between immediate pleasure and long-term displeasure, or the long-term consequences of doing stupid things when drunk? Could you discuss with her whether she was drunk or not, or ask her how often she got drunk?

Bill Stoddard
Tags
verhängnisthread
