Steve Jackson Games Forums > Roleplaying > Transhuman Space
Old 08-13-2011, 03:31 PM   #601
teviet
 
Join Date: May 2005
Re: Ghosts and Mind Copies - The Identity Question

Out of respect for Vicky's request as OP in the other thread, I'll post this reply here, though I'm not sure if Flyndaran has followed us...

Quote:
Originally Posted by Flyndaran View Post
I don't need to know what is possible. You have to prove to me that a computer can feel in the same manner as complex living creatures. The thing is that I have no idea how to make something non-organic have real feelings. But just making it pretend to have them is nowhere near enough.
I'm not sure what the burden of proof is here. What would Flyndaran accept as "proof" that a being was "really" feeling pain or emotion?

For example: an octopus has a very different neural architecture from a mammal's; its brain evolved completely independently. It is very intelligent, and has an obvious aversion to harmful stimuli. But by Flyndaran's logic, since its neural structures are not the same as a human's or a cat's, it is only behaving as if it feels pain; it can't actually feel pain.

As far as I'm concerned, though, if a being behaves in all ways as if it were feeling pain, or some other emotion, then it is. It either has the same cognitive structures, or it has some other cognitive structure that serves the same purpose. Either way I would treat the emotion as "real". Otherwise you are headed towards a metaphysical dead end: "Are you really upset or do you just think you're upset?"

TeV

Old 08-13-2011, 05:01 PM   #602
whswhs
 
Join Date: Jun 2005
Location: Lawrence, KS
Re: Ghosts and Mind Copies - The Identity Question

Quote:
Originally Posted by teviet View Post
As far as I'm concerned, though, if a being behaves in all ways as if it were feeling pain, or some other emotion, then it is. It either has the same cognitive structures, or it has some other cognitive structure that serves the same purpose. Either way I would treat the emotion as "real". Otherwise you are headed towards a metaphysical dead end: "Are you really upset or do you just think you're upset?"
Well, what counts as "behaving in all ways"?

When a human being feels pain, it's often a result of tissue destruction or damage. It produces bodily movements, often rapid and often involuntary, to get away from the damaging agent. Moreover, it produces a large shift of attention away from whatever the human being was paying attention to previously. It results in the flooding of the bloodstream with adrenalin and the hyperactivation of the sympathetic nervous system, with the end result of mobilizing the body for intense physical effort. In a learning situation, behavior that's followed by pain is typically extinguished.

I think that I would take anything that fit all of those statements as being an example of pain. Perhaps some of them could be removed and you could still have at least borderline examples of pain. What do you think?

Bill Stoddard
Old 08-13-2011, 11:14 PM   #603
teviet
 
Join Date: May 2005
Re: Ghosts and Mind Copies - The Identity Question

Quote:
Originally Posted by whswhs View Post
Well, what counts as "behaving in all ways"?

When a human being feels pain, it's often a result of tissue destruction or damage. It produces bodily movements, often rapid and often involuntary, to get away from the damaging agent. Moreover, it produces a large shift of attention away from whatever the human being was paying attention to previously. It results in the flooding of the bloodstream with adrenalin and the hyperactivation of the sympathetic nervous system, with the end result of mobilizing the body for intense physical effort. In a learning situation, behavior that's followed by pain is typically extinguished.

I think that I would take anything that fit all of those statements as being an example of pain. Perhaps some of them could be removed and you could still have at least borderline examples of pain. What do you think?
I would place more emphasis on things like observable behaviour and higher-level cognitive functions (to the extent that these can be measured separately from behaviour), rather than low-level physiological responses.

Back to the cephalopod example: I don't actually know how much their biochemical and metabolic response to "pain" resembles that of vertebrates. I would expect it to be quite different (their blood has a very different chemistry from ours, and epinephrine apparently decreases their heart rate). But I would consider them to feel pain if they (a) withdraw from the stimulus, (b) take action to succor or favour the afflicted member (even in the absence of actual damage), and (c) adopt behaviour that avoids future exposure.

Naturally I've included (b) and (c) to discriminate between a reflex and "real" pain. But I've known humans who show an apparent deficiency of (c), which leads me to wonder whether their subjective experience of pain is actually similar to my own.
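
To make the test operational, here's a minimal sketch in Python. The Observation class and its field names are invented purely for illustration, not drawn from any real ethological protocol.

Code:
# A toy predicate for the behavioural test above. The three fields map
# onto criteria (a), (b) and (c); all names are invented for this sketch.
from dataclasses import dataclass

@dataclass
class Observation:
    withdrew_from_stimulus: bool      # (a) immediate withdrawal
    protected_afflicted_member: bool  # (b) guarding, even absent damage
    avoided_future_exposure: bool     # (c) learned avoidance

def behaves_as_if_in_pain(obs: Observation) -> bool:
    """True only if all three behavioural criteria are met."""
    return (obs.withdrew_from_stimulus
            and obs.protected_afflicted_member
            and obs.avoided_future_exposure)

# An octopus that withdraws, guards the injured arm, and avoids the
# site later passes; a bare reflex arc fails (b) and (c).
print(behaves_as_if_in_pain(Observation(True, True, True)))   # True
print(behaves_as_if_in_pain(Observation(True, False, False))) # False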

TeV
Old 08-14-2011, 12:41 AM   #604
whswhs
 
Join Date: Jun 2005
Location: Lawrence, KS
Re: Ghosts and Mind Copies - The Identity Question

Quote:
Originally Posted by teviet View Post
I would place more emphasis on things like observable behaviour and higher-level cognitive functions (to the extent that these can be measured separately from behaviour), rather than low-level physiological responses.

Back to the cephalopod example: I don't actually know how much their biochemical and metabolic response to "pain" resembles that of vertebrates. I would expect it to be quite different (their blood has a very different chemistry from ours, and epinephrine apparently decreases their heart rate). But I would consider them to feel pain if they (a) withdraw from the stimulus, (b) take action to succor or favour the afflicted member (even in the absence of actual damage), and (c) adopt behaviour that avoids future exposure.

Naturally I've included (b) and (c) to discriminate between a reflex and "real" pain. But I've known humans who show an apparent deficiency of (c), which leads me to wonder whether their subjective experience of pain is actually similar to my own.
I think you are slightly missing the point of the adrenalin. Let me compare: When a human being experiences tissue destruction, it causes the firing of certain specific neurons, whose signals the brain interprets as "pain." Cephalopods presumably don't have those specific neurons. But they have some sort of receptors for tissue damage, which are functionally comparable. And also, to escape from tissue damage, they're going to need to mobilize their physiological functions for rapid and energetic action (for example, using their siphon for rapid movement); the fact that the specific compound adrenalin isn't involved in that doesn't change the basic functional point.

Now, I think that for a state to count as "pain" it needs to have a bodily manifestation, not just in awareness of physical damage to the body, but in bodily functioning changing as a whole, in the whole state of the body being transformed. Otherwise it's a purely mental and cognitive decision-making process. And I think it's essential to pain that it represents the direct demands of the body overriding cognitive processes.

Bill Stoddard
Old 08-14-2011, 02:37 AM   #605
teviet
 
Join Date: May 2005
Re: Ghosts and Mind Copies - The Identity Question

Quote:
Originally Posted by whswhs View Post
I think you are slightly missing the point of the adrenalin. Let me compare: When a human being experiences tissue destruction, it causes the firing of certain specific neurons, whose signals the brain interprets as "pain." Cephalopods presumably don't have those specific neurons. But they have some sort of receptors for tissue damage, which are functionally comparable. And also, to escape from tissue damage, they're going to need to mobilize their physiological functions for rapid and energetic action (for example, using their siphon for rapid movement); the fact that the specific compound adrenalin isn't involved in that doesn't change the basic functional point.
Actually my point was (I think) the same as yours: that we need to compare functionally equivalent responses, rather than requiring precise physiological correspondence, in order to identify whether an organism is experiencing pain.

Quote:
Originally Posted by whswhs View Post
Now, I think that for a state to count as "pain" it needs to have a bodily manifestation, not just in awareness of physical damage to the body, but in bodily functioning changing as a whole, in the whole state of the body being transformed. Otherwise it's a purely mental and cognitive decision-making process. And I think it's essential to pain that it represents the direct demands of the body overriding cognitive processes.
This is an interesting point. It's certainly an important aspect of our experience of pain, but it might be a challenge to map onto other sentient beings.

In particular, it assumes that the sentient being will have its cognitive functions partitioned into levels like our own: with a "top level" awareness controlling voluntary responses, and underlying autonomic neural processes governing involuntary responses. If a sentient mind was not partitioned in this way, there might be no distinction between voluntary and involuntary responses. It could require some extensive mapping of cognitive functions to determine if this were the case.

This relates to Flyndaran's other concern: whether an AI could actually feel emotions or just act as if it did. Operationally, it seems to me that the distinction between a "real" and a "fake" emotion is mostly that real emotions are persistent and involuntary, whereas an actor can (usually) turn a "faked" emotion on and off at will. Again, if a sentient being did not have separate voluntary and involuntary cognitive processes, the distinction might not be meaningful.
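
If it helps, here's a minimal sketch of that operational distinction; it's a toy model with invented names, nothing more. A "real" emotion starts without a decision and decays on its own schedule, while a "faked" one is a flag the agent can toggle at will.

Code:
# Toy model of real vs. performed emotion. Nothing here models a real
# mind; the point is only the control structure.
class EmotionalAgent:
    def __init__(self) -> None:
        self.fear = 0.0          # "real": set by events, not by choice
        self.acted_fear = False  # "faked": fully under voluntary control

    def perceive_threat(self) -> None:
        self.fear = 1.0          # fires involuntarily

    def tick(self) -> None:
        self.fear *= 0.9         # fades with inertia, not on command

    def act_afraid(self, on: bool) -> None:
        self.acted_fear = on     # an actor can toggle this instantly

agent = EmotionalAgent()
agent.perceive_threat()
agent.act_afraid(True)
agent.act_afraid(False)  # the performance stops at will...
agent.tick()
print(agent.fear)        # ...but the real fear is still 0.9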

TeV
Old 08-14-2011, 04:02 AM   #606
vicky_molokh
GURPS FAQ Keeper
 
 
Join Date: Mar 2006
Location: Kyïv, Ukraine
Re: Ghosts and Mind Copies - The Identity Question

Quote:
Originally Posted by teviet View Post
This is an interesting point. It's certainly an important aspect of our experience of pain, but it might be a challenge to map onto other sentient beings.

In particular, it assumes that the sentient being will have its cognitive functions partitioned into levels like our own: with a "top level" awareness controlling voluntary responses, and underlying autonomic neural processes governing involuntary responses. If a sentient mind was not partitioned in this way, there might be no distinction between voluntary and involuntary responses. It could require some extensive mapping of cognitive functions to determine if this were the case.

This relates to Flyndaran's other concern: whether an AI could actually feel emotions or just act as if it did. Operationally, it seems to me that the distinction between a "real" and a "fake" emotion is mostly that real emotions are persistent and involuntary, whereas an actor can (usually) turn a "faked" emotion on and off at will. Again, if a sentient being did not have separate voluntary and involuntary cognitive processes, the distinction might not be meaningful.

TeV
Do Second Stage Lensmen have emotions/feel pain? IIRC they have no such thing as an unconscious or involuntary response, e.g. even if drugged, their mind spookily acts no different than if sober. Or perhaps I'm misremembering.
__________________
Vicky 'Molokh', GURPS FAQ and uFAQ Keeper
Old 08-14-2011, 10:10 AM   #607
whswhs
 
Join Date: Jun 2005
Location: Lawrence, KS
Re: Ghosts and Mind Copies - The Identity Question

Quote:
Originally Posted by teviet View Post
In particular, it assumes that the sentient being will have its cognitive functions partitioned into levels like our own: with a "top level" awareness controlling voluntary responses, and underlying autonomic neural processes governing involuntary responses. If a sentient mind was not partitioned in this way, there might be no distinction between voluntary and involuntary responses. It could require some extensive mapping of cognitive functions to determine if this were the case.

This relates to Flyndaran's other concern: whether an AI could actually feel emotions or just act as if it did. Operationally, it seems to me that the distinction between a "real" and a "fake" emotion is mostly that real emotions are persistent and involuntary, whereas an actor can (usually) turn a "faked" emotion on and off at will. Again, if a sentient being did not have separate voluntary and involuntary cognitive processes, the distinction might not be meaningful.
*I think that there are two aspects to this separateness of emotional processes. One is that they start up spontaneously, without waiting for a conscious decision. The other is that they have a kind of "inertia," so that once started up, they continue; living as an emotional being is kind of like steering a heavy truck down a steep road.

*Functionally, I think, it's important to have some preset responses that will come up automatically, not waiting for cognitive processing. Imagine a government where the police were not allowed to stop a fight without reporting to the mayor for orders—or the president!
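
*To put that functional point in program terms, here's a minimal sketch (everything in it is invented for illustration): preset responses fire before the deliberative step ever sees the stimulus.

Code:
# Toy two-level agent: a reflex table consulted before deliberation.
REFLEXES = {
    "tissue_damage": "withdraw_limb",  # fires with no cognitive processing
    "loud_noise": "startle",
}

def deliberate(stimulus: str) -> str:
    # The slow path: plan a considered response.
    return f"considered_response_to_{stimulus}"

def respond(stimulus: str) -> str:
    # Fast path first: like police who may act without asking the mayor.
    if stimulus in REFLEXES:
        return REFLEXES[stimulus]
    return deliberate(stimulus)

print(respond("tissue_damage"))  # withdraw_limb, no deliberation involved
print(respond("strange_smell"))  # considered_response_to_strange_smell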

*With pain, though, I think there's a further level that is not even "cognitive" at all: The level of bodily response. Not everything the nervous system does is "cognition." The sheer physicality of pain is part of the experience of pain.

*I'd also note that, even viewed as sensory experience, pain cannot be under the control of the higher cognitive processes—precisely because it IS sensory experience. Cognition only operates meaningfully in relation to the real world. To do so, it needs to have information about the real world. If sensory processes operated in a manner dictated by higher cognitive processes, they would not be a source of independent data; it's precisely because they function mostly automatically, reacting to physical stimuli, that they are a source of such data. Data on the integrity of your body are just as much data about the physical world as data about light or sound or taste. (Consider the etymology of "data.")

Bill Stoddard
Old 08-14-2011, 04:12 PM   #608
teviet
 
Join Date: May 2005
Re: Ghosts and Mind Copies - The Identity Question

I agree with most of those points, pretty much. I would address one of them, though:

Quote:
Originally Posted by whswhs View Post
...
*With pain, though, I think there's a further level that is not even "cognitive" at all: The level of bodily response. Not everything the nervous system does is "cognition." The sheer physicality of pain is part of the experience of pain.
...
I imagine that psychologists and neurologists could define many different layers of consciousness and other types of neural processing. But I tend to lump them into fairly broad categories: "voluntary" and "involuntary", for lack of better terminology. Finer gradations run the risk of being too human-specific.

That is, there is some distinct set of neural processes that maintain an internal model of the world, correlate memories and perceptions, and initiate planned or "voluntary" actions -- the "rational" or "conscious" mind. There are other processes that are not controlled by this unit, that either operate independently or provide input to it. Obviously "pain" is a particularly high-priority input that has a unique capacity both to trigger involuntary reactions and to override other factors in deciding on voluntary actions.

The rapid and involuntary responses to pain are clearly useful from a survival point of view, since going through the higher-level associative processes takes time. In an AI of sufficient speed, this might not be necessary, and there might be no distinction between reflexive and voluntary responses. The persistence or "inertia" of pain, as well as its tendency to dominate other considerations, might also be considered an unnecessary relic by transhumanists. If a transhuman had the capability (by genetic manipulation, neurosurgery, or non-biological nature) to judge rationally the significance of the pain perception along with other sensory data, or to preempt involuntary responses, then I assume their experience of pain would be subjectively different from my own.
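
As a minimal sketch of what that preemption might look like (purely illustrative, assuming nothing about real neurosurgery or AI design):

Code:
# The same two-level idea, but with a switch that reroutes damage
# signals through rational appraisal instead of involuntary override.
class TranshumanAgent:
    def __init__(self) -> None:
        self.preempt_reflexes = False  # off = baseline-human behaviour

    def on_damage_signal(self, severity: float) -> str:
        if not self.preempt_reflexes:
            # Baseline: the involuntary response dominates everything else.
            return "involuntary_withdrawal"
        # Preempted: damage is weighed like any other sensory datum.
        return "planned_retreat" if severity >= 0.8 else "continue_task"

agent = TranshumanAgent()
print(agent.on_damage_signal(0.5))  # involuntary_withdrawal
agent.preempt_reflexes = True       # the hypothetical "mod" switched on
print(agent.on_damage_signal(0.5))  # continue_task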

Quote:
Originally Posted by vicky_molokh View Post
Do Second Stage Lensmen have emotions/feel pain? IIRC they have no such thing as an unconscious or involuntary response, e.g. even if drugged, their mind spookily acts no different than if sober.
Or someone like Nick Stavrianos in Quarantine, who still has unconscious responses by default, but can choose to invoke a mental state that preempts them, in anticipation of situations where he knows he will need to act rationally in spite of painful stimuli. While in this mental state he does not feel emotions, or "pain" as we would define it; bodily damage is presented to his awareness on a level comparable to other sensory input.

TeV
Old 08-14-2011, 06:28 PM   #609
whswhs
 
Join Date: Jun 2005
Location: Lawrence, KS
Re: Ghosts and Mind Copies - The Identity Question

Quote:
Originally Posted by teviet View Post
The rapid and involuntary responses to pain are clearly useful from a survival point of view, since going through the higher-level associative processes takes time. In an AI of sufficient speed, this might not be necessary, and there might be no distinction between reflexive and voluntary responses. The persistence or "inertia" of pain, as well as its tendency to dominate other considerations, might also be considered an unnecessary relic by transhumanists. If a transhuman had the capability (by genetic manipulation, neurosurgery, or non-biological nature) to judge rationally the significance of the pain perception along with other sensory data, or to preempt involuntary responses, then I assume their experience of pain would be subjectively different from my own.
I think that if it can be processed cognitively, and prioritized rationally, it doesn't count as pain: It's not sufficiently like our experience of pain to merit a common name. I also think that if there is not a physical body with its own urgencies—if, for example, you expect simply to be able to upload to a different CPU if the shell you're in is losing functionality—then the experience is not meaningfully like pain. "Negative value" is a broader category than "pain."

Bill Stoddard
Old 08-14-2011, 10:17 PM   #610
ErhnamDJ
 
Join Date: Aug 2009
Location: OK
Re: Ghosts and Mind Copies - The Identity Question

So these robots are like someone with CIP (congenital insensitivity to pain)?