Old 02-06-2018, 02:12 PM   #41
Flyndaran
Untagged
 
Join Date: Oct 2004
Location: Forest Grove, Beaverton, Oregon
Default Re: Medium Dependent Intelligence, Uncopyable Intelligence, and Transhumanism

Such ghost programs are one way to get "humans in funny suits" machine intelligences somewhat realistically: just copy formerly living people, even if you have no way to fully understand how we work or think and can't build ground-up A.I.s that are remotely as sapient.
__________________
Beware, poor communication skills. No offense intended. If offended, it just means that I failed my writing skill check.
Old 02-06-2018, 03:24 PM   #42
sir_pudding
Wielder of Smart Pants
 
 
Join Date: Aug 2004
Location: Ventura CA
Default Re: Medium Dependent Intelligence, Uncopyable Intelligence, and Transhumanism

Quote:
Originally Posted by AlexanderHowl View Post
Why? A picture is not a painting, a hologram is not a statue, a 3D movie is not reality.
A representation isn't a full resolution simulation either.

Quote:
On a more serious note, the amount of computing power required to simulate the activity of a human brain would be literally incalculable, as the program would have to 'know' which of the 100 billion neurons activate for each thought, feeling, action, etc. Since we are talking about 100 billion elements, we are talking about a 100 billion factorial (100,000 factorial is nearly 3 * 10^456,573 potential combinations and 100 billion factorial is close enough to infinity that it might as well be infinity). In addition, the connections matter, so you are probably talking about 10 trillion connections within the average human brain, so you need a minimum 10 TB of data just to simulate the biology of a single emotion or thought for a specific individual.
If it's impossible to simulate a brain, it probably isn't possible to make strong AI at all.
Old 02-06-2018, 03:37 PM   #43
Anthony
 
Join Date: Feb 2005
Location: Berkeley, CA
Default Re: Medium Dependent Intelligence, Uncopyable Intelligence, and Transhumanism

Quote:
Originally Posted by sir_pudding View Post
If it's impossible to simulate a brain, it probably isn't possible to make strong AI at all.
That doesn't necessarily follow; whatever makes the simulation impractical might not itself be required for intelligence.
__________________
My GURPS site and Blog.
Old 02-06-2018, 03:41 PM   #44
sir_pudding
Wielder of Smart Pants
 
 
Join Date: Aug 2004
Location: Ventura CA
Default Re: Medium Dependent Intelligence, Uncopyable Intelligence, and Transhumanism

Quote:
Originally Posted by Anthony View Post
That doesn't necessarily follow; whatever makes the simulation impractical might not itself be required for intelligence.
If it is impossible because human intelligence requires a number of elements that exceeds practical computational limits, as AlexanderHowl is suggesting, then human-equivalent AI might be expected to require them as well.
Old 02-06-2018, 03:48 PM   #45
whswhs
 
Join Date: Jun 2005
Location: Lawrence, KS
Default Re: Medium Dependent Intelligence, Uncopyable Intelligence, and Transhumanism

Quote:
Originally Posted by AlexanderHowl View Post
Why? A picture is not a painting, a hologram is not a statue, a 3D movie is not reality. Simulation is not reality and, beyond an approximation, a simulation will never be reality. It is the reason why physicists were capable of proving that we do not live in a virtual universe (https://cosmosmagazine.com/physics/p...ter-simulation).
If our universe were not the real universe, but a simulated one, then we would not know what the real universe was like, and could not judge whether our universe resembled it or not. Physics deals in factually based theories, and there are no observed facts about the putative "real world."

That's not to say there aren't grounds for disregarding the idea. But they're philosophical rather than scientific.

Quote:
On a more serious note, the amount of computing power required to simulate the activity of a human brain would be literally incalculable, as the program would have to 'know' which of the 100 billion neurons activate for each thought, feeling, action, etc.
As Anthony points out, that's not at all required. After all, you don't consciously know which neurons to activate if you want to think of the concept of a simulation, nor does your brain have a central register of neurons and their assigned meanings where you could look such a thing up.

What determines "meaning" is in large part muscular output: vocalizations, eye movements, facial expressions, body postures and movements, and so on. But that drops out of the hypothetical simulation automatically, I think.

Quote:
Since we are talking about 100 billion elements, we are talking about a 100 billion factorial (100,000 factorial is nearly 3 * 10^456,573 potential combinations and 100 billion factorial is close enough to infinity that it might as well be infinity). In addition, the connections matter, so you are probably talking about 10 trillion connections within the average human brain, so you need a minimum 10 TB of data just to simulate the biology of a single emotion or thought for a specific individual.
I don't see why you're using factorials. That would be logical if you were trying to determine every possible firing sequence of those neurons. But neurons don't fire one at a time, and "which neuron comes after which" doesn't seem to code for much of anything. It would be slightly more logical to do 2^100,000,000,000 for every possible on/off state of all the neurons. Except that neuron coding isn't on/off anyway; it's more "how frequently is this neuron firing?" And you can probably rule out a large fraction of neural states as things that will never occur in a functioning human brain.
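
Just to put rough numbers on that, here's a quick back-of-the-envelope Python sketch; the neuron counts are simply the figures quoted above, and the log-gamma trick is only there so the numbers stay representable:

[code]
import math

N_SMALL = 100_000                # the 100,000-neuron example from the quoted post
N_NEURONS = 100_000_000_000      # ~100 billion neurons, as quoted

def log10_factorial(n):
    """log10(n!) computed via the log-gamma function, so huge n stays finite."""
    return math.lgamma(n + 1) / math.log(10)

# Reproduces the quoted figure: 100,000! is about 3 x 10^456,573
print(f"log10(100,000!) ~ {log10_factorial(N_SMALL):.1f}")

# Counting orderings (factorial) vs. counting simple on/off patterns (2^N)
print(f"log10(100e9!)   ~ {log10_factorial(N_NEURONS):.3e}")
print(f"log10(2^100e9)  ~ {N_NEURONS * math.log10(2):.3e}")
[/code]

Both numbers are absurdly large, but they're counting different things (orderings versus states), and neither is what a simulation would actually have to track.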

As for 10 TB of data, you can buy a 1TB hard drive for under $100 currently. That's not a lot of data any more. And transhumanism often tacitly assumes that computer speeds and capacities will continue to improve for some decades to come.
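
For what it's worth, here is the same arithmetic as a tiny Python sketch; the byte-per-connection figure is just whatever makes the quoted 10 TB come out, not a measured value, and the drive price is the rough figure above:

[code]
CONNECTIONS = 10e12          # ~10 trillion connections, as quoted above
BYTES_PER_CONNECTION = 1     # assumption: the quoted "10 TB minimum" implies ~1 byte each
USD_PER_TB = 100             # ballpark consumer hard drive price mentioned above

terabytes = CONNECTIONS * BYTES_PER_CONNECTION / 1e12
print(f"Connection map: ~{terabytes:.0f} TB, roughly ${terabytes * USD_PER_TB:,.0f} in drives")
[/code]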
__________________
Bill Stoddard

I don't think we're in Oz any more.
Old 02-06-2018, 03:53 PM   #46
sir_pudding
Wielder of Smart Pants
 
 
Join Date: Aug 2004
Location: Ventura CA
Default Re: Medium Dependent Intelligence, Uncopyable Intelligence, and Transhumanism

There was a fairly recent estimate that put the total storage capacity of human memory at around one petabyte.
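
If you want to see how a figure in that range can fall out of simple assumptions, here's a hypothetical Python sketch; the synapse count and bits-per-synapse values are illustrative assumptions on my part, not that study's actual method:

[code]
SYNAPSES = 1e15            # assumed order-of-magnitude synapse count
BITS_PER_SYNAPSE = 4.7     # assumed resolvable information per synapse

petabytes = SYNAPSES * BITS_PER_SYNAPSE / 8 / 1e15
print(f"~{petabytes:.2f} PB of raw synaptic storage")   # lands near the petabyte range
[/code]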
Old 02-06-2018, 04:31 PM   #47
vicky_molokh
GURPS FAQ Keeper
 
 
Join Date: Mar 2006
Location: Kyïv, Ukraine
Default Re: Medium Dependent Intelligence, Uncopyable Intelligence, and Transhumanism

Quote:
Originally Posted by Flyndaran View Post
It is an artificial intelligence by definition though. It's a digital copy of a fundamentally different biological object, an emulation.
I'm not saying anything about their internal thoughts, feelings or behavior. But they are A.I.s. I'm not getting into the thread of doom at all by saying this.
It's not an AI because it's an emulation. If it weren't an emulation of a biohuman (or any other) emergent mind, it'd be an AI. Ontologically it has more in common with EI that spawns spontaneously than with an AI that is crafted by TL10 coders.
__________________
Vicky 'Molokh', GURPS FAQ and uFAQ Keeper
Old 02-06-2018, 04:33 PM   #48
AlexanderHowl
 
Join Date: Feb 2016
Default Re: Medium Dependent Intelligence, Uncopyable Intelligence, and Transhumanism

Yes, but storage capacity does not correlate to simulation requirements. Digital intelligence may be an emergent function that arises from an adaptive digital framework (similar to the way the brain forms during childhood). In the case of an SAI, creating an artificial digital intelligence would mean building a foundational digital framework with a high probability of evolving into an artificial digital intelligence (a Rogue intelligence would instead emerge from an existing digital framework). In the case of a Bioroid, creating an artificial biological intelligence would mean using modified human brain tissue as a foundational biological framework in order to guarantee the evolution of an artificial biological intelligence (just as human intelligence emerges from the human biological framework). Of course, this is supposition, as we are probably no closer to creating an artificial intelligence than we are to establishing mines in the Main Belt.
Old 02-06-2018, 04:34 PM   #49
Anthony
 
Join Date: Feb 2005
Location: Berkeley, CA
Default Re: Medium Dependent Intelligence, Uncopyable Intelligence, and Transhumanism

Quote:
Originally Posted by vicky_molokh View Post
It's not an AI because it's an emulation.
This seems to be mostly a semantic issue. Current usage tends towards calling any machine intelligence an AI, which would include emulations, but you can certainly come up with other dividing lines between 'artificial' and 'natural' intelligence.
__________________
My GURPS site and Blog.
Old 02-06-2018, 04:38 PM   #50
sir_pudding
Wielder of Smart Pants
 
 
Join Date: Aug 2004
Location: Ventura CA
Default Re: Medium Dependent Intelligence, Uncopyable Intelligence, and Transhumanism

Quote:
Originally Posted by AlexanderHowl View Post
In the case of digital intelligence, it may be an emergent function that arises from an adaptive digital framework (similar to the way that the brain forms during childhood).
If such a framework requires a computationally impractical number of discrete elements, then it won't ever be constructed. If it doesn't, then there's no reason to suppose that human intelligence does either.