04-08-2011, 03:35 AM | #11
Join Date: Mar 2006
Location: Iceland*
Re: Three Laws of Robotics
Eh, no.
A -2 point Disadvantage (Sense of Duty to one individual) is not enough to cover being utterly unable to disobey an order from that individual, no matter how unjustified.
__________________
Za uspiekh nashevo beznadiozhnovo diela! ("To the success of our hopeless cause!")
04-08-2011, 10:51 AM | #12
Join Date: Aug 2004
Re: Three Laws of Robotics
04-08-2011, 10:57 AM | #13
Join Date: Mar 2006
Location: Iceland*
Re: Three Laws of Robotics
Quote:
Why wouldn't Obsession alone do it?

Because being Obsessed with something that comes up rarely is not as disadvantageous as being Obsessed with something that shapes every second of your life. That's not to say I necessarily think Obsession is the best model here.

But while it might be possible to squeeze 'has to obey all orders from a wide variety of sources' into any of several Mental Disadvantages, the result will usually be far more severe than the issues the Disadvantage is meant to cover. Feeling an obligation towards someone (i.e. Sense of Duty) still leaves you free to choose the appropriate way to fulfil that obligation without interfering with your own goals. That is much less severe than being forced to follow every order that person gives.
04-08-2011, 11:17 AM | #14
Join Date: Jun 2005
Location: Lawrence, KS
Re: Three Laws of Robotics
I'd like to offer a couple of incidental notes:
1. It's not obvious that the Third Law is not worth points as a disadvantage. Taken by itself, it says that a robot may not take actions that endanger its own continued existence. If you magically annihilated every human being in an Asimovian world, its robots would have to behave in a way that minimized threats to their own survival. This is not identical to the behavior of human beings; as living organisms, human beings are ultimately motivated by threats to their own inclusive fitness, that is, to their reproductive success or that of their kin, and this creates psychological mechanisms that can be captured by other behavioral drivers such as religious or political ideologies. Robots would not have that set of mechanisms; they would be compulsively protective of their own survival. I suggest that the Third Law is representable by something like Cowardice (9) [-15]. The -5 for actual risk of death gives "overridden only on a 4 or less," which gives a chance of the robot's evaluating the risk in some skewed (or insightful!) way. 2. There's a hierarchal relationship among the laws: The Second Law applies only when the First Law is not in force, and the Third Law only when the First and Second Laws are not in force. That needs to be represented. I think that GURPS actually has a mechanism for doing so: the "alternate abilities" system. You have First Law [some number of points], AD: Second Law [some number of points/5], AD: Third Law [some number of points/5]. (I would allow the full point value to the First Law, because the choice among the laws is not free, as it is with normal AAs.) Bill Stoddard |
04-08-2011, 11:26 AM | #15
Join Date: Mar 2006
Location: Iceland*
Re: Three Laws of Robotics
Having Pacifism (Cannot Kill) (Humans Only) and Sense of Duty (Humanity) in combination with Duty (Involuntary) and Vow (Obey Human Commands) does not result in the latter two disadvantages being only 1/5 as limiting to the character.

The minimal benefit to the robot's player, that humans can no longer order it to kill other humans, is more than negated by the fact that it cannot minimise the harm of the Second Law in the most logical manner, i.e. by killing every human it encounters before they can issue any commands.

I cannot see that the existence of the First Law in any way justifies giving back fewer points for the Disadvantages covered by the Second Law. They are still going to affect the character at the usual frequency; there is never going to be a time when they do not apply. Why would you deprive the player of four-fifths of their value?
04-08-2011, 11:54 AM | #16
Join Date: Aug 2004
Re: Three Laws of Robotics
04-08-2011, 11:59 AM | #17
Join Date: May 2008
Location: CA
Re: Three Laws of Robotics
So you think that being forced to kill yourself whenever anyone comes up to you and says 'Please kill yourself' is only worth -30 points? That seems too low to me.
04-08-2011, 02:35 PM | #18
Join Date: Feb 2006
Location: Not in your time zone:D
Re: Three Laws of Robotics
But all I can find is:

Pacifism: Self-Defence Only (Species-Specific, -20%) [-12]
Duty: Always On (Involuntary) [-20]
Major Vow (allow no harm to owners' property) [-10]

The specific species would depend on who built them, and you could replace species recognition with uniforms or badge holders or IFF...

What about the Zeroth Law: "a robot must not harm humanity"?
__________________
"Sanity is a bourgeois meme." Exegeek
PS sorry I'm a Parthian shootist: shiftwork + out of country = not here when you are :/
It's all in the reflexes
04-08-2011, 02:40 PM | #19
Join Date: Jun 2006
Location: On the road again...
Re: Three Laws of Robotics
Quote:
So you think that anyone coming up to you and saying 'Please kill yourself', and then you are forced to kill yourself, is only worth -30 points? That seems too low to me.
Of course, if the human says, "Kill yourself, because with you here I can't kill myself (or another human you're protecting)!" then the First Law comes into effect. The human has just indicated that he will harm another human; the robot cannot, through inaction, allow a human to come to harm; and so the order to terminate the robot's own existence is rendered null and void.

Now, in a case like "I order you, in no uncertain terms, to cease your operations and perform a positronic lobotomy on yourself, because you're a useless waste of space!" (and who among those that watched the Star Wars Prequel Trilogy didn't want to say that to the battle droids? ^_^), the robot would have no choice but to do as ordered.

Part of what makes the decision - as noted in one of the short stories in I, Robot - is the emphasis on the order. A mere "go kill yourself" could be construed as just a "go away". Unless the robot also has No Sense of Humor, the tone of voice when the order is given may determine whether it was a genuine order for positronic suicide or just a request to leave the area. No Sense of Humor makes comments 100% literal: tell a robot with No Sense of Humor to "shake a leg", and it will shake one of its locomotion limbs rather than start moving faster. "Go take a long walk off a short pier" would have a robot with NSoH looking for that short pier to walk off of; one without that disad would just leave the area until called for.

Also, note that Reprogrammable is implied in the Asimovian Three Laws. Give a robot you don't own an order to destroy itself, and it would likely seek confirmation from its owner rather than obey the person giving the order. </ramble>
__________________
"Life ... is an Oreo cookie." - J'onn J'onzz, 1991
"But mom, I don't wanna go back in the dungeon!"
The GURPS Marvel Universe Reboot Project A-G, H-R, and S-Z, and its not-a-wiki-really web adaptation.
Ranoc, a Muskets-and-Magery Renaissance Fantasy Setting
04-08-2011, 05:04 PM | #20
Join Date: Sep 2007
|
Re: Three Laws of Robotics
Basically, you have what the robot must do (Duty) and how it must do it (Pacifism, plus the self-preservation clause).
Duty (Obey the Laws of Robotics; all the time; extremely hazardous; involuntary) [-25]
Pacifism (Total) (Humans only, -20%) [-24]
Selfless (15 or less; Toward Humans, +0%) [-10]
Careful [-1]

Total: -60
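As a quick sanity check on the arithmetic (point values copied from the stat block above; the abbreviated labels are mine, not official trait names):

```python
# Disadvantage values quoted in the post above; labels abbreviated by me.
package = {
    "Duty (Obey the Laws of Robotics, involuntary)": -25,
    "Pacifism (Total) (Humans only, -20%)": -24,
    "Selfless (15 or less; Toward Humans)": -10,
    "Careful": -1,
}

total = sum(package.values())
print(total)  # -60, matching the quoted total
```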
Tags |
disadvantages, robots, the three laws