#1
|
GURPS FAQ Keeper
Join Date: Mar 2006
Location: Kyïv, Ukraine
There's definitely a delusion involved. The question was whether the delusion was accompanied by the removal of the safety systems in the law-abiding programming that would be required to actually commit a mass murder (well, to attempt one).
#2
Computer Scientist
Join Date: Aug 2004
Location: Dallas, Texas
If the Trolley Dilemma is really applicable, the thing to do would be to research plane crashes in France where the pilot was able to divert the plane from killing more people toward killing fewer, and see whether his estate or employer was held liable for the deaths on the ground but not for those on the plane.
#3
Join Date: Feb 2005
Location: Berkeley, CA
Quote:
#4
Computer Scientist
Join Date: Aug 2004
Location: Dallas, Texas
If the person is standing in the train yard at the switch controls, it's presumably because they *do* have prior responsibility for the train. But it's not particularly important that they be unconnected with the train, just that they not be responsible for the situation to begin with, and finding the people already bound and laid in the way takes care of that. Similarly, if the pilot is not held responsible for the deaths of the people on board, that indicates he's not responsible for the crashing condition, only for the choice of where to crash.
#5
Banned
Join Date: Jan 2013
Hm, so that programming couldn't be suppressed or corrupted even temporarily (like temporary insanity for a human)? Maybe doing so would leave the AI fundamentally broken mentally afterward (an insane or guilt-wracked state).
Tags: honesty, murder, trolley dilemma