Quote:
Originally Posted by vicky_molokh
Does letting millions die through your inaction count as unjustifiably high risk, thus making inaction count as murder and thus forcing the AI to choose between two different variants of mass murder?
Pulling back a little, an AI that can decide to do this has almost certainly lost the Honesty disadvantage whether the killing was justified or not. Almost all law codes, and certainly anything you'd program into an AI, would require *reporting* the situation that set up this dilemma to an appropriate civil authority and letting it make the decision.
Of course one can always set up some sort of twisted torture case with an unbreakable time limit too short for that, but it's not going to be a very realistic situation. Even the trolley version is pretty forced; something that will quickly and by surprise kill millions through inaction is going to be pretty hard to make believable.