Quote:
Originally Posted by KG26_Alpha
Isaac Asimov
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.
The First Law has a fault, a possible dead end, because it does not cover situations where a dilemma arises: saving one human causes harm to another, so acting violates the law, but doing nothing violates it too. Undefined behavior, anyone?

That is the nastiest thing that could happen in software.
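
To make the point concrete, here is a minimal sketch of the deadlock. Everything in it (the Action enum, the HARM table, the first_law_allows check) is hypothetical and just illustrates a rule system where every available choice, including inaction, violates the same rule:

```python
from enum import Enum, auto

class Action(Enum):
    SAVE_HUMAN_A = auto()   # saving A injures B
    SAVE_HUMAN_B = auto()   # saving B injures A
    DO_NOTHING = auto()     # inaction lets both come to harm

# Which humans come to harm for each possible choice (hypothetical scenario).
HARM = {
    Action.SAVE_HUMAN_A: {"B"},
    Action.SAVE_HUMAN_B: {"A"},
    Action.DO_NOTHING:   {"A", "B"},
}

def first_law_allows(action: Action) -> bool:
    """First Law: neither action nor inaction may let a human come to harm."""
    return not HARM[action]

legal = [a for a in Action if first_law_allows(a)]

if legal:
    print("Chosen action:", legal[0])
else:
    # Every branch violates the First Law, so the spec gives no answer.
    # This is the rule-system equivalent of undefined behavior.
    print("No legal action: behavior is unspecified")
```

Run it and the legal list comes back empty, which is exactly the gap in the spec: the law tells the robot what it must not do, but not what to do when every option is forbidden.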