Principles have a priority.
Isaac Asimov’s Three Laws of Robotics are:
First Law: A robot may not injure a human being or, through inaction, allow a human being to come to harm.
Second Law: A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
Third Law: A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
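Read as a system, the Laws are an ordered rule set: a lower-numbered law always overrides the ones beneath it, and the first law that objects to a proposed action settles the question. Here is a minimal sketch of that precedence in Python; the `Law` record and the toy string-matching predicates are invented for illustration, and nothing beyond the ordering comes from Asimov.

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class Law:
    name: str
    objects_to: Callable[[str], bool]  # returns True if this law forbids the action

# Priority is encoded purely by list order: earlier laws win.
LAWS = [
    Law("First Law",  lambda action: "harms a human" in action),
    Law("Second Law", lambda action: "disobeys a human" in action),
    Law("Third Law",  lambda action: "destroys the robot" in action),
]

def first_objection(action: str) -> Optional[str]:
    """Return the highest-priority law that forbids the action, or None."""
    for law in LAWS:
        if law.objects_to(action):
            return law.name
    return None

print(first_objection("follow an order that harms a human"))   # -> First Law
print(first_objection("refuse a task, which disobeys a human"))  # -> Second Law
print(first_objection("take a photograph"))                      # -> None
```

The point of the sketch is only that the conflict resolution lives in the ordering itself: an order that would harm a human never reaches the Second Law, because the First Law objects first.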
We’d like to think we can honor everything we believe in, every time, but the difficult work of having principles is putting some ahead of others.
If one of your principles is “win at all costs,” then you have no other principles.
Last Comments