Third Law

After all the things that crazy scientists are doing with robots — self-driving cars, robot apes, nuclear snakes — the populace is still not up in arms.

They should be.

You see, we have the idea that Asimov’s Three Laws of Robotics are real:

  1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
  2. A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.
  3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

That’s from his science fiction books, not his real science books.

Recently, a robot committed suicide. Really.

One of those room-cleaning robots turned itself on, pushed a pot off the stove, sat there, and died.

Now, the important thing isn’t that a cleaning robot was up on the counter near the stove. The important thing isn’t that this was in Austria, although cleaning up a house in Austria would depress me. No, the important thing is that it shows that the Three Laws are fiction.

So, a robot CAN harm itself (Third Law). That means a robot could disobey orders (Second Law). And that means a robot can injure a human (First Law). That means that a robot can turn on you. That means robots can go crazy and kill themselves. Yep. Muslim robots. Or Branch Davidian robots. Or People’s Temple robots. Or Solar Temple robots. Or left-wing Obamabot-bots.

Robots can go crazy and kill you, and don’t care if they get hurt in the process. Don’t trust a robot, that’s the message.

Either that, or don’t put robots up on the counter near the stove. Grab a Bounty and wipe up the Cheerios, you lazy slob.

10 Comments

  1. Doesn’t make sense… I’ve known several Roombas, and much like how “Hitchhiker” doors enjoy opening and closing, Roombas LOVE cleaning.

    My theory is that the man was angry because the Roomba wasn’t cleaning up the cereal fast enough, so he decided to “punish” it by putting it on the stove. Things got out of hand, and then he made up this ridiculous cover story.

    I don’t know if it’s murder, but it’s at least first degree botslaughter.

You are aware, aren’t you, that these ‘laws’ were instructions programmed into the robots; they’re not like physical laws that cannot be violated, they’re a program.
    Ask Bill Gates if programs can be flawed.
