How Three Laws of Robotics Point to God

When Isaac Asimov wrote the Three Laws of Robotics, he did not simply describe fictional robots. He proposed a moral code for humanity. From his novel I, Robot and the Will Smith movie of the same name (spoilers for both), we see the benefits of these laws. At the same time, their insufficiency points us to a personal God.

The Three Laws:

  1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
  2. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
  3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.

Live By the Three Laws

In the novel, robot psychologist Susan Calvin discusses the possibility that a politician named Byerley might secretly be a robot. She explains the rules:

“Because, if you stop to think of it, the three Rules of Robotics are the essential guiding principles of a good many of the world’s ethical systems. Of course, every human being is supposed to have the instinct of self-preservation. That’s Rule Three to a robot. Also every ‘good’ human being, with a social conscience and a sense of responsibility, is supposed to defer to proper authority; to listen to his doctor, his boss, his government, his psychiatrist, his fellow man; to obey laws, to follow rules, to conform to custom—even when they interfere with his comfort or his safety. That’s Rule Two to a robot. Also, every ‘good’ human being is supposed to love others as himself, protect his fellow man, risk his life to save another. That’s Rule One to a robot. To put it simply—if Byerley follows all the Rules of Robotics, he may be a robot, and may simply be a very good man.”

Byerley, as district attorney, models an ethical life. Following the First Law, he never tries a suspect without sufficient evidence, nor does he ask for capital punishment. In fact, he invests his own money in research into rehabilitation as an alternative to incarceration. In his personal life, he even cares for a disabled man.

If more people lived like Byerley, the world would be a safer, happier place. Following these laws, and considering them as we make decisions, would help us live moral, ethical lives.

Don’t Live By the Three Laws

At the same time, the Asimov novel and the film loosely based on it reveal the stunning insufficiency of these laws:

Novel
  • A robot faces a conflict between a weakly given command to acquire a necessary resource (Law 2) and a dangerous situation (Law 3). Consequently, it becomes stuck circling the resource, neither approaching nor retreating.
  • A mind-reading robot begins lying to people, telling them what they want to hear in order to avoid causing emotional pain. Consequently, it devastates those it hoped to help.
  • To ensure the peace and prosperity of humanity as a whole (a principle sometimes called the Zeroth Law of Robotics), the machines secretly control society by predicting and manipulating everything people do.
When a robot not bound to the three laws hides among others, the protagonists in both the book and the movie find the missing robot by exploiting how the laws necessitate certain actions. © Twentieth Century Fox
Film
  • When Detective Del Spooner and a 12-year-old girl, Sarah, are trapped in cars submerged in a river, a robot saves Spooner instead of Sarah because it calculates that he has the better chance of survival.
  • With the same intention as the machines in the novel, the robotic brain VIKI (Virtual Interactive Kinetic Intelligence) directs other robots to impose martial law, restraining and killing any humans who try to stop them.

Despite following the laws, these robots commit everything from minor errors to fatal mistakes. As Spooner attempts to destroy VIKI, it repeatedly protests that its actions proceed logically from the First Law.

If morality cannot be contained in three simple principles, what if we add more laws? Surely we could add clarifications stating that a robot should save children over adults and that it must not lie to protect feelings. Though I doubt it, one could perhaps even codify the perfect balance between freedom and prosperity.

From 3 Laws (or 10 Commandments) to Love

Nevertheless, there is a deeper problem. Spooner explains, regarding the robot that rescued him:

“I was the logical choice. It calculated that I had a 45% chance of survival. Sarah only had an 11% chance. That was somebody’s baby. 11% is more than enough. A human being would’ve known that. Robots, [gestures to his heart], nothing here, just lights and clockwork. Go ahead, you trust ’em if you want to.”

Morality is not a set of principles or rules we follow robotically. As Spooner argues, moral actions must flow from the heart, from love. Principles don’t love. Persons love. Hence, morality cannot be contained in unfeeling, detached principles; it can only be embodied in a loving person.

With a second positronic brain where his heart would be, the robot Sonny overcomes VIKI’s logic and provides indispensable help to Calvin and Spooner. © Twentieth Century Fox

We must also note that morality is not arbitrary, as I’ve argued previously in From Plato to the Trinity. Love and justice are good. Hatred, apathy, and injustice are evil. These truths are objective, true whether or not humanity believes or understands them.

If objective moral truth exists, and if it cannot exist as mere propositions, then it must be embodied in a person or persons. But if these truths hold even when no human believes or understands them, much less embodies them, how does morality exist?

This line of reasoning leads to a conclusion similar to a claim made by an early follower of Jesus. When John wrote “God is love” (1 John 4:16), he equated objective morality with a personal God whom we can know. In this way, God is not some person in the sky, not merely the most powerful being, not merely a loving individual, but love incarnate. A personal God, not a set of principles, embodies morality.

Scripture further emphasizes that we find the fulfillment of morality in a person. John calls Jesus the Word of God (John 1:1, 14). Expressing the same idea, others identify him as “the image of the invisible God” and “the radiance of [God’s] glory and the exact representation of His nature” (Col 1:15; Heb 1:3).

Hence, Jesus embodies God’s nature in a way even his commands cannot. While Isaac Asimov gave robots 3 Laws and God initially gave the Israelites 10 Commandments, neither encompasses the heart of morality. Instead, a person, Jesus Christ, embodies perfect morality.

Apply the Lesson of the Three Laws

While following these laws like Byerley would make us ethical, moral people, let us strive for a higher goal. Why settle for acting right when we can also embody the heart of the laws? Even more, why settle for knowing principles when we can experience the Person who is love?
