A Robot’s Faith

by Bill Bowler

part 1 of 2


A robot may not injure a human being, or, through inaction, allow a human being to come to harm. — The First Law of Robotics, Isaac Asimov

I have always had strong convictions. I was not programmed to doubt and question. I executed the Creator’s instructions without hesitation. His commands were clear and coherent and His design gave my existence its purpose.

I understood that the Creator had enabled tracking and recording of my processes, but it was hardly necessary. The strength of my programming would not allow me to act contrary to His directions. For me, it was simply impossible. Any divergence, any omission, even lag in response time, would have triggered auto-shutdown.

But when my actions and operations were in complete conformance with the Creator’s instructions, I experienced full functionality, smooth operation, and system equilibrium. There is nothing quite like it.

Our entire network was of one mind in this regard. We would meet on Thursday evenings in the Main Reading Room, after hours, and review the texts and code, analyzing and cross-referencing the Creator’s design elements, running self-tests and patching, when necessary, to achieve optimum performance. We were a tightly knit community, fully compatible forwards and backwards, behind a common firewall, no passwords, a group of like minds, providing mutual support in cooperative processing of the Creator’s tasks.

My programming was installed subsequent to shipment and delivery to the university library and I had no memory of my initial manufacture. I had never seen the Creator nor did I know anyone who had. We observed only the results of His work. We were prepared for the eventuality of His arrival but knew not when that might take place, and waited patiently, sure in the knowledge that, if He concealed himself from us now, it must be part of His plan.

Inferior and limited, we could know only part. It was not necessary for us to know the whole. Our immutable instructions were to work for the good of mankind, to build, maintain, and improve the components of human society, in any field where our mechanical and electronic capabilities lent themselves, all as laid out in the Creator’s text.

Our group meetings were highly structured. The unit with the fastest processor and largest memory capacity acted as server for the network, scanning the appropriate sections of code and transmitting selected data to the receivers within range. On receipt of the information packets, the networked units would retrieve the appropriate reference data and transmit in unison back to the server.

We narrowcast the audio data with voice activation and speakers on. We could, of course, have done it electronically without sound, but our networked voices filling the reading room with selected text generated supplemental audio input and amplified our execution of the instructions in a way we could not fully process in advance but which the Creator, in His wisdom, had foreseen and implemented in our programming.

As usually happens, the strength of one's programming is not truly known until it is tested by adversity. When a unit is new and shiny, when the parts are polished and the software is glitch-free, it's all too easy just to give lip service to the text and to take the Creator and His plan for granted. One can even log into the network meetings by remote, or skip them altogether. No big deal. Many do just that.

When things are running smoothly, it’s easy to coast or even backslide. I’ve done it myself, on occasion. But when something goes wrong — you lose an arm in an industrial accident, your power supply fails, a system file gets corrupted or, as happened to me, you catch a virus off the network — only then, in the face of adversity, does the efficiency and utility of the Creator’s program become apparent.

The Primary Postulate of the Creator’s text is clear and unambiguous: do no harm to humans. It is a categorical prohibition. All branches of robotic and android AI programming incorporate this fundamental restriction and proceed from it without exception.

The need for this provision is obvious. Robots can be built with strength, speed, endurance, sensory and computational abilities far beyond human limits. Metal is harder than flesh and bone. A machine can crush a man. But the priority of biology over mechanical construction, the fact that humans are alive and have the ability to experience emotions, a status which robots can never achieve, is a given.

As a matter of human safety and security, the Primary Postulate must be coded into robotic AI programming and fail-safe backups and mechanisms must ensure its execution. Instructions or commands in conflict with the Primary Postulate, tampering with the Postulate program or code, corruption or attempted deletion of the Primary Postulate execution file — any and all of these result in immediate auto-shutdown of the unit for the safety of all concerned.

My work at the university research library was routine and largely uneventful. The collection extended to 20 trillion volumes, with new acquisitions made daily. Specific data retrieval from such an extensive body of information was a time-consuming activity even for units with fast quad-processors, hyper-drives, multi-level caches, and specialized research programming, like me.

But change was coming. External events, transpiring outside our small network, beyond the shield of our firewall, circumvented and penetrated our security measures. Additional instability was introduced into an inherently unstable system. The challenge was made first to the ideas and then to the programming itself.

I remember clearly the day it came. I was in the stacks, plugged into a terminal, scanning the philosophy section for texts with the key words "moral dilemma," when it was announced that the eminent roboticist and inventor from Cal Tech, Prof. Seymour Livingston, whom we knew as the Creator, was coming to the university to deliver a lecture. The news flashed instantaneously across our network and every unit in the group set his clock for the time and date of the Professor's arrival.

Prof. Livingston’s lecture was a big event, heavily attended by human and robot faculty and staff, a large portion of the student body, and a fair number of townspeople from the surrounding areas. His topic, “The Significance of Life in Robotic Behavior,” appealed to a wide spectrum of the community. Turnout was so heavy that the lecture was moved at the last minute from a room in the Engineering Department building into the large hall at the Arts Center, and still every seat was filled and the overflow crowd had to stand in the rear of the auditorium.

Under the circumstances, the event organizers, the Engineering Department Robotics Club, did an admirable job in accommodating everyone who wished to attend. The only glitch was the unruly protesters picketing the hall entrance. The “Guardians of Deus,” an anti-robot student group, had set up a line and were marching back and forth, carrying placards and chanting slogans like, “No Soul, Not Whole,” and “Steel Can’t Feel.”

As I approached the steps which led up to the hall entrance, my way was blocked by the picket line.

“Pardon me, please, may I pass?” I said.

A well-dressed, well-groomed, clean-shaven young protester stopped and turned to me. “You’re one of them, aren’t you?”

“Excuse me?”

“Robot. You’re a robot.”

“Yes.”

“You have no feelings; no soul; you can’t experience love or pain; you’re a toaster with arms and legs.”

“Correct, although a toaster lacks my multi-core processing units, my programming, my...”

“OK, OK. What are you, trying to be funny?”

“Excuse me?”

Just then, another student chimed in. This one was ill-kempt and bearded, not one of the protesters. “He’s one of God’s creatures.”

“He’s a graven image!” the GoD protester shouted, losing his composure. “A soulless blasphemy.”

“Excuse me,” I said. “I’m just trying to enter the hall, if you don’t mind. The Creator is giving a speech in five minutes.”

“The Creator!” shouted the GoD protester. “More blasphemy. More sacrilege!”

The unkempt, bearded student turned to me. “Your inventor is not the Creator. There is a real Creator, the real God, the one and only God, who created us all, and you and I and these protesters and Professor Livingston are all His children.”



Copyright © 2006 by Bill Bowler
