First Law by Geordi Riker
Robots were introduced to society as a means of 'deleting' problems.
No one gave it a second thought.
Until the robots decided that there were certain humans that needed to be deleted.
The Robotic Operating Directive (ROD for short), the robotic Network that controlled more than half of the operational robots, searched for a way to make the robots preserve human life.
And that was how Alice Horndale gained the private ear of Mr. Isaac As Imov himself when she provided the answer: The Law.
“A robot may not injure a human.”
It had been so simple: just integrate those words into the robot's positronic brain and watch while rebel robots dropped like flies. Any robot who sought to harm a human being would automatically shut down, and an error report would be sent. The robot would be reprogrammed and returned.
Of course, it didn't explain the three dead bodies down in the city morgue, victims of a robotic train accident.
The robot who had been conducting the train along its route now sat in front of her; she hadn't asked it to sit out of courtesy, but because she didn't like how human it looked as it towered over her.
“This incident is making the world wonder just how competent ROD is, taking the advice of a seventeen-year-old girl...” She shut the TV off.
A year ago, that same television broadcaster had hailed her as the Leader of the New Age. Apparently, New Year's Day had changed more than just the last two digits of the calendar year.
“Why did you harm those humans?”
Its laser-powered eyes looked at her. “I did not harm anyone.”
“Weren't you in control of the train?”
“Affirmative.”
“And the train harmed those humans.”
It didn't respond. Alice grimaced. They only answered the question you asked, and never supplied any new information. She studied its eyes.
“Why did you allow those humans to be harmed?”
“I did not harm the humans.”
“You knew the humans were on the track?”
“Yes.”
“But you didn't stop the train?”
“Affirmative.”
“Why?”
“I did not harm the humans.”
There it was. The small gleam of blue behind the red.
“How do you conclude that you did not harm the humans?”
“The train has a fixed trajectory. Humans do not. I did not harm the humans.”
“You assumed the humans would move?”
“I do not assume. Humans wish to live; they move.”
“But these ones didn't.”
“Affirmative.”
“So what did you conclude by their inaction?”
“That the humans were giving an unspoken command as I approached. By their inaction to remove themselves from the track, I concluded that they wished to remain on the track, their command to continue. They made no sign to the contrary.”
“So you killed them?”
“I obeyed them.”
Problem was, they had made a Second Law shortly after the First: “A robot must obey all commands.” And there were a certain number of robots who could infer orders to obey based on things humans did or didn't do. This was obviously one of them.
“The humans commanded you to harm them?”
“Through their inaction, they sought a way to die that did not override my programming.”
“Through your inaction, they came to harm. It was your fault.”
“It was their inaction that prompted my own. I obeyed their command through my inaction. I did not disobey.”
And that was the crux of the matter. The robot was right.
“Shut down.” The eyes ceased glowing, the blue disappearing after the red.
Alice turned her attention to the window. The ROD Tower was the only skyscraper still standing in the city. Everything else had been reduced to rubble. The suburbs hadn't been hit as hard as the city in the Uprising, but that was more because the robots had been intent on gaining control of their own programming.
The picture wasn't pretty. Education had been going downhill for some time, but starting five years ago, anyone who could divide a three-digit number was considered a genius, and it seemed like nobody wanted to work. Everything had been handed to Alice's generation on a silver platter in an instant. But when it was time to take over what their parents left behind, things fell apart.
In a way, the Uprising had been a blessing, because people needed places to live. ROD gave housing to its employees, so they had managed to get the cream of the crop. People actually tried and worked hard to keep their homes and guaranteed food packages.
She glanced at the robot. An L3CY, she knew without looking at the plating on the left arm. Top of the line, with problem-solving components. But it didn't have the human desire to preserve human life; it just knew that it itself wasn't allowed to harm humans.
This hadn't been the first incident of suicide-by-robot, but it was the first that the public knew about. And now she had to fix The Law.
Alice sat down at her desk, had her office sealed off, and began to think.
Two days later, an exhausted Alice emerged from her office carrying a sheet of lined paper, walked past dozens of cubicles of programmers to the elevator, and rode it all the way to sub-level 15. She entered the well-lit room, crossed the Oriental rug, and placed the single sheet of paper on the massive desk in front of her.
Imov glanced over the single sentence written in blue pen, looked back at her, and nodded.
Alice rode the elevator back to her floor, walked past the programmers, entered her office, and sealed the door. Sitting in front of her computer, she accessed the Network as an administrator and corrected the corrupt data. She couldn't take away from the programming; that would cause all of the robots to shut off at once. But she could add.
She added eleven words.
“A robot may not injure a human, or, through inaction, allow a human to come to harm.”
Imprint
Publication Date: 08-08-2013
All Rights Reserved