Third Law by Geordi Riker
Lionel Yoshoga, a leader of a local band, committed a felony that has earned him the death penalty.
His clock is ticking...
"My final meal was steak and potatoes, Father. The steak was synthetic, of course. Only the rich can afford the real deal, what with the new laws regulating the life cycle of cattle, including when to end it for the cow. I had never had steak, so even the fake stuff tasted good after the slop that passed for a meal in jail.
"I had a spork and a knife so dull that my fingernails cut through more meat than it did. As if I'm going to kill myself before they can.
"I've been served the death penalty, with a grace period of one year to fully contemplate and recant my actions. Nothing to recant.
"Don't leave, Father, I need you here. I need someone to hear what I have to say.
"I'm not ashamed or sorry for what I did. They built those robots with no thought. I had read all their reports - illegally, you understand - and saw the writing on the wall.
"So, I dug a little deeper-
"-yes, by 'dug a little deeper' I mean that I hacked the ROD Network. And you won't believe what I found.
"What are you so worried about, huh? Who are you gonna tell about my dirty little secret, really? They cut your tongue out before you can become a fully commissioned Prison Father with the credentials to work for the ROD prisons anyways.
"I had to fix their mistake, see? ROD, they didn't think things through. Or they did, and wanted it to be that way. But that's no way to treat a robot, no matter what the environmental, educational, or ethical situations are.
"These robots were to become our police forces, our protectors from evil. But what protects the robot? We have rules stating that a robot cannot harm a human, a rule so ingrained into the programming that even contemplation of such an act causes the robot to shut down permanently.
"They're growing, gaining intelligence, and we just kill them off if they make a mistake.
"Think of it this way. Would you kill a child who coloured on the wall you just finished papering? No! You'd give the kid a spanking, or put him in the corner, or both. You wouldn't kill him. You'd make him learn from his mistakes. And not just because kids are few and far between these days, but because the kid needs to learn.
"Robots are no different. They make mistakes. After all, they were designed by us flawed human beings. We can't expect them to be perfect when they didn't even exist ten years ago. What did we do when we first existed, huh? Cain killed Abel, that's what we did. Murdered his own brother in cold blood, out of jealousy. They weren't even fighting over a girl. Sure, Cain got cursed and was forced to wander the world for the rest of his life, but he still got a second chance.
"So should robots. Let them live.
"Guess my time's up, eh, Father?
"Eh? You brought a paper in here? They let you? Don't just give me that half shrug.
"Well, what is it? What day is it anyways?
"No way, current events? Well, if my name's not in there, I don't really care, Father, I'll be dead in an hour.
"Wow. Hey, that's my mug shot. Not too good looking with those black eyes and broken jaw.
"Hmm. 'Lionel Yoshoga faces his death today for his crime against humanity.'
"Eh? Why you pointing there? Ease up, old man, I still have a minute.
"Fine, I'm reading it, okay? 'ROD Robotics division gave a statement today regarding the damage control that they had been doing, trying to undo Yoshoga's attack. ROD's CEO Isaac As Imov stated to the press last evening that they had finally managed to get the robots back under control by prioritizing the Three Laws.
"'The Laws now read:
1. A robot may not injure a human being, or, through inaction, allow a human being to come to harm.
2. A robot must obey all commands given by a human being, except where those instructions conflict with the First Law.
3. A robot must protect its own existence except when such protection conflicts with the First or Second Laws.'
"Those sons of bitches."
"I don't care that the clock's ticking, Father. I know that the moment it goes blank, the collar around my neck injects a poison into my blah blah artery and makes me go into cardiac arrest. I'm not going to apologize for protecting robots. Not now, not ever. I'm glad that I made that third law part of their internal programming. You can tell your superiors th-"
Imprint
Publication Date: 08-08-2013
All Rights Reserved