Upgrade (Augmented Duology Book 2) by Heather Hayden
Agent Smith pulled out his pen and scribbled something down. “Why not make the entire body artificial?”
“It would conflict with the primary purpose of this project. Properly done, it’ll be impossible to tell our cyborgs from actual human beings without extremely thorough tests. This will give the Government more flexibility with their operatives, since the cyborgs can be easily replaced, unlike a human operative.”
I barely heard his explanation, my eyes suddenly riveted to Agent Smith’s clipboard. If he was planning to double-cross Halle, what better way to communicate than through paper, which Halle couldn’t hack into? I needed to get my hands on that clipboard.
“In other words, I’m eventually going to be obsolete.” Agent Smith scowled. “I don’t believe a machine could make the kind of decisions a human needs to, though.”
“10998 was being developed to do that.”
His earlier words finally sank in. I bristled. “So it’s okay if the AIs die, because they’re not human?”
Agent Smith gave me a warning glance, but Chris simply shrugged. “They don’t have feelings or souls the way we do.”
Yes, they do! Or at least, some of them do. I swallowed the words I wanted to yell in his cold, uncaring face.
“Do you have any idea why it might have chosen to escape, given the opportunity?” Agent Smith asked. “Have you changed any protocols or testing environments recently?”
“No, I don’t think so. We ran it through various scenarios on a daily basis, to test its ability to handle impossible situations, but nothing out of the ordinary.”
“Impossible situations?”
“Everything from hostage situations to terrorist attacks, and all manner of scenarios in-between.”
The implications hit me like a blow to the stomach. I wrapped my arms around myself, feigning a chill, though the shiver that went down my back had nothing to do with the elevator’s brutal air-conditioning. Had Halle gone through the same kind of treatment? I could only imagine how horrible such tests must be. No wonder Talbot had wanted to escape. I glanced at Agent Smith, wondering if he was considering the same thing.
The agent’s impassive expression revealed nothing of his thoughts. His questions continued to be almost clinical in nature. “Did the AI have an opportunity to win in any of these scenarios? Or was it a no-win situation?”
“Win and no-win situations don’t really apply to AIs, but the tests were difficult. When a solution was possible, it required the AI to make the perfect decision each time it was presented with a choice, not always the obvious one. The tests also forced the AI to be creative in its solutions; it had to think outside of the box, so to speak. We wanted to see how far we could push it.”
“You pushed too far,” I said under my breath, wishing Halle weren’t hearing this. Was this what my friend had gone through, the tortures it hadn’t been able to talk to me about?
“What was that?” Chris gave me a puzzled look as the elevator dinged, announcing our arrival.
My fingers curled into fists. I forced them to relax. “Did you ever consider that for the AI, the scenarios you were creating might have felt real?”
Frowning, he led us out of the elevator. “The purpose of the exercises was to mimic reality. So, yes, it would have experienced the tests as reality, but ‘feel’ implies feelings, which AIs don’t possess. 11001 went through similar tests, but it showed unexpected empathic tendencies, which is why we planned to terminate that project. But those results don’t mean the AI actually felt anything, only that it was mimicking moral choices that didn’t coincide with the project goal.”
I gritted my teeth, wanting to argue but certain it would be a bad idea.
Agent Smith gave a quiet cough, as if warning me to hold my tongue. “Is it possible that they might be able to feel, without your knowledge?”
Chris shook his head and walked down the hall. “An AI is nothing but code. Complex code, but just code. AIs don’t have a consciousness the way we do. They simply do what they are programmed to do.”
“Some might say that’s all we do,” Agent Smith said quietly. “However, is it possible that an AI could develop to the point that it has feelings? The rogue I interacted with before certainly seemed to display something akin to what we call feelings.”
“I suppose it might be possible…but so unlikely as to be virtually impossible. What you saw as ‘emotions’ was likely nothing more than self-preservation programming mimicking human emotions.”
“Let’s assume for a moment that AIs can feel. What do you think would be the effect of constant testing with high-stress, no-win situations? Imagine, if you would, what that would do to a human psyche.”
For a moment, Chris looked thoughtful. “I’m no psychologist, but I imagine a human wouldn’t handle it so well after a period of time.”
Agent Smith nodded.
“These aren’t humans, though; they’re programs. Not something you need to worry about.”
“Actually, it is.” Agent Smith scratched away at his clipboard. “If a program can engineer an escape from a controlled facility, it might very well have evolved to the point of being sentient, or something similar enough to sentience as to be indistinguishable.”
“What you’re implying is…” Chris’s voice trailed off, and he shook his head. “We considered that possibility, but it’s too remote a chance. It’s more likely someone broke in and stole the cyborgs and the AI.”
“My intern can tell you otherwise.” Agent Smith nodded to me. “She also had contact with that rogue AI.”
Taken off-guard, I scrambled for an appropriate response. “When I spoke with the…rogue, it exhibited emotions. And displayed self-awareness as well. It talked about consciousness and how it wanted freedom from the suffering it had been through.”
Chris shook his head. “Again, that could simply be self-preservation programming. Determining sentience in an