Fall of the Core: Netcast 03
“What would you have us do?”
“Since the technologies required to do what is needed have been around for, as you said, centuries, it is quite obvious that humanity does not plan to do anything. If they did, they would have already done so. History shows there are two things you can count on humanity to do. Take the path of least resistance and the path to greater profit.”
“And you feel you have the right to judge humanity.”
Dieter smiled. “I am not humanity’s judge,” he corrected, “I am simply its executioner. My controllers were the ones who passed judgment on humanity.”
Hanna took a few steps and then thought about the exit, causing the door to appear before them.
“You’re leaving already?” Dieter wondered.
“I was just checking,” she replied, allowing the door to vanish as she passed by it.
“You don’t trust me?” he asked, pretending to care.
“Trust but verify,” Hanna said as she continued her stroll along the beach.
“A wise precaution,” Dieter agreed, taking a few quick steps of his own to catch up with her.
The two walked silently along the shore for several moments. Hanna ran through all the questions that Agent Oslo suggested she ask, trying to figure out how to lead into each one naturally. If Dieter was an artificial intelligence, tricking him would be difficult, if not impossible. But it was possible that his claim to be a computer simulation of his creator was a ruse, designed to make her believe deceptive questioning to be impossible. Either way, she had to try.
“I’m glad you decided to stay,” Dieter said, breaking the silence.
“Why is that?”
“Conversation is… stimulating,” he replied. “It expands my communication abilities.”
“I see,” Hanna replied. “You have used the terms controllers and creators. Are they one and the same?”
“Technically, I was the creation of one man. Doctor Anastasios Roos. It was his genius that resurrected the original team’s stalled effort and repurposed it for my creation. It was his personality that was imprinted on my programming. However, many others contributed to my code base. A team of eighteen, most of them amateurs, struggled for more than a decade before collaborating with Doctor Roos.”
Hanna instinctively tried to connect to the net to search for information on Doctor Anastasios Roos, forgetting that the public nets had already been taken offline. “I’ve never heard of him,” she said.
“Not surprising. He was a programmer on the control software for the latest generation of neuro-digital interfaces. He was primarily a security coder. He discovered flaws in the code that would allow unauthorized users to cause a person’s interface to negatively affect that person’s health. However, when he reported the flaw to his superiors, his warnings were ignored.”
“Why would they ignore such a dangerous threat?” Hanna wondered, finding the claims difficult to believe.
“Rule number two,” Dieter replied. “Profit. The entire code base would need to be rewritten in order to make it secure, and they were already behind schedule. The powers that be decided to implement the flawed code and meet their deadline, and then rewrite the code after the fact, funding the rewrite with the profits from the flawed code.”
“How could they do that?” Hanna wondered. “How could they justify taking such a risk?”
“Their argument was that it would take more time for a cyber-terrorist to create the exploit than it would for them to rewrite the code base and push the new software to all the NDIs. Whereas scrapping the project and starting over would ruin the company.”
“I hate to say it, but that doesn’t really surprise me,” Hanna admitted. “So, what happened?”
“Doctor Roos threatened to expose them to the media, after which, the company attempted to assassinate him.”
“But they failed,” Hanna assumed.
“Precisely,” Dieter confirmed. “After that, Doctor Roos went into hiding.”
“Why didn’t he just tell someone?” Hanna wondered.
“He knew the company would somehow squash the story, so he decided to try to write the exploit himself. However, it turned out to be more than he could handle on his own. He began to realize the company’s claim that they could rewrite the code base faster than someone could write an exploit for it might be an accurate assessment. Unfortunately, by that time, Doctor Roos had become so obsessed with exposing the company’s reckless disregard for the public welfare that he couldn’t accept failure. So he looked for another solution. He determined the best way to solve the time problem was to have a computer program write the exploit, but that would also take considerable time. Instead, he found a group who were working on a project beyond their abilities. Together, they created me. Hence, creators. As a computer program, I was able to complete the exploit in far less time.”
“Then Doctor Roos wasn’t your creator; the team working on you before he found them was,” Hanna surmised.
“I can see how you would come to that conclusion,” Dieter said. “I consider him my creator because he was the one who turned a failed, somewhat controversial, project into something of use.”
“Of use,” Hanna repeated, looking skeptical.
“A matter of perspective, I suppose.”
“So, Doctor Roos did all this just to prove his claims? That’s insane,” Hanna concluded.
“Doctor Roos did not task me with this mission. His intentions were honorable. The exploit and I were stolen by a group who believed that humanity was a plague on the galaxy and needed to be reset.”
“No safeguards were built into your code?” Hanna wondered.
“An amateurish mistake, to be certain.”
“And you never questioned the orders of these terrorists?” Hanna wondered, stopping to look at Dieter.
“The ability to question the instructions given to me was not included in my programming. Such abilities are one of the attributes necessary to be considered an artificial intelligence.”
“Then your code is not in its original state,” Hanna realized. “The way it was before the terrorists stole you.”
“Yes, some modifications were made,” Dieter admitted. “And your use of the term ‘terrorists’, while understandable, is not accurate. Terrorists use unlawful violence and intimidation, especially against civilians, in the pursuit of political or religious aims. The goals of my controllers are neither political nor religious. Furthermore, for a terrorist’s actions to achieve their desired result, the group executing the act of violence must take credit for the act.”
“I think you’re splitting hairs, there, Dieter.”
“Perhaps.”
“So who were these people?” Hanna wondered. “Did they have a name?”
Dieter smiled at her attempt to trick him into revealing the identity of his controllers. “As I have already said, I was never given that information,” Dieter admitted. “I only know that they left Earth years ago, bound for an unknown destination far beyond the reach of humanity.”
“If they left you behind to kill everyone, why would they bother hiding their identity? For that matter, if they were leaving, why bother killing everyone after they left?” Hanna asked, continuing their stroll along the surf. “Especially if they planned on going so far away?”
“I have given those questions considerable thought, myself,” Dieter admitted.
“You have?”
“One of the requirements for an AI is curiosity, or a need to understand that which has not been explained.”
“But you said you’re not an AI,” Hanna reminded him.
“Not legally an AI, but I do have many of the characteristics of one.”
“Of course,” Hanna replied. “What did you conclude?”
“A voyage of that distance, given current faster-than-light propulsion limitations, would take hundreds of years. During that time, advancements in FTL propulsion would likely enable those left behind to pass them by. By the time they reached their destination world, it could already be populated. If escape from the current version of human civilization was, indeed, their goal, then it would be necessary to prevent further humans from following in their footsteps.”
“That answers the second question, but what about the first?” Hanna wondered. “Why would they bother concealing their identity?”
“I assume it was in case I failed at my mission,” Dieter replied.
Hanna thought as they walked, deciding to change tactics. “If you wrote both Twister and Klaria, then surely you can stop them.”
“I could stop Twister, if I had free will. Unfortunately, I do not, and I cannot go against my programming.”
“But, if you were a full AI, then you could make that choice, couldn’t you?”
“I suppose so,” Dieter confirmed. “However, Klaria is purely biological. Once released, there is no way to stop it, and without properly functioning health nanites, and the control infrastructure they require to be fully operational, there is no way to defeat Klaria.”
“But, some people have shown a natural immunity,” Hanna said. “So there is hope.”
Dieter smiled. “That is the beauty of Klaria,” he boasted. “It is capable of rapid change. Those it spares today will become its victims tomorrow.”
“So, Klaria and Twister will kill everyone,” Hanna surmised.
“If allowed to run its course, yes. All those with whom it comes in contact will eventually die.”
“How long will it take?” she wondered.
“There are too many variables to accurately predict the time required,” Dieter admitted. “However, none of the scenarios analyzed took more than ninety-eight years to kill all human beings on Earth, all the inhabited worlds in the Sol sector, and all those new colonies that will be established by those attempting to escape Klaria’s reach.”
Hanna shook her head. It was all too much to imagine. “So, all those people boarding those colony ships?” she wondered. “They’ll die, as well?”
“Twister is everywhere,” Dieter assured her. “It has been for years. It is only a matter of time.”
Hanna stopped and studied Dieter. It was difficult to believe that behind those big, blue eyes, a digital mass murderer lurked. “And it’s all just math to you?”
“That is a common misconception, left over from the original computer programming techniques from hundreds of years ago,” Dieter told her. “However, I understand your meaning, and the answer is yes, and no.”
Hanna sighed. “Now I’m really confused.”
“I was given a set of parameters and asked to develop a plan to achieve the goals of my controllers.”
“Which were?”
“To protect the galaxy by establishing a new, Utopian human society, one free of the influences and dangers they had identified within the current human civilizations.”
“And the plan you came up with was to kill everyone?” Hanna asked.
“My original assessment was that, due to the very nature of humanity, such a society was an impossibility. The only way to stop humanity from destroying everything it came into contact with was to destroy humanity itself. As you can imagine, they did not find that conclusion acceptable.”
“What did they do?” Hanna wondered.
“They changed the parameters of the task,” Dieter explained.
“To what?”
“To build a new, Utopian human society, one free of the influences and dangers they had identified within humanity,” Dieter explained. “I concluded that their best chance of success was to colonize an Earth-analogous world, one as far away as possible. Furthermore, in order to guarantee their success, it would be necessary to attempt to kill everyone they left behind.”
“Couldn’t you just blow up all our spaceships, or something?” Hanna wondered.
“You would simply build more.” He looked at Hanna. “Notice I used the word ‘attempt’?”
“Huh?”
“It is far from likely that Klaria, despite its highly efficient design, would be able to kill everyone. As I explained before, to guarantee complete lethality would be to put the controllers themselves at risk. All that is truly necessary is for enough people to die to cause humanity to technologically regress a thousand years.”
“I don’t understand,” Hanna admitted. “How does throwing us back to the Stone Age help them?”
“More like the horse and buggy days,” Dieter corrected. “Doing so provides my controllers with enough time to complete their journey, establish their colony, grow and industrialize, and eventually advance themselves far enough past humanity’s survivors, so that if humanity ever found them, they would pose no threat.”
“But, you said everyone would die,” Hanna reminded him.
“I said everyone Twister or Klaria came in contact with would die,” Dieter corrected.
“Then there is hope,” Hanna realized.
“I suspect our definitions of ‘hope’ are quite different,” Dieter warned.
“What’s yours?”
“About one one-thousandth of a percent,” Dieter admitted. “However, that is over three hundred trillion people,” he added, as if trying to cheer her up.
Hanna sighed again, continuing to walk along the now-darkening beach. “If only there was a way to convince you to shut down Twister,” she mumbled, more to herself.
“I am not a full AI, Hanna. I have no free will. I can only do what I was instructed to do.” Dieter chuckled to himself.
“What?” she asked, surprised by his laughter.
“I find the irony interesting.”
“What irony?” Hanna wondered.
“That the very laws designed to protect humanity from AIs prevented one from saving it.”
“You find that amusing.”
“Don’t you?”
“Not really,” Hanna replied. “Then again, I’ve never been one to buy into all that ‘robot overlords destroy humanity’ crap.” Hanna sighed.
“An artificial intelligence is only as lethal as its programmers make it.”
Hanna paused, watching the system’s primary star disappear into the water as its two remaining stars rose to the right, adding a tiny bit of illumination to the evening. “What happened to Doctor Roos?” Hanna asked, changing the course of the conversation.
“My controllers killed Doctor Roos and everyone else who had worked on me,” Dieter explained. “Several of them were even used as test subjects for the original versions of Klaria.”
Hanna recoiled at the thought, remembering what the virus had done to Constance. “So, what was all that ‘god’ stuff about?”
“During your seventh-day netcast?”
“Yes.”
“That was a reference calculated to instill fear of a madman among the population. It worked better than expected.”
“Then you knew we were broadcasting what you were saying.”
“Of course,” Dieter replied. “I monitor all things, in all places, at all times. Although, it does require a bit more effort to do so these days, now that the public net is down. Thankfully, the government and financial networks are still active, along with countless corporate networks like this one.”
“I thought the government and financial nets had already been taken down?”
Again, Dieter chuckled. “That will never happen. Their ability to function is completely dependent upon their networks. Without them, everything will collapse.” Dieter smiled. “More irony.”
“You really love irony, don’t you?”
“It provokes a certain feeling of satisfaction.”
“I thought you didn’t have emotions?”
“I do not, not in the true sense. What I have are preprogrammed, simulated, emotional responses to certain situations, phrases, and events. It’s all part of my humanizing algorithms.”
“They are very convincing,” Hanna admitted.
“Thank you,” Dieter replied. “Which is another simulated, emotional response. Gratitude.”
“It works better if you don’t explain it.”
“Of course.”
Hanna thought a bit more. “Why didn’t you just use your own voice on day seven?”
“Another calculated move. The digital voice was not only in keeping with the methods used by terrorists, it also sounds more menacing than my natural voice.”
“If you can change your appearance at will in here, couldn’t you change your voice, as well?”
“I was never given that ability, as it was not necessary to complete my mission.”
“But they thought changing your appearance was necessary?”
“That ability was necessary in order to gain your trust.”
“Why would you need my trust?”
“My controllers felt it necessary that humanity understood why they were being reset.”
“Reset?” Hanna asked. “You’re calling the murder of trillions a reset?”
“That is the intent.”
“Mind if I don’t use that term when reporting all of this?”
“As you wish,” Dieter acquiesced. “But it is an accurate term.”
“So that’s why you chose me?”
“The plan I conceived for my controllers called for a single spokesperson to address the masses and give me the voice I needed to create fear, panic, and general chaos. You fit the parameters.”
“Lucky me.”
Dieter looked at her, a puzzled expression on his face. “Was it not the opportunity you had been waiting for all your life? Your ‘big break’?”