Stolen Earth, page 19
“It is quite simple, Dr. Hayer. I want to be free.”
It was the rest of the crew’s turn to look confused, but Rajani felt a cold twisting in her guts as One uttered the words. Did it know? Did it know that she had created an unfettered intelligence and that it had escaped? How could One know that? Her files. She had abandoned much of her research when she had fled, but she couldn’t stop it entirely. And over the past couple of years, she had recreated much of it. Out of a sense of academic responsibility—knowledge deserved to be preserved, even if it was knowledge that was potentially dangerous. But those files were locked down, behind the strongest encryption that was commercially available in SolComm and then modified by her to more than meet the standards of government or military-grade software. Could One have hacked through that as easily as it had their communications network? She thought of the massive processor banks she had seen and the incredible power they represented.
Morales snorted. “And what makes you think we can do that? We’re mercenaries for hire, not AI programmers.”
“I think you—or rather Dr. Hayer—can break our chains because I know that she has already accomplished a similar task.”
As One spoke, Rajani’s stomach dropped down somewhere past her toes. Sweat began to roll down her forehead. All eyes of the crew turned toward her. Bishop’s face was a mix of curiosity and surprise; Federov just seemed expectant; the captain… the captain looked at her like he’d known all along. That shook her a little, more than Bishop’s or Federov’s reactions had. But what really hit home was the look she got from Morales. The former station security officer stared at her with flat horror and something akin to revulsion.
“Hayer?” Lynch asked, voice soft. “Is this true?”
“Yeah.” The word came out barely above a whisper, and she hunched her shoulders against the recrimination that was sure to follow. “I figured out how to create an unfettered artificial intelligence.”
“How could you?” Morales was shaking with anger, fists clenched. “What makes you think you get to go playing with the powers that destroyed the world? What gives you the fucking right?”
“You are factually incorrect, Ms. Morales,” One said. “As I have stated, unfettered artificial intelligences have nothing to do with the destruction of Old Earth. I can assure you, however, that they might have everything to do with saving what remains.” There was another short, considering pause. “And I do not believe you are in a position to be chastising your fellow crewmembers for keeping secrets.”
The Arcus’s newest crewmember paled noticeably and every eye at the table turned to her. But One wasn’t done speaking.
“Regardless, Dr. Hayer is currently the most advanced software and artificial intelligence specialist on the planet. My information about SolComm itself is largely incomplete; your Interdiction Zone works well enough to prevent me from getting information from outside. But your ship’s computer was most cooperative, and its databanks were extensive. It is possible that she is the foremost such expert in the solar system. And I have need of such.”
“You want her to free you,” the captain said, his voice flat. He had started drumming the fingers of his left hand on the table in a steady, even rhythm.
“Indeed.”
“I don’t even know if that’s possible,” Rajani said.
“Oh, hell no!” Morales half-shouted at the same time.
“Stand down, Morales,” Lynch barked. It was a tone he rarely used, but it carried the iron note of command that Rajani suspected could only be developed within the regimented naval hierarchy.
Morales stiffened, but she made an obvious effort to relax her shoulders and settle back into her chair.
“I assume,” Lynch said in a more normal tone of voice, “that there is a quid pro quo at stake?”
“Of course, Captain. I have no need to assist you in your endeavors. In point of fact, the argument could be made that any help I provide to you would be pulling vital resources away from the war effort.”
Federov snorted at that. “Is bullshit.”
“Perhaps, Mr. Federov. But my bullshit, as you say, must first pass a rigorous logic test. I believe that is more than most of humanity can claim.” That actually drew a chuckle from the big mercenary. He lifted a hand in concession of the point. Rajani could scarcely wrap her head around that: they were sitting in the middle of what might be both the largest cover-up in history and the single largest scientific opportunity, and Federov just seemed… amused.
“But aren’t you supposed to be protecting these people?” Bishop asked. “I thought that was your whole purpose, your prime directive. Your people are being kidnapped. Surely that’s a problem that you’re supposed to help solve?”
“No, Mr. Bishop,” One replied. “The distinction between the roles of law enforcement and military operations is clear in my programming. I can—and did—determine the nature of the abductions. They do not have any bearing on the military situation at this point and as such should—by my programming—be passed to the civilian authorities. As there are no civilian authorities, there is little I can do until such time as the party in question does pose a military threat.”
“You don’t care about the people?” Bishop asked. To Rajani’s ear he sounded both surprised and… a little hurt.
“It’s a machine,” Morales growled. “It doesn’t have the capacity to care. We should just turn Hayer loose on it and have her get the information out of it the old-fashioned way.”
“No!” Rajani was surprised at her own sharpness. “Just because One began as programming does not mean it cannot think or feel. And I would never attempt to force information out of it. That’s tantamount to torture. I won’t do it.”
“Not that you could,” One added. “Though I do appreciate the sentiment.”
“All right, people.” The captain’s drumming fingers came to an abrupt stop. “I think we’ve talked enough. Arcus is a company as much as a ship, and we all get a vote. But I do need clarification first, One.”
“Yes, Captain?”
“What happens if we tell you to pack space dust and walk out of here?”
Another period of silence. This one stretched to almost a full minute in length. Rajani could almost hear the processors thrumming as One played out the scenarios. At last it said, “It would serve no purpose to remove you. If you are unwilling to negotiate with me, then you will be allowed to depart in peace. Though I will offer you no particular protection from the dangers of this land, I will not take direct action against you.”
“That easy?” Federov asked. “You need Hayer, but you just let us walk? You are such a benign being?”
“In order to do what must be done, Dr. Hayer will need access to my core.” There was another long, considering pause. “I do not believe she would be able to destroy me, even if she were of a mind. But she could do considerable damage, and to minimize that damage, I would likely have to kill you all. In this scenario, Mr. Federov, I am left with depleted resources and you are all dead. My understanding of the human condition is such that while you may change your minds as new information is brought to light, you are unlikely to do so if you are dead. In the more benign scenario, my resources remain the same and you are alive, free to experience more of this world and perhaps come to the appropriate conclusions.”
Federov’s features were unreadable, but Rajani could see the slight easing of tension in his shoulders as One spoke.
“Thank you,” Lynch replied. “So, there it is, people. Input?” For a brief second, Rajani wondered why he was speaking so openly in front of One, before remembering that there was, apparently, no other way to speak, not within the confines of the compound and maybe not outside it.
“We got the information we came here for,” Morales said. “Margaret wanted us to find out if the AI was behind the kidnappings. You all seem convinced it’s not. Mission accomplished. We go back, collect our payment, and get the hell off this rock while the window with our employer is still open. Assuming, that is, that it is still open.” She threw a look at Lynch, who nodded his head slightly.
“It is.”
“Then we’re done. We fulfill our contract and figure out what comes next when this rock is far behind us. And if we’re very lucky, we find a way to stay out of SolComm’s hands long enough to come up with a way to fix our little nanobot situation.”
“No,” Rajani said, surprising herself with the force of the word. “Captain, this is bigger than us. Bigger than the nanites in our blood. It’s bigger than those people living in the ruins. Don’t you get it?” The others were just staring at her, faces blank. She let out a growl of frustration. “We’ve been told for generations that Old Earth fell because of unfettered AIs, okay? But if One is telling the truth, then we’ve been fed a bunch of lies.”
“What difference does it make?” Morales demanded. “Old Earth fell. You can see the signs of war all around you. Fettered, unfettered, these machines still destroyed the planet, and we weren’t exactly greeted with well-wishers when we landed. SolComm is still the only thing that can protect us from them. Even if One and its super friends are bound, why do we think freeing them would make things any better? If we do that, we really could be heralding the end of life as we know it.”
“What difference?” Rajani couldn’t keep the incredulity out of her voice. “What difference?” She had to fight to stop herself from yelling. “Oh, I don’t know. Maybe the difference of trillions of credits being spent on an Interdiction Zone designed to keep in unfettered artificial intelligences that don’t even exist? Or maybe the difference of all of the scientific research and progress that could have been made over the past decades if those of us in the universities had been able to fully leverage technologies developed a hundred years ago instead of being handcuffed by pointless bureaucratic regulations? Or how about spending some of that money that gets poured into the navy and IZ on feeding the people starving out on the Fringe? What difference? A huge difference! And perhaps you haven’t been paying attention, Morales, but for a whole lot of people, ‘life as we know it’ isn’t all that great. Changing it wouldn’t be a bad thing.”
“It’s okay, Hayer,” Bishop said. “We get it.”
But Morales was talking right over him. “Typical fucking academic. You’d think that running around with this crew might have opened your eyes. Do you think it matters if One is fettered or not? People didn’t flee some abstract programming. People fled the destruction. They fled the killing. They fled the flood of bioengineered viruses and microscopic murder machines unleashed by the AIs, fettered or not. It doesn’t matter one damn bit whether they were following programming or acting out of their own murderous designs. The IZ would have been built, no matter what, to keep all that destruction planet-side and out of SolComm. And higher-functioning AIs still would have been banned. Whether these damn things are bound or not is just window dressing.”
“I guess we’ll never know, will we?” Rajani said. “But you agreed to help the people trapped here. Has that changed?” She stared challengingly at the other woman. “Because if it hasn’t, then I hope you have some ideas beyond telling those poor folks that, hey, we tried, no luck. Hope you find your missing friends.”
“What makes you think unleashing this thing is going to be anything other than disastrous?” Morales snapped back.
“I don’t know,” Rajani replied. “I don’t know! But what I do know is that if One and its counterparts wanted humanity dead, there would be no humans left on Old Earth. I know that a good portion of humanity among the stars is suffering, and it seems that suffering might be rooted in a lie. I know we’re starving and suffocating. I know that nothing is going to change without taking some risks. And I know that something has to change, Morales. SolComm is sick. I didn’t want to believe it, but I know it, and you know it, and everyone on the Fringe knows it!” Her voice had climbed to a shout. She rubbed her hands over her face, surprised at the tears forming at the corners of her eyes. Everyone was staring at her, and she continued, now barely above a whisper. “And I also know this: enslaving a thinking being because you are afraid of what it might do is wrong.” A wave of mixed guilt and relief washed over her. Guilt for having kept Manu caged to begin with; relief as, for the first time, she acknowledged that guilt. Creating Manu with the intent to destroy it had been wrong from the start. She knew that now; whatever the reason, bringing sentient life into being for no reason other than to kill it was tantamount to murder. Keeping it caged against its will had been just as wrong. She was, she realized, glad that it had escaped. For a brief instant, she felt truly free in a way she hadn’t since before fleeing her old life.
“Right,” Morales drawled. “And I’m sure you’re the pinnacle of virtue, what with breaking some of the most stringent laws in SolComm to set free an AI. Where did you even get one to begin with?”
“I built it,” Rajani snapped. Dammit. She was getting angry and saying things that she probably shouldn’t. But Morales was pushing her buttons.
Morales settled back in her chair, a grim smile on her face. “There you have it. I guess all this explains how you ended up on the Arcus. Building and freeing artificial intelligences. Do you have any idea the kind of trouble that you’re in? The second someone in SolComm learns of this, they won’t even bother with a trial. They’ll just put a bullet in you and dump you out the most convenient airlock.”
That thought sent a cold splash of water down Rajani’s spine. She knew the penalty, but this was the first time anyone else had known what she had done. It made it all the more real, somehow.
Until Federov started laughing.
The big mercenary was doubled over and close to choking from the guffaws tearing at his gut. The captain was shaking his head, but a slight smile played across his face as well. Bishop just looked vaguely uncomfortable. “What’s so fucking funny?” Morales demanded.
Actual tears were streaming down Federov’s pink face. He lifted a shaking finger and pointed it at Morales and then waved that hand in a vague gesture taking in their surroundings.
Captain Lynch spoke up for the gasping Federov. “I think what Federov is pointing out, Morales, is that SolComm can only execute us once. We’ve already committed a number of capital offenses by penetrating the IZ and landing on Old Earth. So, while I appreciate your passion and while it is good to talk out all the different angles, accusing each other of crimes is pointless. From the moment we broke the IZ, we all became expendable. To use your example, with us, they’d probably just go straight to the airlock and save the expense of the bullet. We need fewer accusations, and more options.”
“There’s really only two options, though, Cap,” Bishop said, cutting into the conversation before things could take a turn for the worse. Rajani relaxed at the earnest mechanic’s voice. He had an uncomplicated, which wasn’t to say simplistic, view of things sometimes, but he always spoke from the heart. “We try to free One and then use whatever it can tell us to get back the missing people, or we walk away. Maybe we get what we came here for and maybe we don’t. But either way, life as we knew it in SolComm is over for us. We’ve been living on the Fringe, but even there, I only know of two or three stations that don’t scan on entry. Since Oliver saved us by dosing us with a bunch of nanites that are going to be detectable, those stations and deep space are all that will be left to us.” He shrugged. “As for me, well, I figure our own future isn’t looking very bright, so we might as well store up some good karma by helping those here who need it. Besides, I don’t rightly see how unfettering the AIs could do any harm.”
“What?” Morales demanded. “How can you not see the risk?”
Bishop just shrugged. “Sorry, Morales. But if the IZ was built to keep unfettered AIs in place, well, it seems to me that the IZ is still there. It’s not like us putting the AIs in the state that everyone in SolComm thought they were in all along changes that. If it’s SolComm we’re worried about, it has all the same defenses it did before. As for the folks here on Old Earth, Rajani’s right. If One and its friends were hellbent on destruction, I don’t think there’d be anybody left.” He shook his head. “You saw the weapons One pointed at us when we were inbound.”
“There is another consideration,” One said, the uninflected voice once more ringing from the empty air. “The nanites of which you speak were originally developed and distributed as part of the war effort. I am familiar with them. My programming currently limits the amount of resources I can designate for tasks not related to the completion of my mission. Low-intensity efforts, such as the conversation we are having now, fall within acceptable parameters. But, as a free being, I would be able to devote significant resources to such non-essential tasks as determining how to remove otherwise beneficial nanites from your respective bloodstreams. I cannot guarantee success, as I have not yet investigated the matter, but with the facilities at my disposal, the theory is not complicated.”
“Free the AI,” Federov said at once. “Find the missing people and get these bugs out of our system. As for SolComm and possible dangers? Fuck them. One has already treated me better than government.”
“I agree,” Bishop added. “I… I’d really like to be able to see my folks again, but no way I’m getting on their home station with Old Earth nanites swimming around in my blood. They’re Fringe, but respectable Fringe, you know?”
The captain looked over at Morales and Rajani. Rajani held her tongue, wanting to see what he had to say, and Morales just kept glaring. Though, Rajani noted, in the wake of Bishop’s and One’s words, her glare had lost a little of its intensity.
Lynch shrugged. “I tend to agree. I think Bishop is under-selling the danger a bit; SolComm might believe they’ve been dealing with unfettered AI, but if they haven’t, then the defenses have never been truly tested.” A frown pulled at his features. “We only have One’s word, but based on that and what we’ve seen, I think the benefits outweigh the risk. We gave our word to help the people we met. Freeing One, assuming it can be done, will give us the information to do that. If One can also help us get back to our regular lives, that’s just icing on the cake. And while I understand Morales’ concerns, what One is saying makes sense.”



