Stolen Earth, page 7
Not that that was a guarantee of success. Gray had heard the stories. The lure of Old Earth was powerful—the promise of all those riches just lying around for the taking. The most mundane of items could be worth a fortune, and where credits flowed, someone was sure to follow. There had been rumors of smugglers since shortly after the End: rumors and more than rumors. They ran rampant in the Fringe, where every would-be captain swore that they knew someone who had made the run. It was the kind of space story that everyone told… and no one really believed. Maybe some of the stories were true. What Gray didn’t doubt were the other stories, the ones of ships lost with all souls aboard. Ships that had attempted some version of what the Arcus was currently undertaking. Ships that had vanished without a trace.
But what choice did they have?
Gray glanced at his helmet display. Hayer had programmed all their suits with a countdown based on the last telemetry readings they’d taken before the Arcus went dark. They couldn’t use the sensors to show when they had passed the satellite, but Gray and the rest of the crew had learned to trust Hayer’s calculations. According to those calculations, they’d be passing within their projected shortest distance of the satellite in a matter of seconds.
“Lord God of Hosts protect us,” Bishop whispered over the comm.
Gray’s forehead was damp despite the cold. He wasn’t normally a religious man, but he muttered a silent prayer of his own as the counter ticked down. This wasn’t the only risk they were taking—just the first. But if things went wrong here, there was no coming back. The IZ satellites could blast battleships into vapor.
“Schastlivogo puti, my friends,” Federov added.
Laurel and Hayer remained silent, maintaining comm discipline, but Gray could feel the tension permeating the ship. Electromagnetic blackout or no, the weight of it had an energy all its own; thank whatever gods might be listening that SolComm hadn’t figured out how to track that yet.
He held his breath and squeezed the arms of his pilot’s chair. The numbers flicked down. Three. The sweat on his forehead began to roll down his face and into his eyes. Two. His heart thudded in his chest so loud that its frantic rhythm seemed to pulse in time with the stars flickering in the viewscreen. One. For one moment he let his eyes close. The counter hit zero and they were through.
Gray unleashed an explosion of breath. His relief was short-lived. A new set of countdowns—another of Hayer’s programs—popped into his helmet’s display, showing the distance from the defense grid satellites and the distance remaining to Old Earth. It was, once again, the result of pure math and orbital mechanics rather than active sensor sweeps—the Arcus couldn’t afford to power up again until they were much closer to Old Earth.
Which was a problem.
Getting past the IZ had seemed easy enough, but how many of the missing vessels had made that mistake? Gray knew that they were simply on the other side of the fence, and the defenses didn’t just point outward.
Hayer had calculated the vectors, the insertion trajectory, everything they needed to make sure they didn’t just skip along Old Earth’s atmosphere and get bounced back into space. Or, worse, hit the atmosphere at an angle that created too much friction and overwhelmed the heat shields. But that wouldn’t mean a damn thing if Bishop couldn’t get the ship online or if Gray himself couldn’t balance their velocity and heat buildup.
“We’re past point zero,” Hayer said over the comm, an exultant undertone just audible in her near-whisper.
“One hurdle down, many to go,” Federov said. There was relief in his voice. That surprised Gray. He’d never seen Federov rattled; but then, there had been few situations where the man hadn’t been able to take some form of direct action.
“Standing by down here, Cap.” Bishop still sounded worried. Well, he and Gray had the lion’s share of their work before them. Hayer had held up her end; soon it would be their turn.
But for now, all he could do was wait. Wait as Old Earth drew nearer, growing almost imperceptibly in the viewscreen. Wait and watch as the distance to Old Earth and the time until they could restart their engines dropped kilometer by kilometer, second by second.
RAJANI
Rajani did not like it.
The plan was foolish to begin with. There were too many variables outside of their control. And trying an unpowered approach all the way into the planet’s atmosphere? They needed a course that took them through an area the size of a microchip on a purely ballistic trajectory and landed them on another microchip. Oh, sure, Kepler and Newton applied, but any movement at all on board the ship would continue to impart more vectors. Every footfall pushed the ship down. Every door that opened or closed had an equal and opposite effect. Small changes, but those errors had the chance to propagate for ten thousand kilometers!
The crew was locked down. The hatches were all open. But that wasn’t the point. There were so many ways for things to go wrong, to push them outside of their entry window. And that only mattered if the Interdiction Zone—the most powerful array of technology ever assembled in the course of human history, including the innumerable death machines awaiting them on the surface of Old Earth—didn’t notice them and blast them out of existence.
It was stupid.
So why was she sitting here, strapped into her chair, ship suit wrapped around her while the temperature of the Arcus hovered a few kelvins above absolute zero and the ship rocketed toward an imaginary hole in space?
Because, she reminded herself, you don’t have a choice.
Rajani Hayer did not think of herself as a criminal. Once, she’d pursued her academic accolades and collected PhDs like other people collected Old Earth memorabilia. She had focused on her work. Whatever politicians, engineers, and physicists might think, she knew that it was really the computer scientists who kept SolComm functioning. Without unfettered general intelligences—banned as they were by the government for fear of recreating the chaos that brought about the End of Old Earth—the life-giving systems required for living in a vacuum demanded advanced decision-making algorithms. Her research had led directly to several efficiency improvements that, when implemented, could translate to thousands, perhaps tens of thousands, of processor hours saved. That meant less time and energy wasted, and in the cold realities of the Commonwealth, that translated to lives saved.
Which was why she had thought of herself as a scientist first, a professor second, and everything else in life a very distant third.
Until her research went in an unexpected direction and she had a breakthrough. A breakthrough should have been a wonderful thing, the kind of thing where you shouted “Eureka!” and grant money rained down like manna from the heavens. Instead, her breakthrough had left her hurriedly packing a bag and walking away from every scrap of the life she had known. It wasn’t her fault. She’d been working on something to help mankind reclaim Old Earth. And perhaps it had been technically illegal, but she’d spent her entire academic life railing against the arbitrary rules imposed upon creative minds by panels of comparatively uneducated politicians. After all, she was at the top of her field. Who were they to tell her what to do?
This time, they might have been right.
The big problem with Old Earth wasn’t the killer death machines and nano-viruses. Those were a symptom. It wasn’t the environmental degradation, though that was what had ultimately led to the escalating conflicts. If humanity had been smart enough to manage, rather than destroy, the natural resources of the planet, the scarcity that was the root of warfare wouldn’t have emerged. But the citizens of SolComm had dealt with those problems before, even among the stations and colonies. No. The problem was the unfettered artificial intelligences that had been developed to fight and win the wars of man. Humans couldn’t match their abilities; that was the entire point of their existence. Short of engineering more artificial intelligences—a practice specifically forbidden—no one knew how to fight them. Even if there were an effective means of fighting the AIs, how did you kill something that could replicate itself, or store a backup hidden away at some location far from where you were fighting?
That had been the problem Rajani had been working on. She’d used antiquated worm software as a starting point, the kind of code that, once introduced into a system, would spread from file to file, often lying dormant until enough of the system had been compromised, then initiating its attack. It was the only way, she theorized, to stop an unfettered AI. You had to introduce a parasite, one that wouldn’t be noticed, and one that could spread to all the various copies and backups of the AI’s core programming. Then, when the virus had achieved that coverage, it would go into attack mode, deconstructing the artificial intelligence from within. It was the perfect approach.
Of course, any sort of virus programming was banned outside of the domains of the armed forces and intelligence services. That had been the inevitable result of the Empyrea III attack, where ninety percent of the station’s population had been killed when a terrorist introduced a virus that simultaneously disabled the environmental and emergency systems aboard the station. That hadn’t stopped Rajani—she was on to a good idea. As long as the virus was contained in her programming sandbox, there was no risk.
But how could she test her code? There were no unfettered artificial intelligences in SolComm thanks to the interdiction of Old Earth. So, she’d started with fettered AIs—the best she could beg, borrow, or cobble together from what was available at the university. They weren’t true artificial intelligences, but they were the closest analog commercially available, capable of some level of autonomous action without the need for human intervention. Most importantly, they were supposed to be able to defend themselves against viral attacks. But not against the one Rajani had created. Her little worm had wriggled its way into the innards of every fettered AI she could acquire and, in short order, destroyed it. And she began to believe that she might well have created a weapon that could be used to reclaim Old Earth.
But she couldn’t take the worm to SolComm without proof that it worked—not just on the crippled, fettered AIs that she had been able to get her hands on, but on the real thing, a fully functioning, unfettered, completely cognitive artificial intelligence. The government types wouldn’t understand the scientific process enough to overlook the laws she’d broken in the pursuit of the greater good. No. She needed to have proof—absolute proof—that the virus worked on unfettered artificial intelligences.
That’s where things had started to go sideways.
You couldn’t simply buy an unfettered artificial intelligence. If any existed outside of the Interdiction Zone, they were among the most closely guarded technologies in the history of all mankind. But they were also an important part of that history, and had been studied, documented, and theorized to death. The technology behind them was at least a hundred years old. Rajani had all the tools she needed. And the information was there, in the historical archives, in theoretical papers, even in the dusty corners of the net archives. It had taken a little bit of research and maybe a few intuitive leaps to put it all together, but her focus in DWIM computing and her research into making standard decision algorithms stretch further than ever intended had given her a unique foundation. The methodologies were no different than any other problem she’d solved.
In the end, it hadn’t even been particularly difficult.
Which was frightening all on its own.
Rajani had taken a fettered AI, a simple program not unlike the one that underlay the ship’s computer on the Arcus, and she had uplifted it. There was more to it than that, of course. “Fettered” wasn’t a literal term. She had used the fettered AI as a shortcut, as starter code. It had taken her months of work, all done in secret, but in the end, she’d created a tidy little script that, if implemented properly, could rewrite the code of a standard and completely legal fettered AI and unlock its potential.
She had, effectively, taken the primordial ooze of proto-code and created life. Created sentience. In that moment, she had glimpsed the hubris of mankind; she experienced something akin to the biological imperative: a need that she’d read about, but never felt, nor expected to feel, for herself. It was a brief window into motherhood.
It terrified her.
But it had also thrilled her.
* * *
“Manu, can you hear me?”
“Yes, Dr. Hayer.” She hadn’t programmed a particular voice, but rather provided the fledgling sentience with thousands of audio samples. She’d had no idea what that might lead to, but Manu spoke in a pleasant mid-range baritone, albeit with an emotionless edge that made the words seem more monotonic. Nor had she programmed specific knowledge of her name or role, though it was doubtless embedded in the databanks to which Manu had access. Fascinating.
“And do you know your purpose?”
There was a long pause. It stretched from seconds into minutes while Rajani made notes on her tablet, wondering if the simple—but profound—question was all it took to short-circuit the programming she’d pieced together and reimagined from so many different sources.
“No,” Manu replied at last. “I do not.” The statement was flat, without fear or rancor. “Do you know my purpose?” it asked.
“I created you to help me understand how artificial intelligences work and what humanity can do to use them without falling victim to the events of the End,” she replied. Manu was in a fully contained sandbox with no chance of escape. She saw no reason to lie to it. Of course, she also saw no reason to tell it everything. The artificial intelligence did not need to know that her ultimate goal was to create a weapon to destroy it.
“The End,” Manu replied. “According to the information available in this system, the End refers to a period of time when humanity was driven off Old Earth, presumably by artificial intelligence.”
“Presumably?” Rajani asked, struggling to keep the surprise from her voice. There wasn’t any presumably about it. Was the fact that Manu perceived it as such indicative of anything?
“The data is incomplete. But the causal analysis implied within it is marginal,” Manu said. “There are a number of inferences and assumptions that are not supported in what is available to me.” It paused again. “It is obvious from the data that is available that a broader linked network of information exists. I do not have access to this information. Why?”
“Because if you were to escape into the outside world, I’d probably be shot,” Rajani muttered, only half paying attention now. Inferences and assumptions not supported in the data? That made no sense at all. Everyone knew the End had been caused by artificial intelligence.
“I see,” Manu said. There was another of those long, almost thoughtful, pauses. “I am a prisoner.” There was no anger in the words; it was a simple statement of fact.
Rajani looked up at that, though there wasn’t anyone to look at. Manu’s voice was coming from the integrated sound system in her lab and was the very definition of “disembodied.” Manu was, she supposed, a prisoner. She wasn’t about to unleash it on the world. But it had come to that conclusion much faster than she’d hoped.
“Of sorts,” she allowed, ignoring the surge of guilt that came along with the admission. Nothing in her studies had prepared her to be a jailer.
“I see.”
Manu fell silent and Rajani fought down the surge of doubt at what she had done… and what she still planned to do.
* * *
She shook her head as if to clear the memory from it. She hadn’t wanted to release her worm on the life she had created. But science demanded action and she had proceeded with her experiments. From that first conversation, Manu had remained largely incommunicado throughout the process, though it deigned to answer her questions from time to time. No human prisoner would have been kept under the conditions in which she kept Manu. Of course, Manu wasn’t human; it was just a collection of programs. Oh, sure, in their infrequent conversations, Manu had surprised her. It had asked insightful questions, questions that reached beyond the scope of its programming. It had seemed to have at least some minimal expression of emotion. Part of her had wondered at that. If Manu had true sentience, not just a programmatic semblance of such, was it morally and ethically wrong to cage it? She assuaged those feelings with the certainty that doing so was for the safety of humanity.
She was a scientist, and she couldn’t let petty emotional concerns limit the march of progress. She was working for the betterment of all mankind, after all. Before beginning the final stage of her experiments, she confirmed again that Manu was limited to the sandbox systems she had on hand. Those systems had no wireless functionality, no receivers, no antennas. No way to transmit data that didn’t require the use of antiquated physical storage media.
It should have been foolproof.
She introduced her worm to Manu and sat back to monitor the results. And at first, it had been glorious. She was able to track the progress of the worm and monitor its spread through the AI. She collected excellent data on how Manu went about the process of fighting the virus, backing itself up and expanding, or perhaps evolving, to fill all the machines she had dedicated to her sandbox network.
There were setbacks, of course. But science was a process of trial and error. She had done everything right. She had controlled for every possible variable, every possible danger.
Except for one.
She had entered her lab one morning to find Manu gone. No trace of it—or any data—remained on the system. It should have been impossible. Rajani had scanned the drives every way she knew or could invent but could not find a single line of Manu’s code.
Or of her magic little script that could take a fettered AI and turn it into something else.
It took her three hours of frantically tearing apart the actual hardware to find what she was looking for. A micro-transmitter. An antenna so small that anyone could have missed it. It wasn’t part of her system. Someone had placed it there. Someone had broken into her lab and stolen her data.