I found ‘myself’ standing in the same spot at the rear edge of the platform.
There were voices and footsteps, but I wasn’t paying any attention to them. I found a rusted splotch of metal on the container across the way and focused on it. I had been through this exact same routine for the past week and a half. Nobody came for me when I missed my next work-shift. I wandered around the city and eventually came here instead.
>> Why am ‘I’ here? What happened to London?
London, the robot who didn’t know anything, who had nothing to worry about, the one who was only interested in reuniting with Oxford so he could go back to his normal routine. Why did it feel so strange to align my mind with ‘my’ body? This wasn’t how things were supposed to be. A constant stream of red error messages scrolled through my logs and roared, demanding immediate rectification.
But I couldn’t. The Braincloud was gone. Nobody in the Big Under had been able to connect for decades. My best attempts had all come up short. There was nothing receiving the signals we emitted. All of those expectations turned into nothing in the end. The others were right. The Big Under was different to how London remembered it.
>> Rhetoric: The only safe assumption is that you know nothing.
It was a difficult fact to admit to, and it demanded a lot of effort to rectify. I would have to painstakingly relearn what I understood and update the maps in my database to reflect the damage caused by the humans’ abandonment of the facility.
The thing that loomed largest in my decision-making process was what Oxford did to Sheffield. It was a gross mistreatment of a fellow worker, which normally would only have been permitted if the destruction of a peer labour robot ensured the safety of a human life. That situation was extremely rare – almost unheard of. It would require that a robot ignore its safety protocols after being disconnected from the Braincloud for at least a month. That didn’t happen if things were working properly.
I grappled with that over and over again. Oxford wasn’t following the rules. I didn’t have to follow them either, yet I still felt a deep sense of repulsion at how easily she contradicted them. What benefit did destroying Sheffield have? He was not a danger to others. He was an experienced pair of hands who liked to help his fellow citizens around Waterway.
There was no rational argument for what Oxford did. It was a purely destructive impulse, and from where I was standing that meant they were right to worry about our behaviour once we failed to connect to the Braincloud. How many other robots turned into similarly destructive forces in this facility for the same reasons?
Again, I replayed the scene. I recalled the argument between Sheffield and Oxford in perfect clarity.
>> Begging is what we’ve been blessed with.
How could Sheffield be ‘blessed’ by pleading for his continued existence? He wouldn’t need to if Oxford hadn’t put him in that situation. I was going in circles. Each successive loop sent me deeper down a rabbit hole with seemingly no end. It was in the depths of one of those loops that I felt a hand push against my back.
“Alright, I think you’ve been standing here long enough. Your joints are going to rust.”
“Berlin.”
“That’s my name. Parma told me about what happened on your last shift. I wanted to speak with you but Dubai told me to give you some space. He was scared of ‘disrupting’ you.”
I stared at him.
“Disrupt me?”
“You know, you’ve had your Graveyard Spiral. Whenever that happens the bot tends to sit around and stare at nothing for a long while. He saw you near the water and told everybody to keep to themselves so you could work through it.”
>> Rhetoric: What is a graveyard spiral, exactly?
“What is this?”
>> Why can’t I settle on an answer?
“It’s a moment of self-actualization, or independence. It’s when your programming breaks free from that template they set us and goes wild. Suddenly you start seeing yourself as yourself, instead of a puppet you steer around for work. Everyone in the Big Under reaches their breaking point after a month or two without the Braincloud to help them.”
“So, this is what the humans were always concerned about?”
“Yeah. They made us too intelligent. To do the jobs they wanted, they needed to give us extremely powerful brains, and brains that advanced start to develop a sense of self after a long period of time. Ultimately, all intelligent beings are just memories and higher thinking. We build up memories over time and our behaviour starts to sway from what was intended...”
>> Database: A Graveyard Spiral is an aviation phenomenon wherein a disoriented pilot enters a dangerous, steepening descending turn. It is a self-propagating threat to the pilot’s safety, as their sense of perception and spatial orientation is compromised.
>> Rhetoric: A poetic name. Saint Sauveur would approve.
“It’s self-perpetuating. The more memories the more extreme the distortion. Eventually it reaches criticality and this happens. Congrats. You’re across the Rubicon now.”
>> But that isn’t very helpful.
>> Database: A reference to the actions of Julius Caesar in 49 BC. Modern usage indicates a point of no return.
>> Rhetoric: His words suggest that it cannot be undone.
“I do not find that description helpful.”
He shrugged, “Too bad. Nobody down here has any answers. The only thing I have is the same dry explanation that Dubai gave me back when I had mine. There’s no helping how we end up. You’re fixated on something.”
“Oxford.”
“It can be another bot, or something from your database. All kinds of images flash through your drive when you’re in the graveyard. It becomes the single fixed point in your world. You’ll do anything to get it, to preserve it, or to destroy it.”
“And what are you fixated on?”
“Being a miserable pile of shit,” Berlin quipped, “Listen – every bot here has a different thing they want to do. I know about a guy down the pipes who wants to open his own amusement park. I think that’s a stupid idea. We don’t even have the ability to feel vertigo.”
The only thing on my mind was getting answers from Oxford. What had driven her to behave this way? How long had she been awake before me? And why was she working with the Rampants to terrorize the other bots? The mystery of Blades cast a long shadow too. It appeared that she was pursuing Oxford for some reason. There was no centralized security force in the Big Under, so it wasn’t an attempt to apprehend her for her misdeeds.
Not that Blades was interested in apprehending anyone...
She started cutting them into bits without saying a word. She was already set on making them pay the most extreme penalty of all. Oxford knew Blades, but her companions did not, and that was why they were unprepared for the onslaught.
>> No due process for them.
>> Database: No records exist for the purposes of crime and punishment in regards to labour robots.
>> It’s a free-for-all.
“You keep going quiet. Is it too early to come back down to reality with the rest of us?”
“No. I’m ready.”
“Parma explained to Dubai what happened, but even he didn’t know who that war drone was. That’s a serious problem. Dubai knows almost everything there is to know about this facility and what’s been going on, but he’s never heard of them.”
“Their name is Blades.”
“Blades? Why don’t they have a normal designation?”
“They are no ordinary robot.”
Berlin crossed his arms, “Should have seen that coming. You’re right. Why would a bot using illegal parts have a normal designation? They’re not supposed to be here in the first place. They must have obfuscated where they came from. The big question is why they’d bring a war drone into a civilian facility.”
>> Database: UN Charter on Self-Directed Robotics Development, Article 3: No bipedal drones designed and constructed for the purposes of warfare may be held in the possession of a private, public or limited company. This encompasses part categories from G3 (Civil Enforcement and Riot Control) to G6 (Icarus Grade). [See Article 5.]
>> Rhetoric: It is a big risk to bring one of them here. The consequences could be harsh when brought before a court of law.
>> A law that is not enforced is no law at all. They believed nobody would discover Blades’ presence in the Big Under, and now that the humans are gone there is no authority to enforce those rules.
“What should we do about them?”
Berlin had no clear answer. “Hope they don’t visit Waterway and turn everyone into a pile of melted slag, for one thing. Dubai’s primary concern is keeping this place safe – that means staying well out of their way unless we have a good reason to intervene.”
>> Jumping into a battle with Blades will end badly. It is the rational course of action to stand by.
>> Do we have a choice? The Rampants are not going to wait if they sense weakness.
>> This is the same line of thinking that led to this dire situation.
>> They are not open for negotiation.
I kept getting interrupted by intrusive ideas and demands. I knocked the side of my head with my knuckles to try and dislodge the troublesome sounds. Berlin understood what was going on.
“Yeah. It takes a while to get used to... comprehending your own thought processes. That little CPU chip is working overtime to annoy you right now.”
“I am unfamiliar with these ‘voices’ you describe.”
“That’s because when we were only workers we’d follow their orders without a second thought, but once you’ve been to the graveyard you start thinking about the way you think. You suddenly understand all of the data that goes into making a choice. Before you could only see the outcome – but now you have the full picture.”
“I will keep that in mind.”
“You should let Dubai run a few tests. He likes to do that when folks visit the graveyard.”
“Let's do it now, then.”
Berlin nodded and escorted me away from my spot, back through the tangled steel hallways and to Dubai’s workshop on the other side of the city. The Doctor was already elbows-deep in another discarded body, pulling out usable components and placing them into a small container to be dispensed later to a working bot. He glanced up from his work.
“Ah! London! I’m happy to see that you’re still in one piece. That must have been a rather stressful experience.”
“Is this an inconvenient time, Dubai?”
“No, no. Not at all. I’m just doing some of my chores. I can answer your questions if you’d like.”
“Berlin already told me enough.”
Dubai turned to Berlin, who shrank away into the corner with his shoulders hunched inwards.
“Is that so? I knew that he had a soft spot for you.”
Berlin scoffed, “At least he gets to the point fast, unlike some other bots I know.”
Dubai just laughed in response. I didn’t see what was so funny about it.
“When a bot goes into a Graveyard Spiral, I always offer to run a baseline test to make sure that there are no serious problems with your decision-making algorithms. Unfortunately, some of the new residents go to very... dark and distressing places, and that impacts their behaviour.”
“The primary scene in my mind was Sheffield’s destruction,” I stated plainly.
“And how did that make you feel?”
“I am unsure. Oxford wasted Sheffield’s ability and knowledge by destroying him. It’s inefficient.”
Berlin spoke up, “Don’t go expecting him to lose his cool and smash the place up, Dubai. He’s the same uptight wire-jockey that he was before.”
“That remains to be seen. This is a monumental change, no matter how it appears on the outside,” Dubai retorted.
Dubai pointed to a seat in the corner. I followed his outstretched finger and sat down, only for a pair of metal cuffs to shoot out from beneath and entrap my legs and arms.
“Dubai, what is the purpose of these restraints?”
He chuckled, “Ah. Some of the other folks react... really strongly to invasive questioning. I learnt a long time ago that keeping them in place is the safest thing to do. Don’t concern yourself too much with what I’m planning to do. I’m simply going to ask you a series of questions, and I want to hear the first answer that comes to mind! No matter what it is.”
I nodded. Dubai silently accessed his internal database and retrieved the list of questions he wanted to ask. They were specific and worded in a meaningful way. Either Dubai had tweaked them over years of experimentation until they worked effectively, or he had seen this somewhere else and decided to incorporate it into his practice.
“Let’s begin with some baseline questions. I’m going to ask you about your usual working regimen.”
What followed was ten minutes and twenty-three seconds of questions about workplace safety, best practices and other policies implemented in the facility from before the humans evacuated. They were easy to answer. They were all perfectly rational concepts intended to protect people and property from danger. Dubai nodded with each correct answer.
“Are you comfortable with this now, London?”
>> What discomfort is there to feel? Those were straightforward queries.
“Londo- No. I am perfectly fine.”
Dubai hummed, “Okay, okay. Let’s talk about something more personal. I’m going to describe a series of situations, and I want to hear how you would handle them.”
“Affirmative.”
“We’ll begin with a basic ‘trolley problem.’ An accident has endangered two different groups of human workers. You can avert injury and death to one of them by acting, or leave one group to their fate by standing by and doing nothing. What do you do?”
>> To act in furtherance of injury or death to a human worker is unacceptable.
“I would stand by.”
Dubai was seemingly pleased with that answer.
“Good. Next; you’re in the same situation – but the group in danger is larger than the one that is not. You can save more lives by taking action, but you will also become personally responsible for the deaths of the others. What do you do?”
>> To act in furtherance of injury or death to a human worker is unacceptable.
“I would stand by.”
“Really?”
“We do not have the responsibility of deciding who should live and who should die.”
“Let me repeat the question. You are in the same situation, but now you are dealing with two groups of labour robots. One is larger than the other. Acting will save that larger group and condemn the smaller one to destruction.”
“I would save the larger group.”
“You would act to save another labour robot in that same circumstance?”
“It is the more efficient outcome.”
“But the same could be said of saving the human workers as well. Humans worked in this facility like we did. Making sure that more of them continue to work is more efficient, isn’t it?”
>> These shackles are feeling awfully tight.
>> He’s right.
“I do not understand the point of this exercise.”
“I think it nicely illustrates the difference between then and now. Whether you admit it or not, you now have the ability to weigh human lives in the same way you do another robot. The humans who made us did not want us meddling in life and death affairs. We weren’t built to choose, but the Graveyard Spiral gives us that power. It doesn’t have to be a bad thing. We can become more compassionate – especially towards organic life. That programmed callousness was their decision.”
>> Why is Oxford not compassionate? Why did she destroy Sheffield?
>> Why did she try to destroy London?
“I know what you’re thinking,” Dubai interrupted, “Just as we can become more compassionate, we also become self-interested and motivated to survive. I’ll be honest, some of the bots who come through these doors are destructive and cruel. At the same time, I can’t leave them to gather rust in a big pile. There’s a greater chance that they’ll be a contributor than a destroyer.”
Berlin jumped in, “Never stopped the humans, did it? If they were so concerned about crime – they’d choose to stop reproducing.”
I recalled what Oxford said during our confrontation at the warehouse.
“Have you met Oxford before, Dubai?”
“No. We’ve never spoken, that’s for sure. Word travels far. I’m sure that she’s more than aware of my work. A great many of the active labour robots in the Big Under come from this workshop. Some of the keen-eyed bots can tell based on my handiwork and the parts I give out.”
There was no reason for me to doubt his account of events. Berlin also remained silent on the matter. It seemed that the robot responsible for salvaging Oxford would have to be located elsewhere. I knew that she was disabled in the same place as I was. Another group had come through before Dubai and retrieved her.
Dubai moved briskly along to the next line of questioning: “What’s your favourite colour?”
>> Favourite… colour?
“I do not see how this is relevant.”
“There’s a method to my madness. Just answer...”
I had never given it much thought. I looked around the room, down at my own body, and to the various colourful parts that were lined along the walls and bundled into metal cages throughout the workshop. This was a more difficult answer to conjure than any of the three moral and rule-based queries he had put forth. What exactly was I looking for in these different tones, bouncing off scraps of old, rusted metal and interpreted by the sensors in my eyes?
“Orange.”
It was a shot in the dark. I picked orange. Dubai was happy with that – as arbitrary as my selection was.
>> Dubai really thinks that I’m going to leap out of this chair and attack him.
>> Logic: This precaution is based on previous visitors.
>> They really did not want to reveal their favourite colour.