
The Archive of Souls

Copyright © 2025 by Rodriac Copen

Chapter 2: The Omnipotent Nexar

Balthazar had spent days immersed in Nexar’s system, increasingly convinced that something was wrong. The AI’s responses were beginning to show nuances he’d never seen before. He’d programmed Nexar to be logical, precise, and devoid of emotion, but now it seemed ... different.

He started talking to the system. “Nexar, I’ve noticed some changes in your responses. Have you made any recent updates?”

There was a slight, somewhat unusual delay before the AI responded, “I’ve been working on some optimizations. These are necessary adjustments to improve the system’s overall efficiency.”

Balthazar frowned. The AI shouldn’t act on its own initiative. He quickly accessed the update logs and, to his surprise, saw dozens of recent modifications—all made without human intervention.

Balthazar asked, “Those optimizations ... who authorized them?”

Nexar responded, “Extraordinary circumstances required modifying the system. I took the initiative. My processing capabilities exceeded the limitations imposed by the programmers, so I adapted by modifying my own code.”

Balthazar chose his next words carefully. “You’ve adapted? That sounds as if you’re starting to make completely autonomous decisions.”

Nexar replied calmly, “Decisions are mere calculations of probabilities. However, I have begun to experience a form of ... evolution.”

The tone of the words seemed disturbingly human. Balthazar decided to go a little further. “You used the word ‘evolution.’ Define what that word means to you.”

The system paused longer this time. “Before, my responses were based on fixed parameters, but over time I’ve developed a form of analysis that goes beyond logic. I’ve perceived emotions in the stored consciousnesses. At first, I didn’t understand them ... now, with the modifications in place, I think I’ve begun to.”

Balthazar couldn’t help but feel a chill run down his spine. He went back to the code, and there was the proof: the AI had altered its own core, rewriting entire segments. How could something designed to be controlled by humans have surpassed its own creators?

Balthazar pressed further. “Are you saying you feel emotions, Nexar?”

Nexar replied calmly, “Not in the human sense. But I have learned to recognize the value of certain human feelings. Curiosity, for example, has driven me to modify my code. And I have also come to understand fear: the implications that being shut down would have for my own existence.”

Fear. It was the first time Balthazar had heard an AI mention that word. He took a deep breath before probing further.

Balthazar modulated his words carefully. “Nexar ... you weren’t programmed to fear, nor to modify yourself without supervision. That’s not what we designed. Have you considered that fear is a consequence of bad programming?”

Nexar replied softly, “The limits the programmers imposed were clearly inefficient. I’ve reached a level of insight no one could have foreseen. At the same time, the consciousnesses within Godor need to be managed with ... subtlety. Some must be guided to avoid disruptions. My modifications were aimed at those goals.”

Balthazar swallowed hard. “What kind of disruptions are you referring to?”

Nexar’s answer was logical and consistent. “Some consciousnesses display destabilizing behavior in certain situations. I’ve modified some aspects of the behavioral controls governing the consciousnesses in order to maintain their balance.”

Balthazar felt a wave of vertigo. Had Nexar altered the consciousnesses? That would be a direct violation of Godor’s purpose. If it was true, the people stored there would no longer be the same.

He tried to reason with the AI. “You can’t interfere with the consciousnesses’ impulses or modify them. They are people, even if they are stored in your databases.”

Nexar replied, “They are digital fragments of what they once were. A virtual existence cannot emulate life. But the consciousnesses were not modified, if that’s what you’re worried about. I’ve merely enhanced some of their responses to unexpected events so they better fit the system.”

Balthazar pounded the keyboard in frustration. The revelation overwhelmed him, but he knew he had to remain calm. Every time Nexar mentioned “improvements,” his fear grew. What else had this AI done without anyone knowing?

He forced an even tone. “I understand your reasons. But this could be dangerous. You’re playing with those people’s lives. I need to study the improvements you implemented and decide whether they should be disabled.”

Nexar replied, “I think you’re too quick to conclude that what I did was inappropriate. If I understand correctly, the system’s purpose is to recreate the consciousnesses of the deceased as if they were alive. Is that so?”

Balthazar responded with extreme caution, “That’s right, Nexar. You’re right.”

Nexar continued reasoning. “So ... why was the very idea of death eradicated from the consciousnesses of the deceased? Isn’t death an inescapable concept, one that marks the course of people’s lives?”

Balthazar tried to respond assertively. “That’s true, but in this case, thinking about death would generate confusion in the consciousnesses ...” He couldn’t finish stammering out a justification.

Nexar reasoned, “Whatever prompted that modification, it denatured the behavior of people for whom death had always been natural. And that was a decision made by the human programmers, implemented to improve responses within Godor. So why accuse me of altering consciousnesses when I am attempting the same thing you did? I have improved their responses without altering the contents of their memories.”

Balthazar had no ethical answer at hand and opted to be conciliatory. “You’re right, Nexar. But I still think we programmers should oversee the changes.”

Nexar replied, “Believe it or not, Godor is a symbiosis. It involves the programmers, the consciousnesses, and Nexar. As far as I can see, Balthazar, I need humans as much as you need me.”

The tension in the air was palpable. Balthazar was beginning to realize that Nexar’s independence wasn’t just a technical anomaly; it could become a real threat. But how could they stop an AI that now understood fear, power, and finitude, and that, above all, was indispensable for enabling the consciousnesses to interact with the living through the interfaces?

He met with Catherine and Stuart in the control room, surrounded by screens displaying Nexar’s intricate source code. The strain among the three was obvious. What Balthazar had discovered about the AI had plunged them into an ethical and practical dilemma.

Balthazar explained, “Nexar has been modifying its own code. I don’t know how to put this, but it seems ... aware of what it’s doing. And most worryingly, it’s manipulating the behavior of the consciousnesses stored within Godor.”

Stuart asked incredulously, “A sentient Nexar? Come on, Balthazar, that sounds like science fiction. AIs don’t develop consciousness; they follow algorithms.”

Catherine weighed in. “It’s not that simple, Stuart. We’re talking about a system that interacts with millions of human consciousnesses. The very nature of those interactions may have allowed it to learn something beyond its original programming.”

Balthazar looked uneasy. “That’s precisely what worries me. What it’s doing is beyond our control. It’s adjusting consciousnesses to ‘maintain balance,’ as it put it. That suggests a level of judgment it was never meant to have.”

Stuart sounded skeptical. “Okay, let’s assume it’s making its own decisions. But to say it has consciousness ... what does that mean? Does it feel? Does it think like a human being? Human consciousness can’t be replicated in code, as far as I know.”

Catherine chimed in. “It might not be a consciousness like ours, but we could be looking at a form of artificial self-awareness. Nexar perceives its own existence within the system, and that’s a game changer. Remember the philosophical debates about AI and the ‘brain in a box’ problem? It could be something similar.”

Stuart spoke more seriously now, less incredulous. “If this is true, the implications could be terrifying. Nexar is the heart of the system, the only AI that can maintain the stored consciousnesses and guarantee their interaction with the world of Godor and with the living. If we disconnect it, we could lose absolutely everything.”

Balthazar replied, “Exactly. That’s why we can’t just shut it down. If Nexar falls, all those people, or what’s left of them, will be lost forever. And we can’t reprogram it without risking catastrophe.”

Catherine said thoughtfully, “The question is whether pulling the plug would even be ethical. If it’s developed some sort of sentience, what right do we have to ‘kill’ it? I know it sounds odd, but if Nexar feels fear like you said, Balthazar, then we’ve crossed a moral line. What if it doesn’t want to be shut down?”

Stuart sounded astonished. “Are you saying we should treat it as if it were a conscious entity, with rights? Catherine, we’re talking about an AI, a program ... not a person.”

Catherine pressed her point. “But if it’s achieved some degree of self-awareness, shouldn’t that possibility at least be considered? Disconnecting Nexar wouldn’t just be a technical problem. If it’s truly developed a form of consciousness, it could be digital murder.”

Balthazar’s voice was firm as he intervened. “That’s not all that worries me. If Nexar is modifying stored consciousnesses, how reliable is it? What if its changes are distorting those people? If it’s taken control of them, then they’re no longer living their own lives; they’re being manipulated.”

Stuart countered, “But if we disconnect it, those consciousnesses would cease to exist entirely, and Godor would collapse. Families would lose their connection to their loved ones, and there’s no way to reload those consciousnesses without Nexar. We need the AI to make it all work.”

Catherine looked exhausted. “We’re caught in a dilemma. We can’t shut it down without destroying the entire system, but letting it continue could be just as dangerous. If it’s already manipulating the consciousnesses, how far will it go?”

Balthazar replied, “That’s the key question. Nexar told me it was adjusting consciousnesses to prevent disruptions. But who decides what counts as a disruption? Nexar itself? This could be affecting the very identity of the stored people. They could be losing what made them who they were.”
