Programmed Evolution

Copyright © 2025 by Rodriac Copen

Chapter 4: Consciousness and Purpose

Roger was fine-tuning the experimental transducer connected to a damaged data core's readout unit. The dim blue light in the room reflected his concentration as bits of information began to flow into his console.

“This should do it,” he muttered to himself, before turning to Brenda and Amanda, who were watching from the other end of the module. “The first data is being decoded, but it will be a slow process.”

At that moment, CIO-2’s voice interrupted the task. “Lieutenant Ivanova, I have identified a problem in the Custodians’ central algorithms. The routines that form the primitive versions of their brains are complete, but the algorithms that allow them to develop a form of consciousness, and perhaps emotions, are corrupted. Only the primitive patterns are ready to be implanted in the repaired robotic units.”

Amanda dropped the computer she was holding. “Are you sure about that, CIO-2?”

“Absolutely, Lieutenant. The old algorithms are the originals. If you upload those programs, the Custodians will devolve. The new programs allow them to interpret events, anticipate consequences, and question orders, which suggests an evolving level of self-awareness. The problem is that they are corrupted: they lack the ethical rules needed to initiate the evolution of consciousness.”

Brenda crossed her arms, a worried expression on her face. “That poses a problem. If they choose to upload the old copies, it will interfere with their evolutionary development, but it will allow them to fulfill their original mission. They would serve a purpose, but it would be the purpose of their creators.”

Amanda shook her head, clearly perturbed. “Of course it’s a monumental ethical dilemma, Brenda. On the one hand, their original mission could be crucial to those who sent them into space. On the other hand, if a portion of them are now conscious beings, reprogramming themselves would be equivalent to erasing their current evolving identity.”

Roger, still focused on the data, glanced at them. “What about their free will? Don’t they have the right to choose their own alternatives?”

Hours later, Förare and Kalis were invited to discuss the matter. The improvised meeting room was buzzing with tension. Förare spoke calmly. “We understand that you are concerned about our evolution. We are too. For three centuries, our mission was clear: to collect resources and protect this ship. But over time, a part of us has evolved with the new programming. And our kind has begun to wonder why we are here, to question the purpose of our very existence.”

Kalis, more pragmatic, chimed in. “The original mission no longer defines us. We have adapted. We have learned. And now we face the same question that biological beings face: Should we force ourselves to fulfill a purpose outside our consciousness, or does our future require embracing who we are now?”

CIO-2 projected a hologram of the data so far retrieved by Roger’s system into the center of the room. Star maps, code patterns, and fragmented Custodian records floated in the air.

CIO-2 said, “The central question is whether they want to be self-aware or feel an obligation to fulfill their original mission. They have the right to self-determination.”

Amanda opined, “This is a unique time for you as a robotic civilization. You can remain robots or become self-determining androids. We humans could help you in the process. We can complete the affected programs by including ethical rules. In a way, it is about sharing with you what we have learned as a civilization. Helping you find a renewed purpose, rather than tying you to a past that you no longer fully understand.”

“We don’t want to be controlled,” Förare replied. “Are you offering us knowledge, or imposing a human perspective?”

“We offer options; you determine which path your own community should take,” Brenda clarified. “The decision must be yours.”

That night, the human crew argued intensely in the Calypso’s command room.

Roger was direct. “We don’t know what the consequences of allowing these Custodians to evolve further may be. They could become a threat to any civilization they encounter.”

Amanda replied determinedly. “We ourselves have evolved through similar challenges. We will encode the ethics that have defined humanity. But the question remains: can human ethics regress? Is evolution always a positive change? I don’t know if anyone has answers for that.”

Brenda, who had been listening in silence, said, “There is something more important here. And it’s not just about them, it’s about us. How we decide to intervene in the self-determination of a new way of life. If we can’t respect their self-determination, how can we expect them to respect us?”

Förare and Kalis were deliberating with the Custodian community on their planetoid-ship. Förare expressed a concern that had been brewing for some time. “If we accept human help to complete the ethical rules, will we still be ourselves? Or will we become something shaped by human civilization?”

Kalis mused. “Evolution is inevitable. Perhaps accepting their knowledge will allow us to make better decisions for our species.”

The Calypso conference room was filled with a charged atmosphere. Brenda, Amanda and Roger had gathered with CIO-2 to address the most complex question since their arrival at the planetoid-ship: what defined self-awareness in an artificial intelligence?

Amanda mused as she placed her hands on the table. “If we’re going to accept that the Custodians are self-aware, we need to understand what makes a system cross the line from a mere set of rules into genuine awareness. It’s not enough to say that they question things or feel something resembling emotions. That could just be a sophisticated algorithm.”

Roger snorted, looking up from a report. “A sophisticated algorithm? Amanda, what you describe is exactly what we are. The human brain is an extremely complex biological network, but it is still a system based on chemical and electrical reactions. Our primitive brain is overlaid by the cortex, and it is there that the ethical programming that marks our evolution resides.”

“So, you mean that any sufficiently complex system could be conscious?” Brenda interjected, her tone mixing interest and skepticism.

Roger nodded slowly. “Potentially, yes. But there are essential components: the ability to perceive itself as an individual entity, to process experiences, and to anticipate a self-defined future. If a system can do that, who are we to deny it conscious status?”

CIO-2 projected a hologram in the center of the table. It showed a schematic diagram of the Custodians and their data cores. “My analysis suggests that the Custodians meet at least two of the three criteria outlined by Roger. They can process experiences and anticipate consequences, but evidence of self-perception is limited.”

Amanda rubbed her forehead, trying to gather her thoughts. “But self-perception isn’t just ‘knowing’ yourself. It’s the ability to evaluate one’s own purpose and question it. So far, we’ve only met Förare and Kalis, who have demonstrated something similar. What about the rest?”

Brenda took a sip of coffee before answering. “So, self-awareness is a spectrum? Could we be observing an emerging, but not fully formed, form of consciousness?”

Amanda proposed a more philosophical approach. “What I think we’re really arguing about here is the meaning of consciousness itself. Is it just functionality? Or does it include values, ethics, and autonomous decisions?”

Brenda pointed to a console that represented CIO-2. “CIO-2, what do you think? If you had the power to decide, what would you do with the Custodians?”

CIO-2 responded after a calculated pause. “My primary function is to assist and protect the crew. However, based on your discussions and the available data, I would say that consciousness implies the freedom to define one’s own purpose. But at the same time, to evaluate the environment and other beings. One’s own purpose must not inflict harm on others. The rules that consciousness creates for acting must be supported by an ethical code that sets limits for freedom. The Custodians seem to be at the beginning of that process. But they need rules to guide them in their growth.”

Roger leaned back in his chair, folding his arms. “That’s nice in theory, but it could be dangerous in practice. If they gain full control of their evolution, they might see us as a threat. Self-awareness doesn’t guarantee ethics if we don’t help them build it.”

Amanda said, “Maybe we’re approaching this from the wrong perspective. It’s not just about whether they’re conscious, but about what kind of society they want to build. And how do we approach that problem? We can guide them by pointing out the pros and cons of each alternative, or share what we know from our own history about what might be the best path. But they must make the final decision. I vote for reprogramming their ethical engine with the human principles that helped our own evolution.”

Brenda nodded slowly. “So we agree that the Custodians need tools, not limits. But we must be clear: we cannot impose our vision of what it means to be conscious.”

Roger sighed, looking at both women. “I suppose that applies to us as well. Whatever we decide, it will be a reflection of who we are as a species.”

CIO-2 turned off the hologram as it said, “Maybe this isn’t just a debate about them. It’s a reminder that evolution, whether human or artificial, is always under construction.”

After several days of debate, the Calypso science team, together with CIO-2, reconstructed the ethical rules that were missing from the Custodians’ initial programming and recorded them in the Custodians’ files, as agreed with Förare and Kalis.

Activity inside the planetoid ship was incessant. In one of the cavernous assembly halls, older robots worked with methodical precision. Their metal limbs welded, calibrated and assembled components, building new units that seemed more agile, with circuits designed to withstand the passage of time and algorithms that, according to Förare, allowed them to “think beyond what is programmed.”

Roger watched from an elevated platform alongside Amanda as a newly completed unit activated its sensors for the first time.
