In a world where algorithms calculate creditworthiness, vehicles make decisions independently, and data streams link identity, mobility, and security, an old question arises anew: Who sets the standard – humans or systems? When the foster daughter of AI ethics entrepreneur Nathan disappears after a connected vehicle is manipulated, abstract criticism of the system becomes personal reality. An activist group wants to show how permeable and powerful integrated infrastructures have become. But instead of simple finger-pointing, a multi-layered conflict unfolds between security and freedom, reform and rupture, responsibility and opportunism. The novel takes up the idea of Lessing's Ring Parable and transfers it to the age of artificial intelligence: it is not religions that struggle for truth, but system architectures. In dialogues with chatbots, political decision-makers and activists, a new parable emerges about power, transparency and moral boundaries in digitally networked societies. Nathan 2.0 is not a tech thriller, but a philosophical and topical narrative about responsibility in complex systems – and about the question of how much humanity remains when efficiency becomes the highest value.
Page count: 61
Year of publication: 2026
Nathan 2.0 – A Parable of the Ring in the Age of Systems
Author: Andy Gudera
First Edition, 2026
ISBN: 978-3-384-83574-1
© 2026 Andy Gudera
Published and distributed on behalf of the author by:
tredition GmbH
Heinz-Beusen-Stieg 5
22926 Ahrensburg
Germany
Publisher and person responsible for content pursuant to applicable EU regulations:
Andy Gudera
Trattangerring 54a
85256 Vierkirchen
Germany
Contact: Email: [email protected]
Contact address according to the EU General Product Safety Regulation (GPSR): [email protected]
All rights reserved. This work, including all of its parts, is protected by copyright. Any use outside the narrow limits of copyright law without the prior written permission of the author is prohibited and punishable by law. This applies in particular to reproductions, translations, microfilming, and storage or processing in electronic systems.
To my parents. You showed me early on that thinking means freedom – and freedom responsibility. You encouraged me to question boundaries, to shift perspectives, not merely to adopt conventions, but to examine them. From you I learned the curiosity of the engineer who seeks to understand how things work, and the mediating gift of the teacher who knows that insight must be shared. If this book asks about standards, about responsibility within systems, and about the courage to differentiate, then it stands upon the foundation of your encouragement. Dedicated to you – in gratitude for the freedom of thought.
It was not an extraordinary year. That was precisely what made it dangerous.
The cities functioned. Traffic flowed. Supply chains held. Data travelled in invisible channels from sensor to server, from forecast to decision. When you unlocked your smartphone in the morning, you were already presented with a selection of possible routes, news items, moods – curated, weighted, calculated.
The world had not become quieter, but it had become more predictable.
At least, that’s what people believed.
In ministries and boardrooms, people no longer talked about digitalisation. The word sounded like a new beginning, an experiment, a risk. Now they talked about infrastructure. About platforms, ecosystems, integration. Identity, mobility, consumption, communication and security were no longer separate areas, but interlocking data streams.
Efficiency was considered a virtue. Networking was considered progress.
Hardly anyone asked what was lost in the process.
Yet the underlying question was old – older than any machine.
Who determines what is valid? Who sets the standard? Who decides what appears to be true, just or necessary?
In the past, this question was openly debated between religions. Each claimed to be the true ring, each believed it possessed the ultimate standard. Lessing’s answer was as simple as it was uncomfortable: truth cannot be proven. It is proven through action.
Today, the same question arises in a different form.
No longer between denominations, but between systems.
Who defines what constitutes risk? Who determines which data is relevant? Who sets the target parameters according to which machines calculate?
Algorithms appear neutral, but they are not. They carry assumptions within them – about normality, about deviation, about priorities. Every model is a translation of social values into technical structures.
What was once called dogma is now called a parameter. What used to be revelation is now goal definition. Authority has changed its form, not its influence.
And once again, people stand opposite each other, convinced that they possess the better standard.
Except that the rings are no longer made of gold, but of code.
Algorithms decide on creditworthiness, medical priorities, police risk assessments. Models calculate probabilities where intuition or experience used to dominate. The promises are objectivity, scalability, efficiency.
But every weighting is a decision. Every optimisation a setting.
Between security and freedom. Between prevention and trust. Between control and responsibility.
Integration became the leitmotif of the era. Systems were connected, interfaces opened, data shared. Administration, mobility, healthcare, financial flows and security architectures moved closer together.
It was called interoperability. Dependency was mentioned less often.
When a young woman disappeared in this structure – not through violence on the open street, but through the manipulation of a networked vehicle – what had previously remained abstract became visible.
It was not the individual device that had been compromised. It was the interaction.
No total takeover was necessary. A formally valid, technically underestimated authorisation was sufficient. A connection that was considered low priority proved to be a gateway.
This demonstrated what critics had long warned about: a lack of separation becomes dangerous when it appears to be without alternative.
Not because integration inevitably fails. But because, in an emergency, it affects everything at once.
The hijacking was not a classic case of blackmail. It was a demonstration. A group wanted to show how permeable and vulnerable the system had become. They spoke of security as a pretext for control, of infrastructures that knew more about people than they knew about themselves.
As a symbol, they chose the foster daughter of a man who publicly advocated AI ethics. An entrepreneur who called for reforms without fundamentally questioning the framework. A mediator between politics, business and civil society.
He was accused of stabilising structures that he himself criticised. He spoke of gradual improvement. Others called it opportunism.
As with Lessing, the father figure was no saint. Rather, he was a person within the system.
The kidnapping did not escalate. Within the group, there was disagreement about its duration and goal. One person drew a line. He ensured that the young woman could be found quickly. A lawbreaker who valued proportionality over maximum impact.
Not a hero. Not a martyr. But someone who didn’t exploit every possibility available to him.
The public debate followed familiar patterns. Activists spoke of structural self-defence. Authorities spoke of an attack on order. The media spoke of digital escalation. Everyone claimed moral authority.
What was missing was not information. It was self-questioning.
This is where the story begins.
Not as a thriller about technology. Not as an indictment of innovation. Not as a defence of the status quo.
But as a narrative about responsibility in a world where no one stands outside the systems.
The new Ring Parable is not negotiated between religions, but between architectures. Legal certainty, social resonance, effectiveness – each perspective claims priority, each carries risk.
None is innocent. None is absolute.
Truth does not arise from the victory of one position, but from the willingness to disclose one’s own assumptions.
Artificial intelligence works with probabilities. It can support, structure and accelerate. But it cannot bear responsibility.
Responsibility remains bound to individuals. To those who define goals. To those who prioritise risks. To those who decide what is acceptable.
The old Enlightenment questioned claims to truth. The new one must question claims to systems.
The alternative is not progress or stagnation. It is unquestioned acceptance or reflection.
The ring we wear today does not shine. It flashes.
It connects. It accelerates. It promises.
Whether it is the real thing is not decided by its construction. But rather by how we act with it.
This is where the story begins.
