Ex Machina - Anders Indset - E-Book

Description

The philosophical exploration of reality has fascinated humanity since Plato's Allegory of the Cave and has reached new dimensions through technological and intellectual breakthroughs. From Descartes' skepticism of sensory perception to Daniel Dennett's provocative "illusionism," the nature of our conscious experience is continually called into question. Yet, while neuroscience defends consciousness as real, the debate remains alive, shaping scientific and societal discourses. With George Berkeley's subjective idealism, which defines reality as a product of perception, and the groundbreaking concepts of Mach and Einstein, our understanding of space, time, and existence has been radically transformed. These philosophical foundations pave the way for one of the most intriguing questions of our time: Are we living in a simulation? What was once the realm of science fiction now stands at the forefront of cutting-edge research inspired by quantum physics and the theories of computation and information.

In their Scientific-Philosophical work – SciPhi – Anders Indset and Florian Neukart explore the hypothesis that our universe might be part of a chain of simulations – a concept with the potential to revolutionize our understanding of existence, theology, and the laws governing the cosmos. Together they examine the roles of computability, entropy, and the boundaries of physical laws within this chain of simulations. Could a collapse occur if resources are exhausted? And what does this mean for the notion of an infinite, external entity? With a clear focus on experimental approaches, Indset and Neukart analyze how we might uncover evidence about the nature of our reality. Although computational boundaries remain to be overcome, the authors, through observations and analysis of simulation theories, open new perspectives on the profound questions of existence – while acknowledging the limits of current scientific knowledge. A compelling and profound book that encourages readers to think beyond the nature of reality and redefine the mystery of our existence.

You can read this e-book in Legimi apps or in any app that supports the following format:

EPUB

Page count: 241

Publication year: 2025




What is Reality – Are We Just Part of a Simulation?

Since Plato’s inquiries into the nature of existence, one question has continued to fascinate humanity: the nature of reality. From René Descartes’ radical doubts to Daniel Dennett’s “illusionism,” which challenges the very concept of consciousness, groundbreaking ideas have shaped our understanding of what is real. George Berkeley’s subjective idealism and Albert Einstein’s revelations redefined space and time, paving the way for a compelling modern scenario: Are we living in a simulation? With the advent of exponential technology, the prospect of simulating the universe using quantum computers has become central to this debate. Could our reality be part of a chain of simulations? Anders Indset and Florian Neukart explore this question, shedding light on the profound implications such a scenario would have for our understanding of existence, theology, and the destiny of the cosmos. A book that masterfully combines science and philosophy into a gripping intellectual journey.

Anders Indset is a Norwegian-born philosopher and deep-tech investor. Recognized by Thinkers50 as one of the leading voices shaping technology and leadership, he is the author of four Spiegel bestsellers, with his works translated into over ten languages. Anders is the founder and Chairman of Njordis Group, a driving force behind initiatives like the Quantum Economy, and a sought-after international speaker on exponential technologies and the future of humanity.

Dr. Florian Neukart is an Austrian physicist, computer scientist, and business executive specializing in quantum computing (QC) and artificial intelligence (AI). He serves on the Board of Trustees for the International Foundation of AI and QC and co-authored Germany’s National Roadmap for Quantum Computing. Currently, he is Chief Product Officer at Terra Quantum AG, following over a decade leading global innovation and research labs at Volkswagen Group. He holds advanced degrees in computer science, physics, and IT, including a Ph.D. in AI and QC. A professor at Leiden University, Florian has authored books and published over 100 articles on topics including space propulsion, materials science, and AI.

Anders Indset
Florian Neukart

EX MACHINA

The God Experiment

Værøy

© 2024 Værøy GmbH

Mergenthalerallee 15-21

65760 Eschborn, Germany

Jacket Design: © Egbert Clement

Jacket Image: plainpicture/Jean Marmeisse

Print-ISBN: 978-3-911726-00-9

EPUB-ISBN: 978-3-911726-02-3

Contents

Introduction

The Simulation Hypothesis

Quantum Physics

Quantum Technologies

Fate of the Universe

Simulation from Within and Without

The God Experiments

The External Programmer

Eternal Horizons: Emergence of a Cosmic Mind

Conclusion

Introduction

Imagine a reality where everything we perceive, from the vast expanse of galaxies to the minutest particles, is part of a grand simulation. This provocative notion has captivated philosophers and scientists and, over the past years, has gained massive mainstream media attention: prominent figures like Elon Musk have stated there is only a "one in billions" chance that we do not live in a simulation [1], pop-star astrophysicist Neil deGrasse Tyson has embraced the idea, putting the probability at more than 50% [2], and philosopher David Chalmers has argued that we likely live in a simulation [3, 4], pushing for further examination of the very notion.

However realistic or plausible such a hypothesis [5, 6] may be, how could modern physics and mathematics support seeking evidence for it? Scientists have criticized the hypothesis put forward by philosopher Bostrom as pseudoscience [7, 8], since it sidesteps the current laws of physics and lacks a fundamental grounding in general relativity. Suppose an external programmer - an entity running a simulation and, by definition, external to it - could define the simulation's physical laws. What would the external programmer, and beings within the simulation, be able to calculate based on their understanding of physical laws? Moreover, could beings in the simulation, theoretically or practically, conceive and implement the apparatus or tools to verify that they aren't participating in a simulation chain?

The philosophical inquiry into the nature of reality, pondered since Plato's allegory of the cave, has evolved significantly with advances in technology and philosophy. René Descartes introduced skepticism about sensory experiences, leading to modern theories like Daniel Dennett's "illusionism," which questions the reality of conscious experience or "qualia." Despite resistance from neuroscientists, who refute the notion of consciousness as an illusion, the debate continues to inspire both academic and public discourse. The concept of reality shaped by perception, known as subjective idealism, was advanced by Bishop Berkeley and challenged established notions of space and time, influencing thinkers like Mach and Einstein.

These philosophical underpinnings have set the stage for contemporary considerations of our universe as a potential simulation—a notion not only entertained in science fiction, such as "The Matrix," but also in serious scientific contemplation. In the context of quantum technology and the visionary ideas of Richard Feynman, the potential for simulating the universe with quantum computers has brought the simulation hypothesis to the forefront of theoretical physics.

This book proposes that our reality could be a part of a simulation chain, an idea with vast implications for our understanding of existence, theology, and the cosmos' fate. We explore the limitations of computability and predictability in universal simulations, acknowledging how increasing complexity and entropy constrain computational capacity. We propose that a simulation's fidelity to the physical laws it emulates inevitably leads to an exhaustion of resources, suggesting that a collapse of the simulation chain is possible unless an external entity, not limited by our physical laws, intervenes.
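To make the resource-exhaustion argument concrete, consider a minimal toy model (all numbers below are our hypothetical assumptions, not results from the book): if each simulation can devote only a fraction of its own computational budget to the simulation it hosts, the usable capacity decays geometrically along the chain, and the chain terminates once the budget falls below the cost of running one further universe.

```python
# Toy model of resource decay along a simulation chain.
# All parameters are hypothetical, chosen only to illustrate the geometric decay.

def chain_depth(base_ops: float, fraction: float, min_ops: float) -> int:
    """Count nested simulations that fit before resources are exhausted.

    base_ops : computational budget of the outermost universe
    fraction : share of a level's budget passed down to its child simulation
    min_ops  : minimum budget needed to run one meaningful simulation
    """
    depth, budget = 0, base_ops
    while budget >= min_ops:
        budget *= fraction   # each level hands only a fraction downward
        depth += 1
    return depth

# Assumed figures: 1e40 ops total, 0.1% passed per level, 1e20 ops per universe.
print(chain_depth(1e40, 1e-3, 1e20))  # -> 7: the chain collapses quickly
```

However the parameters are chosen, the depth grows only logarithmically in the base budget, which is the quantitative core of the collapse argument.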

While controversial, the question of whether we exist in a simulation, and thus participate in a simulation chain, cannot be answered with certainty today. Nevertheless, it is intriguing, and answering it would potentially lead us to question our very definitions of life and spirituality. Suppose we spark a chain of simulations, each hosting intelligent life intending to simulate the universe. Would we classify each of the simulated life forms as actual life? What if we could confidently state that we are part of a simulation chain and are simulated beings ourselves? Would that change our definition of what counts as "real" or "artificial" life? In the argument made by Bostrom, one premise is worth examining: if creating a simulation is physically possible, then, given the pace of technological development and the vast number of simulated observers relative to the single generation that builds the simulation, it would most likely be more probable that we reside within such a simulation than that we are that exact generation.

Experiments are needed to gain deeper insights, but several constraints prevent us from designing experiments that directly answer the question of whether an external programmer has created the universe and whether it is only one link in an infinite hierarchical simulation chain. However, it is possible to test the simulation hypothesis indirectly under certain assumptions. The experiments outlined for doing so involve creating a simulation, potentially resulting in a chain of simulations, and observing the behavior of simulations within the confines of the hierarchy until statistical relevance can be obtained. Potential observations of note include the emergence of intelligent life and its behavior, a reversal of global entropy, the compactification of dimensions, or the evolution of simulations along the simulation chain (all of which are, based on the current understanding of physics, impossible for us to conduct in our universe, but an external programmer need not suffer from such limitations). Designing such experiments leads to the ultimate boundaries of computability and predictability. Physical and computational constraints prevent us from simulating a universe equal in complexity and size to our own and from making accurate predictions of the future, whether or not the "real" and simulated universes are based on the same physical laws.

Moreover, the cosmos is not yet fully understood. For example, the universe's fate, and how to unite quantum physics with general relativity, remain deep and open questions. Today, quantum theory is widely understood as an incomplete theory, and there may be new models to be discovered - models that will further flesh out our understanding of what quantum theory has indicated thus far. However, the state of modern physics and our imagination clearly allow us to conceive experiments and build advanced technologies to continue scientific progress; thus, the current framework should not hold us back from searching for evidence related to the simulation hypothesis. The entry point, however, must be the current understanding of mathematics and the challenges associated with our current knowledge of physics. Conducting experiments on such a hypothesis therefore naturally requires assumptions to be made.

Also, many open questions remain in living systems theory, and we don't yet know with certainty whether or not we are the only intelligent species in the universe. Still, we can conceive experiments that help us gain insights into the ultimate questions: Was our universe and everything in it created, or did it emerge by itself? Is our universe unique, or is it just one of many, as described by the many-worlds interpretation of quantum physics [9]? In this book, we outline some fundamentals of computing and physics, which will help us define the experiment's constraints. First, quantum physics - that is, the current understanding of quantum mechanics - is the essential pillar we build our experiment on, as it constitutes the most fundamental physics in the universe, upon which everything else is based. Secondly, we briefly introduce the different fates of the universe that the scientific community considers scientifically sound; these further guide us in designing an experiment independent of how the universe evolves. Thirdly, we consider the ultimate limits of computability, which also lead us back to quantum physics, both when it comes to engineering quantum computers and to simulating physical and chemical processes in the universe. While Alan Turing showed what is computable [10], we show which computers are constructible within this universe. Finally, we explore different interpretations of the observations gained from simulation chains and the individual specimens on which we base the proposed experiments, and we investigate experiments and discuss observations in our universe indicating whether we participate in a simulation chain or not.

This book takes the reader on an ambitious journey to explore one of the most profound questions of our existence. By leveraging the latest advancements in quantum technology, computational theories, and philosophical insights, we aim to push the boundaries of our understanding and shed light on the possibility of our universe being a simulation. The implications of this exploration are vast, challenging our notions of reality, consciousness, and the very nature of existence. Whether we find ourselves at the brink of discovering an external programmer or furthering our understanding of the cosmos, this inquiry will undoubtedly redefine our existential framework and inspire future generations to continue seeking the truth about our place in the universe.

The Simulation Hypothesis

The simulation hypothesis, first proposed by philosopher Nick Bostrom in 2003 [5, 6], posits that it is highly probable that we are living in a computer-generated reality. The hypothesis is an extension of the "simulation argument" [3, 5, 6], which lays out three possibilities regarding the existence of technologically mature civilizations, at least one of which is held to be true. According to the simulation hypothesis, most contemporary humans are simulations rather than actual biological entities. The hypothesis is distinguished from the simulation argument in that it commits to this single possibility; the argument itself assigns no higher or lower probability to the other two.

The simulation argument presents three basic possibilities for technically "immature" civilizations – like ours. A mature or post-human civilization is defined as one that possesses the computing power and knowledge to simulate conscious, self-replicating beings at a high level of detail (possibly down to the molecular nanobot level). Immature civilizations do not have this ability. The three possibilities are as follows [5]:

Human civilization will likely die out before reaching a post-human stage. If this is true, then it almost certainly follows that human civilizations at our level of technological development will not reach a post-human level.

The proportion of post-human civilizations interested in running simulations of their evolutionary histories, or variations thereof, is probably close to zero. If this is true, there is a high degree of convergence among technologically advanced civilizations, and none of them contain individuals interested in running simulations of their ancestors (ancestor simulations).

We most likely live in a computer simulation. If this is true, then almost all observers with our kind of experiences live in simulations. The simulation argument itself holds only that at least one of these three possibilities is true, without singling out which; absent further evidence, all three may be treated as similarly likely. A corollary: if we don't live in a simulation today, our descendants are unlikely ever to run predecessor simulations. In other words, the belief that we may someday reach a post-human level at which we run computer simulations is wrong unless we already live in a simulation today.

According to the simulation argument, at least one of the three possibilities above is true. The simulation hypothesis rests on the additional assumption that the first two possibilities do not occur. For example, if a considerable part of our civilization achieves technological maturity and a significant portion of that civilization remains interested in using resources to develop predecessor simulations, then the number of such simulations reaches astronomical figures. This follows from extrapolating today's high computing power and its exponential growth, from the possibility that billions of people could run predecessor simulations with countless simulated agents on their own computers, and from technological progress in adaptive artificial intelligence, which an advanced civilization would possess and use, at least in part, for predecessor simulations. The conclusion that our existence is simulated follows from the assumption that the first two possibilities are incorrect: in this case, there would be many more simulated people like us than non-simulated ones. For every historical person, there would be millions of simulated people. In other words, almost everyone at our level of experience is more likely to live in a simulation than outside of one [3]. The conclusion of the simulation hypothesis is thus derived from the three basic possibilities, together with the assumption that the first two are not true, following the structure of the simulation argument.
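As a back-of-the-envelope illustration of this counting argument (a sketch with made-up figures, not Bostrom's own numbers), the probability of being simulated is simply the fraction of simulated observers among all observers:

```python
# Toy version of the counting argument behind the simulation hypothesis.
# All figures are hypothetical assumptions for illustration only.
n_real = 1e11                  # assumed non-simulated ("historical") people
sims_per_civilization = 1e6    # assumed predecessor simulations ever run
people_per_sim = 1e11          # assumed observers inside each simulation

n_sim = sims_per_civilization * people_per_sim
p_simulated = n_sim / (n_sim + n_real)
print(f"{p_simulated:.6f}")    # ~0.999999: almost all observers are simulated
```

The conclusion is insensitive to the exact figures: as long as even a few civilizations run many populous simulations, the simulated observers dominate the count.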

The simulation hypothesis that humans are simulations does not follow directly from the simulation argument. Instead, the simulation argument presents the three possibilities side by side, with the assertion that one of them is true; it remains unclear which one. It is also possible that the first possibility will come true and all civilizations, including humankind, will die out for some reason. According to Bostrom, there is no evidence for or against the simulation hypothesis that we are simulated beings, nor for the correctness of the other two assumptions [5].

From a scientific standpoint, everything in our perceived reality could, in principle, be coded, since the foundational scientific assumption is that the laws of nature are governed by mathematical principles describing some physicality. That an external programmer could control the laws of physics, and even play with them, has been deemed controversial in the simulation hypothesis. Something "outside of the simulation" - an external programmer - is, therefore, more of a sophisticated and modern take on the foundation of monotheistic religions and belief systems. Swedish technophilosopher Alexander Bard proposed moving the theory of creationism into physics [11], suggesting that the development of a super (digital) intelligence would be the creation of god, turning the intention of monotheism from the creator to the created. Moving from faith and philosophical contemplation towards progress in scientific explanation is what the advancement of quantum technology might enable.

Critics of Bostrom argue that we do not know how to simulate human consciousness [12–14]. An interesting philosophical problem here is the testability of whether a simulated conscious being – or an uploaded consciousness – would remain conscious. The reflection on a simulated superintelligence without perception of its own perception was proposed as a thought experiment in the "final narcissistic injury" (reference). One argument against this is that consciousness arises with complexity – that it is an emergent phenomenon. A counter-argument is easily given: numerous complex organs appear to be unconscious, and – despite reasoned statements by a former Google engineer [15] – large amounts of information do not by themselves give birth to consciousness. With rising awareness of the field, studies on quantum physical effects in the brain have also attracted strong interest. Although rejected by many scientists, prominent thinkers such as Roger Penrose and Stuart Hameroff have proposed ideas about quantum properties in the brain [16]. Even though this argument has gained some recent experimental support [17], it is not directly relevant to the proposed experiments. A solution to simulated consciousness still seems far away, even though it belongs to the seemingly "easy" problems of consciousness [18]; the hard problem of consciousness is why humans perceive phenomenal experiences at all [18]. Neither tackles the meta-problem of consciousness: explaining why we believe there is a problem in the first place - why the hard problem strikes us as a problem at all.

German physicist Sabine Hossenfelder has argued against the simulation hypothesis, stating that it assumes we can reproduce all observations not by employing the physical laws that have been confirmed to high precision, but by a different underlying algorithm, which the external programmer is running [19]. Hossenfelder does not believe this was what Bostrom intended to do, but it is what he did: he implicitly claimed that it is easy to reproduce the foundations of physics with something else. We can approximate the laws we know with a machine, but if that were how nature worked, we could see the difference. Indeed, physicists have looked for signs that natural laws proceed step-by-step, like computer code, but their search has come up empty-handed. It is possible to tell the difference because attempts to reproduce natural laws algorithmically are usually incompatible with the symmetries of Einstein's theories of special and general relativity. Hossenfelder has stated that it doesn't help to say the simulation would run on a quantum computer: "Quantum computers are special purpose machines. Nobody really knows how to put general relativity on a quantum computer" [19].

Hossenfelder's criticism of Bostrom's argument continues with the statement that, for it to work, a civilization needs to be able to simulate a lot of conscious beings, and, assuming those beings would be conscious, they would in turn need to simulate many conscious beings themselves. That means the information we think the universe contains would need to be compressed. Bostrom therefore has to assume that it is possible to ignore many of the details in parts of the universe no one is currently looking at, and to fill them in just in case someone looks. Again, there is a need to explain how this is supposed to work. What kind of computer code can do that? What algorithm can identify conscious subsystems and their intentions and quickly fill in the required information without producing an observable inconsistency? According to Hossenfelder, this is a much more critical problem than Bostrom appreciates. She further states that one cannot generally ignore physical processes at short distances and still get the long distances right.

Climate models are an example: with currently available computing power, models with grid resolutions in the range of tens of kilometers can be computed [20]. We cannot simply ignore the physics below this scale, because weather is a nonlinear system in which information from short scales propagates to large scales. If short-distance physics cannot be computed, it has to be replaced with something else, and getting this right, even approximately, is difficult. The only reason climate scientists get it approximately right is that they have observations against which they can check whether their approximations work. For an external programmer running only one simulation, as in the simulation hypothesis, there is a catch: the programmer would have to make many assumptions about the reproducibility of physical laws on computing devices, and proponents usually don't explain how this is supposed to work. Finding alternative explanations that match all our observations to high precision is difficult. The simulation hypothesis, in its original form, therefore isn't a serious scientific argument. That doesn't mean it is necessarily incorrect, but it requires a more solid experimental and logical basis instead of faith.
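Hossenfelder's point that information from short scales propagates to large scales can be illustrated with any chaotic system. The sketch below (our illustration, not hers) uses the logistic map: a perturbation of size 1e-12, standing in for unresolved short-distance detail, grows until the two trajectories disagree completely:

```python
# Two trajectories of the chaotic logistic map, initially differing by 1e-12.
# The tiny offset stands in for physics below the resolved scale.
r = 4.0                      # fully chaotic regime of the logistic map
x, y = 0.3, 0.3 + 1e-12
for step in range(1, 61):
    x = r * x * (1 - x)
    y = r * y * (1 - y)
    if step % 20 == 0:
        print(f"step {step}: |x - y| = {abs(x - y):.3e}")
# The gap roughly doubles per step, reaching order 1 within ~40 steps:
# coarse-graining away the small scales changes the large-scale outcome.
```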

Quantum Physics

As Richard Feynman famously said, if we intend to simulate nature, we have to do it quantum mechanically, as nature is not classical.¹ While the transition dynamics from the microscopic to the macroscopic is not yet fully understood in every aspect, theory and experiments agree that macroscopic behavior can be derived from interactions at the quantum scale. Quantum physics underlies the workings of all fundamental particles; thus, it governs all physics and biology on larger scales. The quantum field theories of three out of four forces of nature—the weak nuclear force, the electromagnetic force [22, 23], and the strong nuclear force [24]—have been confirmed experimentally numerous times and have strongly contributed to the notion that quantum physics comprises, as of our current understanding, the most fundamental laws of nature.

Immense efforts are underway worldwide to describe gravity quantum-mechanically [25–27], which has proven elusive so far. Gravity differs from the other interactions in that it is caused by objects curving space-time around them rather than by particle exchange. Uniting quantum physics with general relativity has proven to be one of the most formidable challenges in physics and in our understanding of the universe [28, 29]. Despite the many scientific hurdles still to overcome, we have gained some insight into how the universe works, and if we regard quantum physics as the "machine language of the universe," the universal interactions can be interpreted as higher-level programming languages.

Quantum physics includes all phenomena and effects based on the observation that certain variables cannot assume arbitrary values but only fixed, discrete ones. This includes wave-particle duality, the indeterminacy of physical processes, and their unavoidable disturbance by observation. Quantum physics encompasses all observations, theories, models, and concepts that date back to Max Planck's quantum hypothesis, which became necessary around 1900 because classical physics had reached its limits, for example, in describing light or the structure of matter. The differences between quantum physics and classical physics are particularly evident on the microscopic scale, for example, in the structure of atoms and molecules, or in particularly pure systems such as superconductivity and laser radiation. Even the chemical or physical properties of different substances, such as color, ferromagnetism, and electrical conductivity, can only be understood in terms of quantum physics.

Theoretical quantum physics includes quantum mechanics, describing the behavior of quantum objects under the influence of fields, and quantum field theory, which treats the fields as quantum objects. The predictions of both theories agree extremely well with the experimental results, and macroscopic behavior can be derived from the smallest scale. If we define reality as what we can perceive, detect, and measure around us, then quantum physics is the fabric of reality. Therefore, an accurate simulation of the universe, or parts of it, must have quantum physics as a foundation. The internal states of a computer used for simulation must be able to accurately represent all external states, requiring a computer that uses quantum effects for computation and can accurately mimic the behavior of all quantum objects, including their interactions. The requirements for such a computer go beyond the quantum computers built today and envisioned for the future, and engineering such a computer, too, proves to be a formidable challenge, which will be discussed in the following chapters.

One of the arguments presented later in this book is the physical predictability constraint, which prevents us from building a computer that could predict arbitrary future states of the universe through simulation. If nature were purely classical, a computer would not suffer from that constraint (there are others, though). Still, quantum physics imposes restrictions, no matter how advanced our theories of how nature works become. Within the framework of classical mechanics, the trajectory of a particle can be calculated entirely from its location and velocity if the acting forces are known. The state of the particle can thus be described completely by two quantities, which, in ideal measurements, can be measured with unambiguous results. A separate treatment of the state and the measured variables, or observables, is therefore unnecessary in classical mechanics, because the state determines the measured values and vice versa. However, nature shows quantum phenomena that these terms cannot describe. On the quantum scale, it is no longer possible to predict where and at what speed a particle will be detected. If, for example, a scattering experiment with a particle is repeated under precisely the same initial conditions, the same state must always be assumed for the particle after the scattering process; nevertheless, it can hit different places on the screen. The state of the particle after the scattering process does not determine its flight direction. In general, there are states in quantum mechanics that do not allow the prediction of a single measurement result, even if the state is known exactly; only probabilities can be assigned to the potentially measured values. Therefore, quantum mechanics treats quantities and states separately, and different concepts are used for these quantities than in classical mechanics.
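This scatter under identical preparation can be mimicked numerically. In the hedged sketch below (an illustration of the statistics, not a physics simulation; the amplitudes are made up), the same prepared state is "measured" repeatedly, and although the outcome distribution is fixed by the state, the individual results differ from run to run:

```python
import numpy as np

rng = np.random.default_rng(seed=7)

# One fixed prepared state: probability amplitudes over three detector positions.
amplitudes = np.array([0.6, 0.64, 0.48])   # 0.36 + 0.4096 + 0.2304 = 1
probs = amplitudes**2                      # Born rule: |amplitude|^2
assert abs(probs.sum() - 1.0) < 1e-12      # the state is normalized

# Ten repetitions of the identical experiment still give scattered hits:
outcomes = rng.choice(len(probs), size=10, p=probs)
print(outcomes)   # e.g. [1 0 2 1 1 ...] -- same state, different results
```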

In quantum mechanics, all measurable properties of a physical system are assigned mathematical objects, the so-called observables. Examples are the location of a particle, its momentum, its angular momentum, and its energy. For every observable, there is a set of special states in which the result of a measurement cannot scatter but is clearly fixed. Such a state is called an eigenstate of the observable, and the associated measurement result is one of the eigenvalues of the observable. In all other states, i.e., those that are not an eigenstate of the observable, different measurement results are possible. What is certain, however, is that one of the eigenvalues is obtained in such a measurement and that the system is then in the corresponding eigenstate of that observable. Which of the eigenvalues is to be expected - or, equivalently, in which state the system will be after the measurement - can only be given as a probability distribution, determined from the initial state. In general, different observables have different eigenstates. For a system whose initial state is an eigenstate of one observable, the measurement result of a second observable is indeterminate. The initial state is interpreted as a superposition of all possible eigenstates of the second observable. The proportion of a certain eigenstate is called its probability amplitude. The square of the absolute value of a probability amplitude indicates the probability of obtaining the corresponding eigenvalue of the second observable in a measurement of the initial state. In general, any quantum mechanical state can be represented as a superposition of different eigenstates of an observable; different states differ only in which of these eigenstates contribute to the superposition and to what extent.
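A compact numerical illustration of these notions (a sketch using the Pauli matrices as generic stand-ins for two non-commuting observables; nothing here is specific to the book's experiments): prepare the system in an eigenstate of one observable, expand it in the eigenbasis of a second, and read off the outcome probabilities as squared amplitudes:

```python
import numpy as np

# Two non-commuting observables; the Pauli Z and X matrices are the
# simplest example of such a pair.
Z = np.array([[1.0, 0.0], [0.0, -1.0]])
X = np.array([[0.0, 1.0], [1.0, 0.0]])

# Prepare an eigenstate of Z: measuring Z yields eigenvalue +1 with certainty.
z_vals, z_vecs = np.linalg.eigh(Z)     # eigenvalues ascending: [-1, +1]
state = z_vecs[:, 1]                   # eigenstate belonging to eigenvalue +1

# Expand the state in the eigenbasis of the second observable X.
x_vals, x_vecs = np.linalg.eigh(X)
amplitudes = x_vecs.conj().T @ state   # probability amplitudes
probs = np.abs(amplitudes) ** 2        # Born rule: squared absolute values

for val, p in zip(x_vals, probs):
    print(f"X outcome {val:+.0f}: probability {p:.2f}")  # 0.50 each
```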

Only discrete eigenvalues are allowed for some observables, such as angular momentum. In the case of the particle location, on the other hand, the eigenvalues form a continuum. The probability amplitude for finding the particle at a specific location is therefore given in the form of a location-dependent function, the so-called wave function. The square of the absolute value of the wave function at a specific location indicates the spatial density of the probability of finding the particle there.
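For the continuous case described above, a minimal worked example (our own, in arbitrary units): take a normalized Gaussian wave function and sum |ψ(x)|² over an interval to obtain the probability of finding the particle there:

```python
import numpy as np

# A normalized Gaussian wave function with width parameter sigma.
sigma = 1.0
x = np.linspace(-10.0, 10.0, 20001)
dx = x[1] - x[0]
psi = (2.0 * np.pi * sigma**2) ** -0.25 * np.exp(-x**2 / (4.0 * sigma**2))

density = np.abs(psi) ** 2            # |psi(x)|^2: spatial probability density
print((density * dx).sum())           # ~1.0: total probability (normalization)

# Probability of detecting the particle within one sigma of the origin:
mask = (x >= -sigma) & (x <= sigma)
print((density[mask] * dx).sum())     # ~0.683
```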

Not all quantum mechanical observables have a classical counterpart. An example is spin, which cannot be traced back to properties known from classical physics, such as charge, mass, location, or momentum. In quantum mechanics, the temporal development of an isolated system is described, analogously to classical mechanics, by an equation of motion, the Schrödinger equation. By solving this differential equation, one can calculate how the system's wave function evolves (see Eq. 1).

(1)   iħ (∂/∂t)ψ = Ĥψ

In Eq. 1, the Hamiltonian Ĥ describes the total energy of the quantum mechanical system. It consists of a term for the kinetic energy of the particles in the system and a second term that describes the interactions between them, in the case of several particles, and the potential energy, in the case of external fields, whereby the external fields can also be time-dependent. In contrast to Newtonian mechanics, interactions between different particles are not described as forces but as energy terms, similar to the methodology of classical Hamiltonian mechanics. In the typical applications to atoms, molecules, and solids, the electromagnetic interaction is particularly relevant.
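To make the deterministic evolution concrete, here is a small sketch (a generic two-level Hamiltonian of our own choosing, in units where ħ = 1) that solves Eq. 1 exactly via the propagator U(t) = exp(-iĤt/ħ):

```python
import numpy as np

HBAR = 1.0   # natural units

# A hypothetical two-level Hamiltonian (any Hermitian matrix works).
H = np.array([[1.0, 0.5],
              [0.5, -1.0]])

def propagator(H, t):
    """U(t) = exp(-i H t / hbar), built from the eigendecomposition of H."""
    E, V = np.linalg.eigh(H)   # energy eigenvalues and eigenstates
    return V @ np.diag(np.exp(-1j * E * t / HBAR)) @ V.conj().T

psi0 = np.array([1.0, 0.0], dtype=complex)   # start in the first basis state
for t in (0.0, 1.0, 2.0):
    psi_t = propagator(H, t) @ psi0
    print(t, np.abs(psi_t) ** 2)   # outcome probabilities at time t

# The evolution is unitary and fully deterministic: the norm stays 1.
assert np.isclose(np.linalg.norm(propagator(H, 2.0) @ psi0), 1.0)
```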

The Schrödinger equation is a first-order partial differential equation in the time coordinate, so the time evolution of the quantum mechanical state of a closed system is entirely deterministic. If the Hamiltonian Ĥ