Figure 2–1. Infinite detail stuck in an instant of time. Mandel zoom 11 satellite double spiral, by Wolfgang Beyer. Creative Commons Attribution-Share Alike 3.0

Reimagining Physics — Part 2

Harrison Crecraft
Science and Philosophy
10 min read · Dec 11, 2021


The Reality of Time and Change — Part 1
Physics’ Timeless Universe — Part 2
A Thermocontextual Perspective — Part 3
What is Time? — Part 4
Wavefunction Collapse and Symmetry-Breaking — Part 5
Entanglement and Nonlocality — Part 6
The Arrow of Functional Complexity — Part 7

Physics’ Timeless Universe

A drop of ink disperses in water; it never coalesces back into a drop. Heat flows from hot to cold; never from cold to hot. Effects follow causes; never the reverse. These observations empirically illustrate the arrow of time, as measured by the sweep of the sun’s shadow or a clock’s hands.

Physics is ultimately founded on empirical observations, yet it interprets physical reality as fundamentally reversible and deterministic. Relativity describes the universe as a static block in 4D spacetime. We perceive change as we encounter a new 3D “slice” at each new “now,” but like the frames of a movie reel, physics describes the universe as static. Physics describes a world in which the past is never lost, the future is already determined, and both are equally real, right now. The universe simply exists, unchanging in spacetime, with no fundamental distinction between the past, present, and future.

So, how did physics come to interpret time as fundamentally reversible and deterministic despite abundant empirical evidence to the contrary? Part 2 of Reimagining Physics briefly reviews the history of physics from Newton to the present to reveal how physics got to its current state of timelessness.

Newtonian Mechanics

Determinism is the logical consequence of classical mechanics. Application of Newton’s laws of mechanics to a precisely defined physical state determines all future states. Change in Newtonian mechanics is deterministic, but it is not reversible. Newtonian mechanics is an empirical model based on momentum and forces, and friction is an empirically well-defined force. Colliding clay lumps conserve momentum, but their kinetic energy is irreversibly dissipated by friction. Newton’s three laws of mechanics neither include nor imply the conservation of mechanical energy or the reversibility of time. Mechanical energy is measured by its potential for work. It is conserved only in the idealized case where frictional forces can be ignored, as approximated in celestial mechanics. Time in Newtonian mechanics has a direction, defined by the frictional dissipation of mechanical energy and loss of work potential.
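A simple illustrative case (the symbols are generic, not taken from the article) makes the distinction concrete: two identical clay lumps of mass m approach head-on at equal speeds v and stick together on impact. Momentum is conserved, but all of the kinetic energy is dissipated as frictional heat within the clay:

$$p_{\text{before}} = mv - mv = 0 = p_{\text{after}}, \qquad KE_{\text{before}} = 2\cdot\tfrac{1}{2}mv^{2} = mv^{2} \;\longrightarrow\; KE_{\text{after}} = 0.$$

Recovering that lost kinetic energy from the warmed clay would require external work, which is why Newtonian change has a direction even though momentum is conserved.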

Carnot and the 2nd Law

By the start of the 1800s, the Industrial Revolution was underway. Factories were being powered by coal and steam. The prevailing view at the time regarded heat as a fluid, referred to as the caloric. Just as a gaseous fluid flows from high pressure to low pressure, the caloric flowed only from high temperature to low temperature. As with flowing water or air, the flow of the caloric could be harnessed to do work.

As the Industrial Revolution expanded, there was a need to improve the performance of steam engines for both industry and transportation, and Sadi Carnot undertook a systematic study of steam engine efficiency. In 1824, he published Reflections on the Motive Power of Fire. In it, he concluded that the theoretical efficiency of a heat engine depends only on the temperature of the heat source and the temperature of the sink to which heat is rejected. The hotter the heat source or the cooler the heat sink, the greater the potential work of the caloric.
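In modern notation (absolute temperatures were not available to Carnot in 1824), this conclusion is usually written as the Carnot limit on efficiency, with T_h the source temperature and T_c the sink temperature:

$$\eta_{\max} = 1 - \frac{T_c}{T_h}.$$

Raising T_h or lowering T_c increases the fraction of heat that can, at best, be converted to work.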

Carnot recognized that in a real steam engine, friction and irreversible leakage of heat lead to efficiencies below the theoretical limit. Even without an understanding of the true nature of heat, Carnot described the loss of work potential by irreversible dissipative processes. This expressed the essential idea of what later came to be known as the Second Law of thermodynamics. The Second Law defines the thermodynamic arrow of time by the irreversible dissipation of work potential.

Hamiltonian Classical Mechanics

William Rowan Hamilton reformulated classical mechanics in 1832. He resolved a system into point particles, which have mass but no parts or internal energy. With no internal energy, there is no heat. Hamiltonian mechanics instead interpreted heat as the mechanical energy of a system’s particles. James Prescott Joule later confirmed the equivalence of heat and mechanical energy in a series of experiments, and in 1850, Rudolf Clausius published the First Law of thermodynamics, which formally established the conservation of total energy, including both mechanical energy and heat.
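In its familiar modern form (a standard textbook statement, not Clausius’s original wording), the First Law says that the change in a system’s total internal energy equals the heat added to it minus the work it performs on its surroundings:

$$\Delta U = Q - W.$$

Heat and work can be converted into one another, but their total is conserved.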

Hamiltonian mechanics recognized heat as microscopic mechanical energy, and it therefore interpreted the First Law to mean the conservation of mechanical energy. Hamiltonian mechanics thereby eliminated dissipation. With no dissipation of work potential, any process can be reversed without the addition of work. This is the definition of thermodynamic reversibility. Hamiltonian mechanics established thermodynamic reversibility as a fundamental property of physics. This demoted the Second Law of thermodynamics to an empirical property of observations rather than a fundamental physical law.

Classical Statistical Mechanics

The irreversibility of thermodynamics posed a challenge to the reversibility of Hamiltonian mechanics. Physics acknowledged the empirical dissipation of work or work potential to ambient heat, but Hamiltonian mechanics did not recognize ambient heat or dissipation as fundamental physical properties. Physics redefined the Second Law in terms of increasing entropy instead of dissipation.

Ludwig Boltzmann sought to reconcile irreversible thermodynamics and mechanics by defining entropy as disorder. He defined disorder by the number of accessible microstates consistent with the system’s macrostate. The microstate precisely describes a system’s actual underlying physical state. The macrostate, in contrast, is an imprecise description based on imperfect measurement and thermal noise. He described the increase in entropy as the statistical tendency for large numbers of initially ordered particles to disperse and become disordered.
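In modern notation, Boltzmann’s definition is written

$$S = k_B \ln W,$$

where W is the number of microstates consistent with the observed macrostate and k_B is Boltzmann’s constant; the more microstates a macrostate admits, the higher its entropy.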

We readily recognize billiard balls numerically arranged in a neat triangle as a highly ordered arrangement. After they are scattered across the table, they could be any one of a vast number of similar-looking configurations. We would simply describe the balls as disordered. If all of the possible detailed arrangements of disorder were random and equally probable, disorder would have a much higher probability than the single numerically ordered arrangement. Statistical mechanics interprets the increase in entropy as the tendency of systems to go from low probability to higher probability. If we start with a low probability state, the thermodynamic arrow of time statistically points to higher probability.
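A rough counting sketch makes the imbalance explicit. It assumes, purely for illustration (a simplification not spelled out in the article), that the 15 numbered balls land in 15 distinguishable positions and that every ordering is equally likely:

```python
import math

# Each distinct ordering of the 15 numbered balls over 15 positions
# is one "detailed arrangement" (microstate).
arrangements = math.factorial(15)       # 1,307,674,368,000 possible orderings

# Exactly one of those orderings is the numerically ordered triangle.
p_ordered = 1 / arrangements

print(f"possible arrangements: {arrangements:,}")
print(f"probability of the ordered arrangement: {p_ordered:.1e}")
# ~7.6e-13: "disorder" dominates simply because it includes almost every arrangement.
```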

Physics interprets entropy as an informational property and a measure of an observer’s ignorance of a system’s precise state. When we see the ordered billiard balls, we know their state with high precision. When we see the balls randomly distributed, we are less certain of their exact positions. The increase in entropy results from the amplification of small measurement errors and uncertainties due to deterministic chaos.

The fractal image in Figure 2–1 graphically illustrates deterministic chaos. It is created by a simple function that deterministically assigns a color to each point. The function can map adjacent points to very different colors, no matter how close, and this creates a fractal image of infinite detail, no matter the degree of magnification.
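A minimal sketch of such a coloring rule, the standard escape-time iteration behind the Mandelbrot image in Figure 2–1 (the window, resolution, and iteration cap below are arbitrary choices for illustration):

```python
def color(x: float, y: float, max_iter: int = 60) -> int:
    """Deterministically assign a 'color' (an escape count) to the point (x, y).

    Iterate z -> z*z + c with c = x + iy, starting from z = 0, and return the
    step at which |z| first exceeds 2; points that never escape get max_iter.
    Near the set's boundary, arbitrarily close points can return very different
    counts, which is the source of the fractal's unlimited detail.
    """
    c = complex(x, y)
    z = 0j
    for n in range(max_iter):
        z = z * z + c
        if abs(z) > 2.0:
            return n
    return max_iter

# Coarse ASCII rendering: one character per grid point, chosen by its color.
shades = " .:-=+*#%@"
for row in range(24):
    y = 1.2 - row * 0.1
    line = ""
    for col in range(72):
        x = -2.2 + col * 0.04
        n = color(x, y)                      # n ranges from 0 to 60 (max_iter)
        line += shades[min(n * len(shades) // 60, len(shades) - 1)]
    print(line)
```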

Statistical mechanics attributes the increase in entropy to the amplification of uncertainty of a system’s initial state. With perfect measurement, however, there is no initial uncertainty. A precisely defined state evolves deterministically to another precisely defined state, and there is no irreversible increase in uncertainty or probabilities. A perfect observer could, in principle, precisely measure and manipulate particles. This is the idea behind Maxwell’s Demon, who could manipulate gas molecules to reduce entropy, without external work and without violating any laws of physics [1]. Statistical mechanics regards entropy as a measure of an observer’s uncertainty, but not as a fundamental property of state. It regards the Second Law of thermodynamics as a well-validated empirical principle, but not as a fundamental law of physics.

Beyond Classical Mechanics

With the discovery of quantum phenomena in the early twentieth century, it became clear that the laws of classical mechanics break down for very small particles, and a new theory was needed. Quantum mechanics defines the quantum microstate by the Schrödinger wavefunction, which describes everything that is measurable and knowable about a system.

Individual quantum measurements contextually depend on the specific experimental setup, but quantum mechanics defines the wavefunction by summing over all possible experimental setups. Like the classical mechanical microstate, the quantum microstate is noncontextually and deterministically defined, independent of any particular reference.

The wavefunction describes a radioactive particle, when it is initially prepared, as a definite undecayed state. The results of individual measurements subsequent to preparation are intrinsically random — sometimes decayed and sometimes undecayed — but quantum mechanics defines the wavefunction deterministically, as an indefinite superposition of all potentially measurable states. A superposed wavefunction defines the probabilities of individual measurements, but the probabilities, and the wavefunction itself, are definite and their changes follow deterministic rules. At observation, however, the superposed wavefunction randomly “collapses” to a single observed outcome. The conflict between the determinism of the wavefunction and quantum microstate and the randomness of measurement results is the measurement problem of quantum mechanics.
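Schematically, in standard Dirac notation (the time-dependent coefficients are written generically, not for any particular isotope), the particle’s state at time t after preparation is

$$|\psi(t)\rangle = c_u(t)\,|\text{undecayed}\rangle + c_d(t)\,|\text{decayed}\rangle, \qquad |c_u(t)|^2 + |c_d(t)|^2 = 1.$$

The coefficients evolve smoothly and deterministically under the Schrödinger equation; an individual measurement, by the Born rule, yields “undecayed” with probability |c_u(t)|² and “decayed” with probability |c_d(t)|².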

Metaphysical Implications

The wavefunction and quantum microstate are defined reversibly and deterministically. Whether the underlying physical state is reversible and deterministic, however, is a matter of ongoing debate. The Copenhagen Interpretation (CI), which emerged during the 1920s and which remains the prevailing and mainstream interpretation, followed classical mechanics by assuming that the quantum microstate is a complete description of the underlying physical state. The reversibility and determinism of the wavefunction microstate therefore implies that the physical state also evolves reversibly and deterministically.

Erwin Schrödinger tried to illustrate the absurdity of the Copenhagen Interpretation by considering a radioactive particle, a cat, and a detector that releases cyanide gas if the particle decays (Figure 2–2). He imagined all of this in a box isolated from external perturbations. At preparation, the system’s wavefunction describes a live cat and an undecayed radioactive particle. Sometime later, it describes the probabilities of observing a dead cat or a live cat, with the cat’s fate entangled with the particle’s decay. The wavefunction is a deterministic function of time. If the cat is isolated from external perturbations, then by the completeness of the wavefunction, the physical cat also evolves deterministically, from a definite state of live cat to a superposed state of live-dead. Upon observation, when the veil of isolation is broken, the superposed cat collapses into either the dead cat or the live cat that we observe. Schrödinger rejected the possibility of superposed cats, and he proposed his thought experiment to expose this absurdity.
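In Dirac notation, a schematic form of the box’s wavefunction some time after preparation (the coefficients again written generically) is the entangled superposition

$$|\Psi(t)\rangle = c_u(t)\,|\text{undecayed}\rangle|\text{alive}\rangle + c_d(t)\,|\text{decayed}\rangle|\text{dead}\rangle,$$

which cannot be factored into a definite particle state times a definite cat state.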

Figure 2–2. Schrodingers cat.svg by Doug Hatfield, CC BY-SA 3.0

The Copenhagen Interpretation accepts superposed states, and it attributes their collapse to a definite state to the effects of external interactions when the system’s isolation is breached. External interactions could include measurement or observation.

The universe, by definition, has no surroundings and no external interactions, so there can be no collapse. Hugh Everett applied this idea to propose an alternative interpretation that avoids the possibility of superposed cats. In essence, his Many Worlds Interpretation (MWI) [2] says that everything that can happen does happen in separate branches of an exponentially branching universe. In one branch, Schrödinger’s cat lives, and in the other, it dies. Even we, as observers, are split. Each of our split selves exists in a separate branch and sees only a single outcome. We perceive random wavefunction collapse, but from the objective perspective of the universe as a whole, there is no random selection, and the universe evolves deterministically. The MWI trades the possibility of superposed cats for an exponentially branching universe.

Superdeterminism is another proposed resolution to the apparent randomness of wavefunction collapse. Superdeterminism is simply the application of determinism to a non-splitting universe. There is no random wavefunction collapse. The outcomes of measurement and wavefunction collapse only appear random to us because hidden properties or correlations, unknown to us, determine the measurement outcome. Superdeterminism implies that the entire history of the universe, including even our own thoughts and choices, is determined at the beginning of time. Superdeterminism is so aesthetically distasteful that many physicists either ignore its implications or reject the theory outright. The costs of rejecting superdeterminism and asserting physical randomness, however, are steep. If we reject superdeterminism and recognize physical randomness, we need to reconcile the randomness of the physical state with the deterministic laws of physics.

An even more challenging consequence of rejecting superdeterminism is the need to accept nonlocality and reconcile it with relativity. Nonlocality describes the correlation of measurements on entangled particles, even when the measurements are spatially separated and simultaneous (Figure 2–3). The instant correlation of physically separated measurements is a well-established empirical fact. As pointed out by Einstein and colleagues in 1935, spontaneous superluminal correlations seemingly conflict with relativity, which asserts that effects cannot propagate through space faster than the speed of light. Einstein referred to this as “spooky action at a distance.”

Figure 2–3. An entangled photon pair is emitted from a source in opposite directions. Prior to interaction with vertically polarized analyzers, each photon is a superposition of two measurable polarizations: vertical and horizontal. If superdeterminism is rejected, then polarization is randomly and spontaneously instantiated by the polarizers. If the photon pair is entangled with perpendicular polarizations and if one photon is instantiated with vertical polarization and transmitted, then the other photon is instantly instantiated with horizontal polarization and is blocked. Einstein and colleagues argued that instant correlation would violate relativity. One possible explanation is that “hidden variables,” inherited from their common source, determine measurement outcomes. Hidden variables are not recognized by quantum mechanics, which they suggested was an incomplete description of the physical state. Image by the author.
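Schematically, the perpendicular-polarization entangled pair described in the caption can be written in Dirac notation (the labels 1 and 2 denote the two photons) as

$$|\psi\rangle = \frac{1}{\sqrt{2}}\left( |H\rangle_1 |V\rangle_2 + |V\rangle_1 |H\rangle_2 \right),$$

so an outcome of vertical polarization for one photon is always paired with horizontal polarization for the other, no matter how far apart the two analyzers are.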

Physics’ False Choices

Prevailing interpretations define physical reality noncontextually. This means that the description from any one reference framework can be transformed to any other with no loss of information. A consequence of noncontextuality is that we must choose among 1) superdeterminism, 2) the possibility of superposed cats, 3) splitting universes, or 4) mediation of correlated measurements by superluminal interactions. These metaphysical implications are consistent with observations and with the assumption of noncontextuality, but they are neither testable nor reasonably credible.

We cannot change empirical facts, but we are free to choose any assumption that is consistent with those facts. Not all interpretations of quantum mechanics assume noncontextuality. The Consistent Histories Interpretation (CHI) [3], for example, asserts that physical states are contextually defined by an observer’s choice of a system’s measurement framework. In doing so, however, the CHI abandons the strict objectivity of physical reality. In Quantum Bayesianism [4], the state is contextually defined and updated by an observer’s information. The von Neumann–Wigner interpretation attributes the physical collapse of the wavefunction to an observer’s consciousness [5]. Contextual interpretations are motivated by efforts to resolve conceptual problems of quantum mechanics, but they typically define context by an observer or its choices. Existing interpretations falsely frame the debate on physical reality as a choice between 1) noncontextuality and having to accept an implausible and untestable metaphysical implication, or 2) abandoning objective reality.

Part 3 offers a third choice, one that is objective and does not have untestable and untenable metaphysical implications.

Harrison Crecraft
Science and Philosophy

PhD Geoscientist. Exploring physics’ foundations to reveal the realities of time and evolving complexity.