The Strong Second Law
Concluding Part I of Latent Spaces. After The Rearrangement and The Atwood Machine.
Jaynes' unpublished thermodynamics book opens, in its first chapter, with two observations that don't announce themselves as the wall they will turn out to be.
Reversibility requires proximity. A system nudged infinitesimally can be nudged back. No entropy produced, no information lost, no preferred direction. The question that nags: does the converse hold? If two states are connected by a reversible process, must they be close to each other?
Not exactly, and the gap matters. The endpoints can be far apart in state space. What reversibility requires is a dense path between them — a continuous chain of infinitesimal steps, each one reversible, stitching distant states together through local proximity. This is a topological statement, not a metric one. Reversibility doesn't mean two states are near each other. It means there exists structure connecting them.
The destination is guaranteed. The strong form of the second law: entropy will increase to the maximum permitted by the system's constraints. Not "tends to increase." Will increase — given sufficient time, to the maximum the constraints allow.
Given sufficient time. That qualifier is doing enormous work, and it's easy to read past it. For the universe, sufficient time is heat death — the final equilibrium where nothing further can happen. For a cup of coffee, it's an afternoon. For everything between — rivers, organisms, civilizations, the thought forming as you read this sentence — we are somewhere in the transient. The destination is certain. We haven't arrived. We may not arrive for a very long time.
What happens when you hold both
Something emerges when these two observations sit together, though it takes a moment to see it.
The destination is fixed: maximum entropy consistent with constraints. Reaching it requires traversing state space. Traversal requires dense paths — chains of near-reversible micro-steps. So the rate at which a system approaches its destination depends on the path structure available to it. More structure, more paths. More paths, faster traversal.
And here you begin to feel the pull. If the destination is guaranteed and the rate depends on structure, then systems that build path structure approach the destination faster. The branching of a tree, the dendritic network of a river delta, the vasculature of a lung — these start to look less like accidents and more like dense paths through state space, stitching local near-reversibility into global dissipation. Structure as the means by which entropy finds its way home.
The logic wants to go further. It wants to say: this is why structure emerges. And there's a geometric intuition that almost gets there.
Among shapes of fixed volume, the sphere minimizes surface area — the isoperimetric solution. Run the problem backwards. Maximize surface area for a given volume and you get branching, fractal-like structures. In the limit, infinite surface in finite volume. More surface means more interface with the environment, more pathways for dissipation. The tree is the anti-sphere: the shape that maximizes exposure to the gradient.
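The direction of that intuition can be checked with back-of-the-envelope arithmetic. As a crude stand-in for branching, split a fixed volume into n equal spheres (my own toy construction, not a model of any real tree): total surface grows as n^(1/3), without bound, while volume stays fixed.

```python
import math

def sphere_surface(r):
    """Surface area of a sphere of radius r."""
    return 4 * math.pi * r**2

# Fix a total volume, then subdivide it into n equal spheres.
# Total surface grows as n**(1/3): subdividing — a crude stand-in
# for branching — buys surface area at constant volume.
V_total = 1.0
for n in [1, 8, 64, 512]:
    r = (3 * V_total / (4 * math.pi * n)) ** (1 / 3)  # radius of each piece
    print(f"n={n:4d}  total surface = {n * sphere_surface(r):.3f}")
```

In the limit n → ∞ the surface diverges, which is the "infinite surface in finite volume" of the fractal limit.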
It's compelling, and it's incomplete. A pure surface maximizer would branch infinitely fine. Real trees don't — because water has to travel from roots to canopy against gravity, because branches must bear their own weight in wind, because signals need to traverse the structure in bounded time. The actual geometry is a compromise between surface maximization and transport cost and material persistence. Multiple competing constraints, not a single variational problem.
Which is exactly where the clean story breaks. The anti-sphere gives you the direction — more surface, more dissipation. But the shape that emerges depends on constraints we haven't accounted for, interacting in ways a single optimization can't capture. We're not on firm enough ground for that claim. Not yet.
Where the ground thins
Jaynes' framework is MaxEnt — maximum entropy inference. Given constraints, it tells you the most probable macrostate. It is, at its core, an equilibrium tool. You specify what's fixed, maximize entropy subject to those constraints, and out falls the distribution. Elegant, rigorous, and silent on everything that happens before equilibrium is reached.
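The procedure is concrete enough to run. A minimal sketch in the spirit of Jaynes' own dice example: over the faces of a die constrained to a given average, the maximum-entropy distribution takes the exponential form p_i ∝ exp(−βx_i), and the multiplier β is pinned down by the constraint (here by bisection; the function names and tolerances are mine).

```python
import math

def maxent_given_mean(values, target_mean, lo=-50.0, hi=50.0, iters=200):
    """Maximum-entropy distribution over `values` with a fixed mean.

    A single mean constraint yields the exponential family
    p_i ∝ exp(-beta * x_i); bisect on beta until the constraint holds.
    """
    def mean_for(beta):
        w = [math.exp(-beta * x) for x in values]
        z = sum(w)
        return sum(x * wi for x, wi in zip(values, w)) / z

    # mean_for is decreasing in beta, so plain bisection converges.
    for _ in range(iters):
        mid = (lo + hi) / 2
        if mean_for(mid) > target_mean:
            lo = mid
        else:
            hi = mid
    beta = (lo + hi) / 2
    w = [math.exp(-beta * x) for x in values]
    z = sum(w)
    return [wi / z for wi in w]

# A die constrained to average 4.5: probabilities tilt toward the high
# faces, but stay as spread out as the constraint allows.
p = maxent_given_mean([1, 2, 3, 4, 5, 6], 4.5)
print([round(pi, 4) for pi in p])
```

Note what the code does not contain: time. You specify the constraint, out falls the endpoint. Nothing in it says how, or how fast, a real die-rolling process would get there.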
Everything we care about happens before equilibrium is reached.
The tree, the river, the civilization — these are not maximum-entropy states. They are dissipative structures, sustained by continuous entropy production, maintained by external gradients. Cut the gradient and they decay toward the equilibrium Jaynes can model beautifully. They exist in the space his framework passes over.
Prigogine spent a career in that space. Far from equilibrium, systems spontaneously break symmetry and build structure that accelerates dissipation — structure that MaxEnt doesn't predict, because MaxEnt doesn't model dynamics. Jaynes tells you where the system ends up. Prigogine watches what it builds on the way there.
They're working from different foundations, and neither would be entirely comfortable with the other's. Jaynes is epistemological — entropy as inference, as what you can deduce from counting. Prigogine is ontological — dissipative structures as things that happen whether or not anyone is counting. The dense-path argument sketched above works locally: each branch of the tree is nearly in equilibrium with its immediate surroundings. But the global architecture — the whole tree, sustained by sunlight, pumping water, building wood — lives in territory that Jaynes' vocabulary was not built to name.
Part I has been built on Jaynes. It has been honest work, and it has taken us to the edge of what his tools can reach.
What Part I established
Three things, on ground we trust:
Entropy is the primitive — countable from microstates, no energy concepts required. Energy enters as a constraint, temperature as a derivative. (The Rearrangement)
Constraints reduce phase space and channel entropy production into work. But asking why particular configurations exist — why this river, this tree, this coupling ratio — returns the observer as irreducible residue. (The Atwood Machine)
The destination is guaranteed. The itinerary is open. And the second law's silence about rate — about what the system builds in the transient, and why — is the space where everything that matters lives.
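The first point can be made concrete with a toy system of my own choosing, not one from The Rearrangement: N two-state units, where the entropy of the macrostate "n units are up" is nothing but the log of the number of arrangements realizing it.

```python
import math

def entropy_from_counting(N, n):
    """Entropy (in nats) of the macrostate 'n of N two-state units are up',
    defined purely as the log of the microstate count: S = ln C(N, n).
    No energy, no temperature — just counting arrangements."""
    return math.lgamma(N + 1) - math.lgamma(n + 1) - math.lgamma(N - n + 1)

N = 1000
for n in [0, 250, 500]:
    print(f"n={n:4d}  S = {entropy_from_counting(N, n):8.2f} nats")
# The maximum sits at n = N/2: the macrostate with the most arrangements,
# hence the one the strong second law says the system will reach.
```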
Part II goes into that silence. The ground will be less certain. The questions will be better.
