The Rearrangement
Part I of Latent Spaces, a series on the structures that emerge when entropy meets constraint.
Every physics student learns this equation:
dU = TdS − PdV
Left to right: internal energy changes when you add heat (T dS) or when the system does work on its surroundings (−P dV). Energy is the protagonist. Entropy and volume are terms in its story.
Read the other way, S and V are extensive quantities — they scale with the size of the system, directly countable. T and P are intensive quantities — ratios that don't. Temperature differences permit heat flow. Pressure differences permit volume exchange. U can then be thought of as the accounting machinery.
"It is important to realize that in physics today, we have no knowledge of what energy is." — Feynman, 1963
He goes on to say that energy is a calculated quantity — a bookkeeping number that helps us track how a system evolves as its state variables change.
The conventional derivation goes: start with energy concepts (heat, work), define entropy through them — dS = δQ/T. Entropy is derived from energy.
E.T. Jaynes reversed the construction. Start with microstate counting. S = k ln Ω — no energy concepts needed, just combinatorics. Then maximize S subject to a constraint: fixed average energy. Energy enters as a Lagrange multiplier in that optimization, and temperature falls out as the multiplier's value: 1/T = ∂S/∂U.
The direction matters. Entropy exists before you mention energy — it's just counting. Energy is what you impose as a constraint. Temperature is what emerges from the shape of the entropy function under that constraint.
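The construction can be checked numerically. Below is a minimal sketch, assuming a made-up three-level system with arbitrary energies (units chosen so k = 1): build the maximum-entropy distribution for a given multiplier β, then verify that the slope ∂S/∂U really does equal β — which is the statement 1/T = ∂S/∂U.

```python
import numpy as np

# Hypothetical three-level system; energy values are arbitrary, k = 1.
E = np.array([0.0, 1.0, 2.0])

def maxent(beta):
    """Max-entropy distribution subject to fixed average energy:
    the Boltzmann distribution p_i ∝ exp(-beta * E_i)."""
    w = np.exp(-beta * E)
    p = w / w.sum()
    S = -(p * np.log(p)).sum()   # entropy, S = -sum p ln p
    U = (p * E).sum()            # average energy, the constraint
    return S, U

# Vary beta slightly and compare the finite-difference slope dS/dU to beta.
beta = 1.5
S1, U1 = maxent(beta - 1e-4)
S2, U2 = maxent(beta + 1e-4)
slope = (S2 - S1) / (U2 - U1)
print(slope, beta)  # slope ≈ beta: temperature emerges as the multiplier
```

Nothing about temperature was put in; it falls out of the shape of S(U) under the constraint.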
What changes
If entropy is the primitive and energy is derived, three things follow.
Work is constrained entropy. Not "converting energy from one form to another." Work is what happens when entropy production is forced through a structured channel instead of dissipating freely. The channel is the constraint. The work is the rearrangement that the constraint permits. The next post in this series will make this mechanical — an Atwood machine, Odum and Pinkerton, the maximum power principle.
Information inherits from the primitive. Shannon entropy generalizes Boltzmann entropy by direct mathematical descent. If entropy is fundamental, then information theory isn't borrowing a metaphor from physics. It's extending the same combinatorial primitive into a new domain. Information isn't "like" entropy. It's the same rearrangement, different substrate.
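The descent is direct enough to fit in a few lines. A sketch: Shannon entropy in natural-log units, applied to a uniform distribution over Ω microstates (Ω = 64 here is an arbitrary choice), recovers Boltzmann's S = ln Ω exactly — same formula, same count.

```python
import math

def shannon(p):
    """Shannon entropy in nats: H = -sum p ln p."""
    return -sum(x * math.log(x) for x in p if x > 0)

# Uniform distribution over Omega equally likely microstates.
Omega = 64
p_uniform = [1 / Omega] * Omega

# Shannon's H collapses to Boltzmann's ln(Omega) when all states are equiprobable.
print(shannon(p_uniform), math.log(Omega))
```

Boltzmann's formula is the special case where every microstate is equally likely; Shannon's is the generalization to any distribution.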
The ledger works because it's conserved, not because it's real. Energy conservation is the accounting identity that makes the ledger close. Immensely useful — you can't do engineering without it. But "useful accounting identity" and "fundamental constituent of reality" are different claims.
Landauer's bridge
The sharpest objection: Landauer's principle. Erasing one bit of information costs at least kT ln 2 of energy. If energy is just accounting, why does this precise bridge between information and physics hold?
Read it in the entropy-first frame. Erasing a bit reduces the system's information entropy by k ln 2. The second law requires that total entropy not decrease. So the environment's thermodynamic entropy must increase by at least k ln 2. That's the whole statement — an entropy constraint, denominated in entropy units. The kT factor converts to energy the way a currency exchange converts yen to dollars. The bridge holds precisely because both sides are entropy. Energy is the denomination, not the substance.
Landauer's principle doesn't prove energy is fundamental. It proves the accounting is consistent.
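For scale, the exchange rate is tiny. A quick sketch, assuming room temperature (300 K is an assumption; any temperature works):

```python
import math

k = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)
T = 300.0         # assumed room temperature, K

# Landauer bound: minimum energy dissipated per erased bit.
cost = k * T * math.log(2)
print(cost)  # ~2.87e-21 joules per bit
```

The k ln 2 is the entropy content of the bit; T is just the current exchange rate between entropy and energy.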
The rearrangement
What the universe does, at every scale, is rearrange. Molecules find new configurations. Information redistributes across channels. Structure emerges, persists, dissolves. Energy doesn't drive this process — it bookkeeps it. Entropy doesn't measure the disorder left behind — it is the space of possible rearrangements.
Next: what happens when you put the rearrangement in a box and attach a pulley.
