John D. Norton

Center for Philosophy of Science

Department of History and Philosophy of Science

University of Pittsburgh

http://www.pitt.edu/~jdnorton

The thermodynamics of computation presumes that many, if not most, of its processes can be carried out as isothermal, reversible processes on the molecular scale. They include expansion and contraction of a device's phase space and dissipationless detection of the state of a device.

The no-go result asserts that all such processes are fatally disrupted by thermal fluctuations.

*This page provides a quick summary for experts of the no-go result discussed in the page "When a Good Theory Meets a Bad Idealization."*

The systems considered comprise the device undergoing the change and the other devices that drive the change, so that their combination can exchange heat but not work with its thermal environment. For a compression process, the system consists of the device compressed (e.g. a one-molecule gas) and the device compressing it (e.g. a weighted piston). For a detection process, the system consists of the target device whose state is detected, the detector and the machinery that couples them.

The reversible process is a sequence of states in thermal equilibrium with their surroundings. The stages are labeled by the parameter λ.

The system is canonically distributed, so that its probability distribution over its phase space is

p(x,π) = exp(-E(x,π)/kT)/Z

where x and π are generalized configuration and momentum variables and the expression is normalized by the partition function

Z = ∫exp(-E(x,π)/kT) dx dπ

Each stage of the process consists of some subvolume of this phase space, so the probability density over λ is given by

p(λ) is proportional to Z_{λ}= ∫_{λ}exp(-E(x,π)/kT) dx dπ

where the integral extends over the subvolume corresponding to stage λ. For canonically distributed systems at equilibrium, the free energy F = E - TS is related to the partition function by F = -kT ln Z. Applying this same formula to each stage, we assign each stage a free energy

F(λ)= -kT ln Z_{λ}

Inverting, and noting the relation between Z_{λ} and p(λ), we have

p(λ) is proportional to exp(-F(λ)/kT)
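The chain from Z_{λ} to F(λ) to p(λ) can be checked numerically. In this sketch the stage partition functions are made-up illustrative values (not from any particular device), and units are chosen so that kT = 1:

```python
import math

kT = 1.0  # work in units where kT = 1

# Hypothetical stage partition functions Z_lambda for five stages of a
# process (e.g. phase-space subvolumes of a one-molecule gas).
# The numbers are illustrative only.
Z = {0.0: 1.0, 0.25: 1.5, 0.5: 2.0, 0.75: 1.2, 1.0: 0.8}

# Free energy assigned to each stage: F(lambda) = -kT ln Z_lambda
F = {lam: -kT * math.log(z) for lam, z in Z.items()}

# p(lambda) is proportional to Z_lambda, and equally to exp(-F(lambda)/kT):
# the two expressions agree stage by stage.
for lam in Z:
    assert abs(Z[lam] - math.exp(-F[lam] / kT)) < 1e-12
```

The assertions simply confirm that exp(-F(λ)/kT) recovers Z_{λ}, which is why p(λ) can be written in either form.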

For an isothermal process that can exchange heat but not work with its surroundings, the condition for equilibrium is

dF(λ)/dλ = 0

so that F(λ) is a constant for the process.

It now follows that

p(λ) is a constant over the process

This completes the no-go result. This last constancy tells us that an infinitely slow, reversible process cannot arise. For if the system is prepared in stage λ_{1}, it is as likely to stay in it as to fluctuate to any other stage λ_{2}.
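A minimal sketch of this consequence, comparing stages only through their free energies (units with kT = 1; the constant value of F is immaterial):

```python
import math

kT = 1.0
F_const = 5.0  # constant free energy across the process; its value is immaterial

def p_ratio(F1, F2, kT=1.0):
    """Ratio p(lambda2)/p(lambda1) = exp(-(F2 - F1)/kT)."""
    return math.exp(-(F2 - F1) / kT)

# With F constant over the process, every stage is equally probable:
# a system prepared in stage lambda_1 is as likely to be found there
# as at any other stage lambda_2.
assert p_ratio(F_const, F_const) == 1.0
```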

This result applies to all processes, both macroscopic and microscopic.

However, it is of no importance for macroscopic processes. Disequilibria that are minute on macroscopic scales are able to overwhelm the fluctuations. For example, a decrease in free energy through the process of 25 kT will produce a ratio of probability densities favoring completion of exp(25) = 7.2 x 10^{10}. But 25 kT of energy is the mean energy of just 10 oxygen molecules. This is well within the awkward but familiar locution that reversible processes proceed by being "infinitesimally away from equilibrium."
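The arithmetic can be checked directly. The count of ten molecules here assumes the mean thermal energy (5/2)kT of a diatomic molecule such as O2 (three translational plus two rotational quadratic terms):

```python
import math

# Ratio of probability densities for a free-energy drop of 25 kT over
# the process: p(final)/p(initial) = exp(+25).
ratio = math.exp(25.0)
print(f"{ratio:.1e}")  # 7.2e+10, overwhelmingly favoring completion

# 25 kT is tiny on a macroscopic scale: an O2 molecule has mean thermal
# energy (5/2) kT, so 25 kT is the mean energy of just ten such molecules.
n_molecules = 25.0 / 2.5
print(n_molecules)  # 10.0
```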

Matters are different on the molecular scale. These disequilibria are fatal to the Landauer-Bennett orthodoxy of the thermodynamics of computation, which holds that only erasure is necessarily dissipative. A modest ratio of probability densities of exp(3) = 20 requires a decrease in free energy through the process of 3 kT. This corresponds to the creation of at least 3k of entropy, which exceeds the k ln 2 = 0.69k of entropy tracked by Landauer's Principle.
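The comparison with the Landauer bound can likewise be checked numerically (entropies expressed in units of k):

```python
import math

# A ratio of probability densities of exp(3) favoring completion
# requires a free-energy drop of 3 kT over the process ...
ratio = math.exp(3.0)
print(round(ratio))  # 20

# ... which, in an isothermal process, means at least 3k of entropy
# is created. Compare with the k ln 2 tracked by Landauer's Principle:
entropy_created = 3.0          # in units of k
landauer_bound = math.log(2)   # k ln 2 = 0.69 k (in units of k)
print(entropy_created / landauer_bound)  # about 4.3 times the Landauer bound
```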

See "All Shook Up: Fluctuations, Maxwell's Demon and the Thermodynamics of Computation," Part II for an elaboration of this result and the computation of examples. See Part I for an account of why I believe that all attempts to prove Landauer's Principle have failed.

Copyright John D. Norton. January 20, 2010. July 11, 2013.