Quantum physics formalizes any list of

In the limit of classical physics, where a system is formalized by some 2

So, entropy roughly measures (in logarithmic units) the size of the main pack of possible elementary states where a system is likely to be. Physical processes are reversible when they preserve the information of this pack (distinguishing it from its outside), thus when they preserve entropy. Large isolated systems may create entropy by diluting this pack, mixing states in and out and forgetting which is which. This explains the conservation of entropy by fundamental laws, and its possible creation (but non-elimination) in irreversible macroscopic processes. In non-isolated systems, processes can somewhere seem to shrink the pack of distinct microscopic states (the volume of phase space) by evacuating their multiplicity to the environment (final states may be identical here but distinct elsewhere).
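As a small numeric illustration of this "size of the pack" reading (a sketch, not part of the original text; the numbers are arbitrary): for a uniform distribution over *W* equally likely states, the Shannon entropy *S* = ∑_{i} *p*_{i} (−ln *p*_{i}) equals ln *W*, so *e*^{S} recovers the number of states in the pack.

```python
import math

def entropy(p):
    """Shannon entropy S = -sum p_i ln p_i (natural-log units)."""
    return -sum(q * math.log(q) for q in p if q > 0)

# A uniform "pack" of W equally likely elementary states:
W = 16
uniform = [1.0 / W] * W
S = entropy(uniform)

# e^S recovers the size of the pack:
print(S, math.exp(S))  # ln(16) and 16
```

For non-uniform distributions, *e*^{S} still behaves as an "effective" number of states, smaller than the full support.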

Entropy was defined in information theory by the formula *S* = ∑_{i} *p*_{i} (−ln *p*_{i}).

Still, this process has a sort of non-shrinking property (which will lead to the non-decrease of entropy), that looks like a time symmetry in the property of probabilities (not a real symmetry):

- Of course, for all *i*, ∑_{j} *m*_{ij} = 1: from a given initial state *i*, the sum of probabilities of all possible final states *j* is 1.
- But also, for all *j*, ∑_{i} *m*_{ij} = 1: for any final state *j*, the sum of probabilities for *j* to be reached by evolution from each possible initial state *i* is also 1.

*S* = ∑_{i,j} *m*_{ij} (−*p*_{i} ln *p*_{i})

(using ∑_{j} *m*_{ij} = 1), while the concavity of *x* ↦ −*x* ln *x* and ∑_{i} *m*_{ij} = 1 give −*p'*_{j} ln *p'*_{j} ≥ ∑_{i} *m*_{ij} (−*p*_{i} ln *p*_{i}) for each final state *j*, where *p'*_{j} = ∑_{i} *m*_{ij} *p*_{i}.

We conclude *S* ≤ *S'*: the entropy of an isolated system cannot decrease under such an evolution.
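This non-decrease is easy to check numerically. The following sketch (an illustration, not part of the original argument; the matrix values are arbitrary) evolves a probability distribution by a doubly stochastic matrix and compares entropies:

```python
import math

def entropy(p):
    """Shannon entropy S = -sum p_i ln p_i."""
    return -sum(q * math.log(q) for q in p if q > 0)

# A doubly stochastic matrix: every row AND every column sums to 1.
m = [[0.7, 0.2, 0.1],
     [0.2, 0.5, 0.3],
     [0.1, 0.3, 0.6]]
assert all(abs(sum(row) - 1) < 1e-12 for row in m)   # row sums
assert all(abs(sum(m[i][j] for i in range(3)) - 1) < 1e-12
           for j in range(3))                        # column sums

p = [0.8, 0.15, 0.05]                                # initial probabilities
p_new = [sum(m[i][j] * p[i] for i in range(3)) for j in range(3)]

# Entropy does not decrease under such an evolution:
print(entropy(p), entropy(p_new))
```

The mixing "dilutes the pack": here the final distribution is strictly closer to uniform, so its entropy is strictly larger.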

Another form of entropy creation in isolated systems comes from how things behave in practice rather than how they ideally are in theory: while the theory logically implies precise probabilities for the final state, real systems usually do not come with integrated super-computers giving exact predictions of these probabilities (and if we added one, it would likely produce more entropy in its operations than the entropy it was meant to avoid). Without precise predictions, we can only make gross approximations of the probabilities, thus effectively handle accessible probabilities with more entropy. The idea of such approximations is expressed by the concept of received entropy, discussed below.
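As a sketch of this idea (an illustration with arbitrary numbers, not from the original text): replacing exact probabilities by a grosser approximation, here by averaging them within groups of states we no longer resolve, can only increase the entropy, since such averaging is a special case of doubly stochastic mixing.

```python
import math

def entropy(p):
    """Shannon entropy S = -sum p_i ln p_i."""
    return -sum(q * math.log(q) for q in p if q > 0)

# Exact (hard-to-compute) probabilities over 6 elementary states:
exact = [0.40, 0.25, 0.15, 0.10, 0.06, 0.04]

# Gross approximation: we only resolve pairs of states, spreading
# each pair's total probability uniformly inside the pair.
approx = []
for i in range(0, 6, 2):
    avg = (exact[i] + exact[i + 1]) / 2
    approx += [avg, avg]

# The approximation carries more entropy than the exact distribution:
print(entropy(exact), entropy(approx))
```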

- As they meet by chance (without coordination between their states), they were initially uncorrelated: *S*_{1} + *S*_{2} = *S*.
- Entropy cannot decrease on the way: *S* ≤ *S'*.
- Their interaction may make them correlated when going apart: *S'* ≤ *S'*_{1} + *S'*_{2}.
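These inequalities can be illustrated numerically (a sketch with arbitrary values, not from the original text): start with two uncorrelated objects, apply a reversible interaction (a permutation of joint states, which preserves the joint entropy), and observe that the sum of marginal entropies then exceeds the joint entropy.

```python
import math

def entropy(p):
    """Shannon entropy S = -sum p_i ln p_i."""
    return -sum(q * math.log(q) for q in p if q > 0)

pA = [0.7, 0.3]          # probabilistic state of object A
pB = [0.6, 0.4]          # probabilistic state of object B

# Uncorrelated: joint probabilities are products, so S = S_A + S_B.
joint = [[a * b for b in pB] for a in pA]
S = entropy([x for row in joint for x in row])

# A reversible interaction permutes the joint elementary states
# (preserving S) but can create correlation: swap two joint states.
joint[1][0], joint[1][1] = joint[1][1], joint[1][0]
S_prime = entropy([x for row in joint for x in row])

# Marginal entropies now exceed the joint entropy: S' <= S'_A + S'_B.
pA2 = [sum(row) for row in joint]
pB2 = [sum(joint[i][j] for i in range(2)) for j in range(2)]
print(S, S_prime, entropy(pA2) + entropy(pB2))
```

Here *S'* = *S* (the permutation is reversible), yet anyone who only sees the two objects separately, through their marginals, attributes them the larger entropy *S'*_{1} + *S'*_{2}.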

The above inequalities present this increase as made of 2 contributions. But quantum physics does not distinguish them: the above described entropy creation for an isolated system (

The creation of entropy is due to the fact that the initial lack of entropy, carried by the correlation after interaction, becomes ineffective: as the correlated objects go apart, they have too little chance to meet again in conditions that would make this correlation observable. Instead, the next interacting objects will rather be other uncorrelated pairs, while existing correlations become progressively dispersed among more and more molecules that would have to be compared all together, which is harder and harder to decipher, and even impossible when some objects involved in a correlation escape the analysis.

Moreover, in quantum physics, the lack of entropy of the global system cannot always be fully described by separately measuring the states of components and analyzing the results as classical information; only a physical recombination (treating them as unobserved quantum information) might do it, but this is even harder to carry out.

1. The probability is a function of the conserved quantities.
2. It is the exponential of an affine function of the conserved extensive quantities, when the remaining conserved quantities are fixed; the linear part of this affine function is independent of the other conserved quantities.

**Proof of 2.** Consider a system of two objects *A* and *B*
which are uncorrelated (especially if they are far away from each
other), where a conserved quantity *E* takes values *E*_{1}, *E*_{2}
on two elementary states of *A*, and values *E'*_{1}, *E'*_{2}
on two elementary states of *B* (the other conserved quantities
staying fixed), such that *E*_{2}−*E*_{1} =
*E'*_{2}−*E'*_{1}. According to 1., both
states (1,2) and (2,1)
having the same value of the conserved quantity *E*_{1}+*E'*_{2}=*E*_{2}+*E'*_{1},
also have the same probability *p*_{1}
*p'*_{2}=*p*_{2}*p'*_{1}.
Thus, *p*_{2}/*p*_{1}=
*p'*_{2}/*p'*_{1}, i.e. *p*_{2}/*p*_{1}
only depends on *E*_{2}−*E*_{1}
throughout the environment, independently of the particular object
or states (with equal values of other conserved quantities). Thus,
(ln *p*) must be an affine function of conserved
quantities.
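A quick numeric check of this conclusion (a sketch, not part of the original proof; the energy values and temperature are arbitrary, and the exponential form *p* ∝ *e*^{−E/T} anticipates the Boltzmann distribution derived below): the ratio *p*_{2}/*p*_{1} depends only on the gap *E*_{2}−*E*_{1}, not on the individual energies or the particular object.

```python
import math

def boltzmann(energies, T):
    """Probabilities p_i proportional to exp(-E_i / T)."""
    w = [math.exp(-E / T) for E in energies]
    Z = sum(w)
    return [x / Z for x in w]

T = 2.0
# Two objects with different energy levels but the same gap E2 - E1 = 1.0:
pA = boltzmann([0.0, 1.0, 5.0], T)
pB = boltzmann([3.0, 4.0], T)

# Same ratio p2/p1 for both, depending only on the gap:
print(pA[1] / pA[0], pB[1] / pB[0], math.exp(-1.0 / T))
```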

Other important conserved quantities are the numbers of atomic nuclei of each type (as long as no nuclear reaction is happening). The component of (ln

Assuming each considered elementary state

*F*_{i} = *E*_{i} + *T* ln *p*_{i}

*F* = *E* − *TS* = ∑_{i} *p*_{i} *F*_{i}

*p*_{i} = *e*^{(*F*_{i}−*E*_{i})/*T*}

As ∑_{i} *p*_{i} d*F*_{i} = ∑_{i} *p*_{i} *T* d*p*_{i}/*p*_{i} = *T* (∑_{i} d*p*_{i}) = 0, we get d*F* = ∑_{i} *F*_{i} d*p*_{i}.

Thus the equilibrium condition (d*F* = 0 for all variations of the *p*_{i}) is that all *F*_{i} are equal.

When *F*_{i} > *F*_{j} and d*p*_{i} = −d*p*_{j} > 0 while the other variations of the *p*_{k} cancel (thus going away from equilibrium, because each *F*_{i} is an increasing function of *p*_{i}), we get d*F* > 0; thus the equilibrium is a minimum.

Then *F* = *F*_{i} = *E*_{i} + *T* ln *p*_{i} gives *p*_{i} = *e*^{(*F*−*E*_{i})/*T*}, and the value of *F* comes from ∑_{i} *p*_{i} = 1.

According to the above description of stable environments (not creating more entropy), any stable probabilistic state of an object, as well as of its environment, will follow the Boltzmann distribution of some temperature in some reference frame (for which this state is the one with minimal free energy), unless the conservation of another quantity is at stake (which might be ignored by looking at the configuration space of an object taken with a fixed value of that other conserved quantity, describing the case of an object that is isolated for that quantity).
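This variational characterization can be checked numerically (a sketch with arbitrary energies and temperature, not from the original text): the Boltzmann distribution *p*_{i} = *e*^{(F−E_{i})/T} achieves the minimum of *F* = ∑_{i} *p*_{i} (*E*_{i} + *T* ln *p*_{i}), and random alternative distributions never go below it.

```python
import math
import random

def free_energy(p, E, T):
    """F = sum_i p_i (E_i + T ln p_i) = <E> - T S."""
    return sum(pi * (Ei + T * math.log(pi))
               for pi, Ei in zip(p, E) if pi > 0)

E = [0.0, 1.0, 2.0]
T = 1.5

# Closed form: p_i = e^((F - E_i)/T), F fixed by normalization,
# i.e. p_i = e^(-E_i/T) / Z with F = -T ln Z.
w = [math.exp(-Ei / T) for Ei in E]
Z = sum(w)
boltz = [x / Z for x in w]
F_min = -T * math.log(Z)   # the common equilibrium value F = F_i

# Any other normalized distribution has a larger free energy:
random.seed(0)
for _ in range(1000):
    q = [random.random() for _ in E]
    s = sum(q)
    q = [x / s for x in q]
    assert free_energy(q, E, T) >= F_min - 1e-9

print(F_min, free_energy(boltz, E, T))
```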

By choosing energy laws for

But if we have another state

Thus, the sum of free energies for any fixed temperature

We conclude

Let us express the evolution of an object in interaction with its environment, by a matrix as above (

If the object starts in thermal equilibrium (

*p' _{j}* =

Consider a system in a probabilistic state

Consider an adiabatic transformation of the system, modifying the energies of elementary states while preserving their probabilities (we use the same labels for these states through their evolution) from

Final value of the free energy :

Actual average mechanical energy spent to reach it :

Effectively saved free energy in the process =

where

This scenario is the one that would preserve all existing free energy (by not creating any entropy) if the system really was in the probabilistic state

In practice, systems evolve by transformations which are not
adiabatic, but which can be analyzed in terms where the final states
into which the initial elementary states evolve are probabilistic
combinations, with respective free energies replacing the role of
energies, by the role substitution commented on above. Looking at a
non-isolated system, the minimal amount of entropy creation among
all possible initial probabilistic states may be nonzero as well.

Next page : The simplest proof of the ideal gas law

Table of contents : Foundations of physics