Entropy: quantum definition and creation process

See previous texts on entropy:
Quantum physics is more time-symmetric than the general case of a classical probabilistic theory. The dual properties required of the transformation of the set of states, being affine and center-preserving, are both satisfied by this evolution (defined by a rotation). The physical evolution of a system can thus be mathematically defined as determined both forwards and backwards in time from any specified state at a given time. Both directions behave alike, and computing the "future of the past" (and vice versa) from any given present state gives back the same state.
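As a minimal numerical sketch of this reversibility (assuming Python with numpy, and representing the state by a 2x2 density matrix, the standard formalism), evolving a state forwards by a rotation and then computing its past gives back exactly the original state, and the center of the set of states is preserved:

import numpy as np

theta = 0.7  # arbitrary evolution "angle", chosen only for illustration
U = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # a rotation: a simple example of a unitary evolution

rho = np.array([[0.8, 0.3],
                [0.3, 0.2]])   # some state of a 2-state system (Hermitian, trace 1)

future = U @ rho @ U.conj().T              # evolve forwards in time
past_of_future = U.conj().T @ future @ U   # then compute the "past of the future"

print(np.allclose(past_of_future, rho))    # True: the original state is recovered exactly

center = np.eye(2) / 2
print(np.allclose(U @ center @ U.conj().T, center))   # True: the evolution is center-preserving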

As the laws of quantum physics are time-symmetric at a fundamental level, the irreversible process of entropy creation only emerges at the macroscopic level (or the mesoscopic level, that is, involving many particles but still far fewer than what is humanly visible). In other words, the very definition of how much entropy there is depends on how we express the state of a system by putting it in context (what we know of the system, how we distinguish a given system from the rest of the environment).

The nature of entropy

While entropy creation is not a fundamental process (as there is no irreversible process at the fundamental level), it is possible to give a definition of entropy, even for small (microscopic) systems.

The entropy of the state of a quantum system can be expressed in successively more precise ways, as follows. In fact, there is something fuzzy and relative in the definition of entropy, as these successive definitions don't always agree, and these discrepancies will progressively explain how entropy can be created.

If a system is in a pure state that we know, then when measuring the system along that same direction, we know in advance what the result will be. In this case, we have no prior ignorance about the result. The quantity of information still needed to report the result is zero: the entropy is zero.

In the case of a 2-state system, whose set of states is thus a sphere, the maximum possible entropy of this system is 1 bit, because the result of a complete measurement made on the system (one that would collapse it into a pure state) would take one bit (binary digit) of information to report.

To put it another way, a 2-state system can store at most one bit of information. If we store a bit there (which is only of interest insofar as the same information is not also stored elsewhere), then the environment does not know the value of this bit. This ignorance, if it treats both possibilities as equally likely, amounts to viewing the state of the system as the center of the sphere, halfway between both possibilities.
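These two extreme cases can be checked with the standard von Neumann formula S = -Tr(rho log2 rho); here is a minimal sketch, assuming Python with numpy:

import numpy as np

def von_neumann_entropy_bits(rho):
    # S(rho) = -Tr(rho log2 rho), computed from the eigenvalues; 0*log(0) is taken as 0
    p = np.linalg.eigvalsh(rho)
    p = p[p > 1e-12]
    return float(0.0 - (p * np.log2(p)).sum())

pure = np.array([[1.0, 0.0],
                 [0.0, 0.0]])    # a known pure state: the measurement result is certain
center = np.eye(2) / 2           # the center of the sphere: both results equally likely

print(von_neumann_entropy_bits(pure))     # 0.0 : no missing information
print(von_neumann_entropy_bits(center))   # 1.0 : one full bit is missing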

However, a state that is neither pure nor at the center has its entropy somewhere in between. So, a quantity of information can be a number of bits somewhere between zero and one bit. This happens for a bit whose two values don't have the same probability. Indeed, if you have a file of many such bits, then there is a way to compress the file that gives it a shorter average length (most often a shorter length, though there is a small risk of making it longer).
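Here is a minimal sketch of this intermediate case, computing the entropy of a biased bit with the standard binary entropy formula (assuming Python with numpy):

import numpy as np

def bit_entropy(p):
    # entropy, in bits, of a bit equal to 1 with probability p and to 0 with probability 1 - p
    if p in (0.0, 1.0):
        return 0.0            # a certain value carries no missing information
    return float(-p * np.log2(p) - (1 - p) * np.log2(1 - p))

print(bit_entropy(0.5))   # 1.0   : both values equally likely
print(bit_entropy(0.9))   # 0.469 : a biased bit, so a long file of such bits is compressible
print(bit_entropy(1.0))   # 0.0   : the value is already known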

Quantities of information can be measured in any base. We are familiar with decimal expressions for numbers, while computers work with bits or bytes, where 1 byte (which specifies a number among 256 = 2^8 possibilities) = 8 bits. As n bits can specify one possibility among N = 2^n, this means that one possibility among N (a number between 0 and N-1) is specified by a number n of bits equal to the binary logarithm of N. But if written in decimal form, or in any other base, the logarithm should be taken in that base. So, finally, a system whose state is one among N equiprobable possibilities has entropy log(N) in the chosen base (with the natural logarithm, ln(N), the unit used below). But if the possibilities have different probabilities then the entropy is lower.
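A small sketch of this change of base, computing log(N) in various bases (assuming Python with numpy):

import numpy as np

def equiprobable_entropy(N, base=2.0):
    # entropy of one possibility among N equally likely ones: the logarithm of N in the chosen base
    return float(np.log(N) / np.log(base))

print(equiprobable_entropy(2))         # 1.0   : one bit
print(equiprobable_entropy(256))       # 8.0   : one byte = 8 bits
print(equiprobable_entropy(256, 10))   # 2.408 : the same quantity in decimal digits
print(equiprobable_entropy(3, np.e))   # 1.099 = ln(3), in natural units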

For example, consider a system that may be in one of 3 states with probabilities 1/2, 1/4 and 1/4.
Its entropy is 1.5 bits, because it takes one bit to specify whether it is in the first state or in one of the other 2 states, and then there is a 50% chance that a second bit is needed to distinguish between the last 2 states. But if the probabilities were all 1/3 then the second bit would be needed more often, hence a bigger entropy: taking account of the real probabilities of 1/3 each with a ternary digit gives an entropy ln(3) = 1.099, while the improper representation of 3 equiprobable states by this one-bit-and-a-half code has an average description length of (1/3)ln(2) + (2/3)ln(4) = 1.155, and one bit and a half in its proper case of probabilities 1/2, 1/4, 1/4 has entropy (3/2)ln(2) = 1.040.
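The numbers of this example can be checked directly; here is a minimal sketch assuming Python with numpy:

import numpy as np

def entropy(probs, base=2.0):
    # Shannon entropy of a discrete distribution, in units fixed by the base of the logarithm
    p = np.asarray(probs, dtype=float)
    return float(-(p * np.log(p)).sum() / np.log(base))

proper  = [1/2, 1/4, 1/4]
uniform = [1/3, 1/3, 1/3]
lengths = [1, 2, 2]            # the code 0, 10, 11: one bit, then sometimes a second one

print(entropy(proper))           # 1.5 bits
print(entropy(proper, np.e))     # (3/2) ln 2 = 1.040 nats
print(entropy(uniform, np.e))    # ln 3 = 1.099 nats
avg_bits = sum(p * l for p, l in zip(uniform, lengths))   # 5/3 bits: the 1.5-bit code used improperly
print(avg_bits * np.log(2))      # (1/3) ln 2 + (2/3) ln 4 = 1.155 nats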

How is entropy created

While entropy creation is not a fundamental process, it happens in practice at the macroscopic scale, for reasons that become more and more significant as the consideration is extended to larger and larger scales (larger and larger systems of many particles with a large space available for their movement, which are better and better approximations of our large universe), and thus to larger and larger numbers of possible states. But it can already be understood in a fuzzy sense in the case of the 2-state systems we described, in the following ways:

