Entropy: quantum definition and creation process
See previous texts on entropy.
Quantum physics is more time-symmetric than the general case of a classical probabilistic
theory. The dual properties for the transformation of the set of states,
of being affine and center-preserving, are both satisfied by
this evolution (defined by a rotation). The physical evolution of a
system can thus be mathematically determined both forwards and
backwards in time from any specified state at a given time. Both
directions behave similarly, and computing the "future of the past" (or vice
versa) from any given present state gives back that same state.
As the laws of quantum physics are time-symmetric at a fundamental level, the
irreversible process of entropy creation only emerges at the macroscopic
level (or the mesoscopic level, that is, involving many particles but still far fewer than
what is humanly visible). In other words, the very definition of how much entropy there is
depends on how we express the state of a system by putting it in context (what we know
of the system, how we distinguish a given system from the rest of the environment).
The nature of entropy
While entropy creation is not a fundamental process (as there is no
irreversible process at the fundamental level), it is possible to
give a definition of entropy, even for small (microscopic) systems.
The entropy of the state of a quantum system, can be expressed in
successively more precise ways, as follows:
- The measure of how impure its state is: the entropy of a pure
state is zero, while others have a positive entropy; in each
quantum n-state shape, the maximum entropy is reached at the
center;
- The extent of our ignorance about its exact state;
- The average quantity of information that is necessary to
specify the exact state, in the most compressed form of this
information.
In fact, there is something fuzzy and relative in the definition of
entropy, as the above 3 definitions don't always agree, and these
discrepancies will progressively explain how entropy can be
created.
If a system is in a pure state that we know, then when measuring the
system in the same direction, we know in advance what the result
will be. In this case, we have no prior ignorance about the
result. The quantity of information still needed to report
the result is zero: the entropy is zero.
In the case of a 2-state system, whose set of states is thus a sphere,
the maximum possible entropy is 1 bit,
because the result of a complete measurement made on
the system (which would collapse it into a pure state) would
take one bit (binary digit) of information.
To put it another way, a 2-state system can store at most one bit of
information. If we store a bit there, then, as this is only
interesting insofar as the same information is not also stored
elsewhere, the environment does not know the value of this bit.
This ignorance, if it regards both possibilities as equally
likely, amounts to viewing the state of the system as the center
of the sphere, halfway between both possibilities.
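The correspondence between purity and entropy can be sketched numerically. In the following minimal Python sketch, a state of a 2-state system is represented (as an assumption for illustration) just by its distance r from the center of the sphere of states, whose density matrix has eigenvalues (1 ± r)/2; the function name is hypothetical.

```python
import math

def qubit_entropy_bits(r):
    """Entropy (in bits) of a 2-state system whose state sits at distance
    r (0 <= r <= 1) from the center of the sphere of states.
    The corresponding density matrix has eigenvalues (1 + r)/2 and (1 - r)/2."""
    s = 0.0
    for x in ((1 + r) / 2, (1 - r) / 2):
        if x > 0:  # a zero eigenvalue contributes nothing
            s -= x * math.log2(x)
    return s

print(qubit_entropy_bits(1.0))  # pure state (on the sphere): 0 bits
print(qubit_entropy_bits(0.0))  # center of the sphere: 1 bit, the maximum
print(qubit_entropy_bits(0.5))  # in between: entropy strictly between 0 and 1
```

As stated in the text, entropy is zero exactly on the sphere (pure states) and maximal, 1 bit, at the center.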
However, a state that is neither pure nor at the center has its
entropy somewhere in between. So a quantity of information can be a
number of bits between zero and one. This happens
for a bit whose two values don't have the same probability.
Indeed, if you have a file of many such bits, then there is a way to
compress the file that gives it a shorter average length (a most
often shorter length, though there is a small risk of making it
longer).
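This fractional quantity of information is the binary entropy function of the bit's probability; here is a minimal Python sketch (the function name is an assumption for illustration):

```python
import math

def binary_entropy_bits(p):
    """Shannon entropy (in bits) of a single bit that equals 1 with
    probability p: this is the average compressed length per bit of a
    long file of such independent bits."""
    if p in (0.0, 1.0):
        return 0.0  # a certain bit carries no information
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(binary_entropy_bits(0.5))  # 1.0: a fair bit carries a full bit
print(binary_entropy_bits(0.9))  # ~0.469: a biased bit carries less
```

A file of n independent bits each with probability 0.9 can thus be compressed to about 0.469·n bits on average, but no further.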
Quantities of information can be measured in any basis.
We are familiar with decimal expressions for numbers, while
computers work with bits or bytes, where 1 byte (which
specifies a number among 256 = 2^8 possibilities) = 8
bits. As n bits can specify a possibility among N = 2^n,
this means that a possibility among N (a number between 0 and N−1)
is specified by a number n of bits equal to the binary logarithm of
N. But if written in decimal form, or in any other basis, the
logarithm should be taken in that other basis. So, finally, a
system whose state ranges among N equiprobable
possibilities has entropy ln(N) (taking the natural logarithm as
unit). But if the possibilities have different probabilities then
the entropy is lower.
For example, consider a system that may be in one of 3 states with
probabilities 1/2, 1/4 and 1/4.
Its entropy is 1.5 bits, because it takes one bit to specify
whether it is in the first state or in either of the other 2
states, and then there is a 50% chance that another bit is needed to
distinguish between the last 2 states. But if the probabilities were
all 1/3 then the second bit would more likely be required, hence a
bigger entropy: taking account of the real probabilities of 1/3 each
with a ternary digit gives an entropy ln(3) = 1.098, while the
improper representation by one bit and a half of 3 equiprobable
states has average length (1/3)ln(2) + (2/3)ln(4) = 1.155, and one
bit and a half in its proper case of probabilities 1/2, 1/4, 1/4
has entropy (3/2)ln(2) = 1.039.
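These numbers (all in natural-logarithm units, nats) can be checked with a minimal Python sketch; the two helper names are assumptions for illustration:

```python
import math

def entropy_nats(probs):
    """Shannon entropy, in nats, of a probability distribution."""
    return -sum(p * math.log(p) for p in probs if p > 0)

def avg_code_length_nats(probs, lengths_bits):
    """Average code length in nats when state i is encoded with
    lengths_bits[i] binary digits (1 bit = ln 2 nats)."""
    return sum(p * n for p, n in zip(probs, lengths_bits)) * math.log(2)

# Probabilities 1/2, 1/4, 1/4: entropy (3/2)ln 2 ~ 1.039 nats,
# matched exactly by the code with lengths 1, 2, 2 bits.
print(entropy_nats([0.5, 0.25, 0.25]))
print(avg_code_length_nats([0.5, 0.25, 0.25], [1, 2, 2]))
# Three equiprobable states: entropy ln 3 ~ 1.098 nats, but the same
# 1-or-2-bit code now averages (1/3)ln 2 + (2/3)ln 4 ~ 1.155 nats.
print(entropy_nats([1/3, 1/3, 1/3]))
print(avg_code_length_nats([1/3, 1/3, 1/3], [1, 2, 2]))
```

The gap between 1.155 and 1.098 is the cost of using a code mismatched to the actual probabilities.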
How entropy is created
While entropy creation is not a fundamental process, it happens in
practice at the macroscopic scale, for reasons that become more and
more significant as the consideration is extended to larger and
larger scales (larger and larger systems of many particles
with large available space for their movement, which are better
and better approximations of our large universe), thus
with larger numbers of possible states. These reasons can already be
understood, in a fuzzy sense, in the case of the 2-state systems we
described, in the following ways:
- A pure state is in one direction of the sphere of states,
while future measurements can be done in other directions,
making the result uncertain and making irrelevant the fact that
the state was pure; even without a formal deliberate
measurement, many chaotic processes will happen where the
purity of the state won't be in the useful direction, making the
state behave in practice like an impure state, for similar
reasons.
- "The equator is larger than the poles", so that a pure state
anywhere on the sphere has more chances to be near the
"equator" of the future measurement direction, making the
measurement result more often unpredictable than
predictable, even starting from a pure state.
- Even though the evolution is deterministic, it may be very
impractical to effectively compute the prediction so as to
really "know what we can know", as such a computation would
involve a much heavier computer than the object whose evolution
is predicted. Namely, computing the rotation of a point on a
sphere around a direction (to represent 1 bit of quantum
information) takes much more than an elementary logical
operation on one bit of information. Thus, even something
"predictable" can be treated as random in practice, for lack of
means to carry out the prediction and make use of it.
- Even if we knew where the pure state of a system is, we may
not have the practical means to use it by orienting the
measurement (or the process that makes use of the purity of the
state) in the right direction.
- An evolution starting from a state in a measurable direction
and just rotating would only come back to the initial state
after one whole turn for a 2-state system, and less readily for
more states. To make it more practical to come back to the
initial state, one might have thought of reversing the evolution.
However, the evolution is determined by the energy function, and
negating the sign of the energy to let the rotation go
backwards is not an available symmetry in the general
case. The time orientation is related to the orientation of the
sphere of states, so that the time symmetry (which the
fundamental laws have) could only be applied to the states
(to bounce a past evolution back to its
initial state with no need to measure or predict it) by an
orientation-reversing process; but such a process cannot exist,
as evolutions are always expressed by rotations, which preserve
the orientation of the sphere of states (this was somewhat
explained in the case of correlations, which always reverse
orientation).
- A pure state can evolve into a pure state of correlation
between several components. If one of these components is lost
in the environment (by the same means as entropy can be
evacuated, such as an infrared photon emitted from Earth into
outer space), then the rest of the system is left in an impure
state (higher entropy); the initial purity of the state,
which now takes the form of a quantum correlation between
the system and its environment, cannot be used anymore. More
generally, any process of entropy evacuation or exchange
with the outside (which includes the case of receiving as much
as emitting, thus leaving the system "stable" at some
temperature) can contribute this way to bringing "locally
absolute" entropy to a system (provided that
correlations with the environment are disregarded), thus
changing effective entropy (which was not absolute) into locally
absolute entropy at about the same rate as this entropy
exchange with the environment.
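The "equator is larger than the poles" argument above can be illustrated numerically: for a pure state drawn uniformly on the sphere of states and measured along a fixed axis, a state at polar angle θ gives the "up" result with probability (1 + cos θ)/2, and the average unpredictability of the result works out to 1/(2 ln 2) ≈ 0.72 bits. A minimal Python Monte Carlo sketch (function names are assumptions for illustration):

```python
import math
import random

def binary_entropy_bits(p):
    """Entropy (bits) of a measurement whose outcomes have probabilities p, 1-p."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def average_outcome_entropy(n, seed=0):
    """Monte Carlo estimate of the average unpredictability (in bits) of a
    fixed-direction measurement on a pure state drawn uniformly from the
    sphere of states."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        cos_theta = rng.uniform(-1.0, 1.0)   # uniform height = uniform on sphere
        p_up = (1 + cos_theta) / 2           # probability of the "up" result
        total += binary_entropy_bits(p_up)
    return total / n

print(average_outcome_entropy(100_000))  # close to 1/(2 ln 2) ~ 0.72 bits
```

So even starting from a perfectly pure (zero-entropy) state, a measurement in an uncorrelated direction yields on average about 0.72 bits of irreducible surprise, which is the fuzzy seed of entropy creation described above.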
Related pages
Set Theory and foundations
homepage