Let us introduce entropy in its classical view, that is, how it behaves
for macroscopic observers.
The behavior of entropy can be described by comparison to that of energy.
Energy can be transferred between systems but is always globally preserved.
In some kinds of processes, entropy can be transferred between systems while
being preserved, like energy. These processes are called reversible,
which means that they can also take place backwards, symmetrically
with respect to time. Other processes happen to create some
entropy (its total amount increases). But no process can ever eliminate
entropy, which is why the processes that create it are irreversible.
This approach by macroscopic physics (that of human scale, or
generally non-microscopic scales; unfortunately the only one
presented in many courses of thermodynamics) remains unsatisfactory,
as it leaves the nature of entropy, its creation process and its irreversibility,
looking like mysteries.
In the fundamental laws (quantum physics without
measurement) that describe elementary (microscopic) processes, entropy
has a clear definition, but all processes are reversible, so that this defined
entropy is preserved. Thus, the process of entropy creation is understood
as an emergent process, which "occurs" only relative to the approximation
of how things can be summed up in practice when they involve rather
disordered large groups of particles. This approximation operation
affects the conception of effective states of the system at successive
times, and thus the successive values of entropy that are calculated
from these effective states. Another form of this emergent process
of entropy creation will be quantum decoherence, which is the circumstance
usually required to qualify a process as a measurement in quantum
physics. These deeper explanations of the microscopic definition and
creation of entropy will be presented in the next pages.
There is a maximum amount for the entropy of any material
system evolving within given limits of volume and energy. A body that
has reached its maximal amount of entropy within these limits is said to be in
thermal equilibrium, a state often determined by these conditions
(material content, volume and energy).
To each system in thermal equilibrium is also attributed another important
physical quantity called its temperature, defined as follows.
Entropy is usually not transferred alone, but together with an amount of
energy. A mixture of amounts of energy and
entropy that can flow from one system to another is an amount of heat.
(Energy and entropy are not like distinct objects that move, but rather like
2-substance fluids "mixing" themselves by diffusion during contacts, where
only the resulting variations of amounts on each side matter.)
Heat can be transferred in several ways: by direct contact or by radiation.
dE = T dS − P dV
where T dS is the energy received as heat and −P dV is the energy received from the work of pressure. This can be pictured by an economic analogy:

Thermodynamics | Garbage market
entropy | mass of garbage
energy | money
−temperature | negative price of garbage
As the flow of heat must preserve energy but can create entropy, it can only go from "warm" objects (with higher temperature, whose entropy decreases by a smaller amount for the transferred energy) to "cold" ones (with lower temperature, which gain more entropy for this energy). Thus, an amount of heat increases its entropy content when it reaches the object with lower temperature.
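As a numeric illustration (a minimal sketch; the heat amount and temperatures are arbitrary choices, not from the text), heat Q leaving a body at temperature T_hot removes entropy Q/T_hot from it, while adding Q/T_cold to the receiving body:

    # Entropy created when heat Q flows from a hot body to a cold one.
    # Arbitrary illustrative values (not from the text).
    Q = 100.0       # joules of heat transferred
    T_hot = 400.0   # kelvins
    T_cold = 300.0  # kelvins
    dS_hot = -Q / T_hot      # entropy lost by the hot body
    dS_cold = Q / T_cold     # entropy gained by the cold body
    print(dS_hot + dS_cold)  # net entropy created, ~0.083 J/K, positive
    # It vanishes only in the limit T_hot -> T_cold (reversible transfer).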
Usually, any transfer of entropy between systems has a cost: it is an irreversible process, which itself creates more entropy. For example, near a given temperature, flows of heat are roughly proportional to the difference of temperature between bodies. To make them faster, the difference of temperature must increase, so that the transfer creates more entropy. Likewise, a release of heat makes the environment temporarily warmer, which makes this release more costly. This cost can be reduced (approaching reversibility) by slowing down the transfer.
Heat flows from the warm to the cold by the fact that warm bodies send out their heat faster than cold ones. So, heat transfer is faster at higher temperatures, already in terms of energy, but also usually in terms of entropy (a possible speed which can be traded against producing less entropy). In particular, the radiation from warmer objects has both more energy and more entropy, as we shall see below. In the limit, pure energy (which can be seen as heat with infinite temperature) can often be transferred reversibly.
For entropy-creating processes of life (and machines) to continue their work, they need to transfer their entropy away. As this can usually only happen carried by energy in the form of heat, these systems need to receive pure energy (or warmer energy, with less entropy) in return. The purity of the received energy is what makes it useful, unlike the very abundant heat energy in the environment. Still, the release of heat in sufficient flow can also be an issue, which is why, for example, power plants need to be near rivers to release their heat in the water.
For example, life on Earth involves many irreversible processes,
which continuously create entropy. As there is a limit in the amount of
entropy that can be contained in given limits of volume and energy, the
stability of this quantity around average values far below this
maximum (to let life continue) is made possible by the continuous transfer
of the created entropy from Earth to outer space, in the form of infrared
radiation (which carries much more entropy than sunlight in proportion to
its amount of energy, because it is colder).
This radiation then crosses
interstellar space and mainly ends up in intergalactic space. Thus, the
development of life is fed not only by sunlight energy (heat with high
temperature) but also by the ever larger and colder intergalactic space,
which the universal
expansion provides as a huge bin for entropy. Both are
complementary, just as two markets with different prices provide an
opportunity for profit by trading between them.
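Using the blackbody relation given further below (E = (3/4)ST, i.e. S/E = 4/(3T)), one can estimate how much more entropy Earth's infrared carries per unit of energy than the incoming sunlight. A sketch with standard round temperatures (assumed, not from the text):

    # Entropy per unit energy of blackbody radiation: S/E = 4/(3T).
    T_sun = 5800.0    # K, solar surface (assumed round value)
    T_earth = 255.0   # K, Earth's effective radiating temperature (assumed)
    ratio = (4 / (3 * T_earth)) / (4 / (3 * T_sun))  # = T_sun / T_earth
    print(ratio)  # -> ~22.7: re-emission as infrared multiplies the entropy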
Still, all the entropy of visible and infrared light from stars and planets is only a tiny part of the entropy in the universe. Among electromagnetic radiation alone, the cosmic microwave background already has an energy comparable to visible and infrared light (1), and thus much more entropy (ignoring the entropy of practically undetectable particles: dark matter, neutrinos, gravitons...).
But most of the entropy of the universe is made of the giant black holes in galactic centers. Indeed, the fall of matter into black holes, contributing to the growth of their size and thus of their entropy (proportional to the area of their horizon), is among the most radically irreversible processes of the Universe (one that will only be "reversed", after quite unreasonable times, by "evaporation" in a much, much colder universe).
The amount of substance counts the very large number of
atoms or molecules contained in macroscopic objects.
Thus its deep meaning is that of natural numbers, but too big
for the unit number (an individual atom or molecule) to be of any
significance.
This concept comes from chemistry, as chemical
reactions involve ingredients in precise proportions to form molecules
containing the right numbers of atoms (this was first an observed
fact at the beginning of the 19th century, until its explanation in terms
of atoms was clearly established later that century).
The conventional unit for amounts of substance is the mole (mol): 1 mol
means NA molecules, where the number NA ≈ 6.022×10²³ is the Avogadro constant.
Thus, n mol of some pure substance contains n×NA molecules of this substance.
This number comes from the choice that 1 mol of carbon-12 weighs
12 grams (thus roughly, 1 mol of hydrogen atoms weighs 1 gram =
0.001 kg, with a slight difference due to the nuclear binding energy,
converted into mass by E=mc²).
It can be seen as the quantity NA ≈ 6.022×10²³ mol⁻¹.
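As a minimal sketch of these conversions (the substance, water, and the mass are arbitrary example choices):

    # Converting between mass, amount of substance and number of molecules.
    N_A = 6.022e23    # Avogadro constant, per mol
    M_water = 18.015  # molar mass of water, g/mol
    mass = 100.0      # grams of water (arbitrary example)
    n = mass / M_water      # amount of substance in mol
    molecules = n * N_A     # number of molecules
    print(n, molecules)     # -> ~5.55 mol, ~3.34e24 molecules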
As we shall later deduce from the nature of entropy, gases are subject in good approximation to the ideal gas law. This approximation holds when their density is much lower than in the liquid phase, so that each item of gas (atom or molecule) spends on average much more time freely moving than significantly interacting with its neighbors.
The ideal gas law is PV = nRT.
The quantity PV is homogeneous to an energy (with
conventional unit J = joule): the energy needed to push a volume
V of gas (the volume swept by the push) at its pressure
P. This is also (2/3)E, where E is the kinetic
energy of just the translational motion of gas molecules (ignoring their energy of
rotation and other internal motions), provided this speed is much smaller
than the speed of light. Here, in E = (3/2)PV, the factor
3 comes from the number of space dimensions and the factor 1/2
comes from the formula of kinetic energy.(²)
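For instance (a sketch with standard round values for room temperature and atmospheric pressure):

    # Ideal gas law PV = nRT, and translational kinetic energy E = (3/2) PV.
    R = 8.314       # gas constant, J/(mol K)
    n = 1.0         # mol
    T = 298.0       # K, room temperature (assumed example)
    P = 101325.0    # Pa, atmospheric pressure (assumed example)
    V = n * R * T / P    # -> ~0.0245 m^3, i.e. ~24.5 liters
    E = 1.5 * P * V      # translational kinetic energy, ~3700 J
    print(V, E)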
In the ideal gas law, the gas constant R = 8.314 J mol⁻¹ K⁻¹ is the natural conversion constant by which the temperature T (expressed in kelvins), in the form of the product RT, is physically involved as a composite of other physical quantities. This gas constant is never far from any phenomenon involving temperature, even for solids instead of gases, so that temperature (and thus entropy) only conventionally has its own unit, while its true physical nature is that of a composite of other physical quantities.
Namely, the ideal gas law presents the physically meaningful expression RT of the temperature as an energy per amount of substance (which explains the units involved in the value of R). It also reduces entropy (initially expressed in J/K) to a quantity comparable with an amount of substance.
Indeed, consider the heat capacity of an object: the energy it must receive as heat per unit of temperature increase (in J/K). For some kinds of ordinary matter across some ranges of temperature, the heat capacity divided by R is rather stable (independent of the temperature), near a value that is the product of the amount of substance by some precise number (usually half an integer) which represents the possible degrees of freedom involved in thermal agitation. For example, the heat capacity of water at 25°C, divided by R, is 8.965 times its amount of substance (of water molecules). This number is close to 9 = 3×3, that is 3 atoms per molecule times 3 space dimensions, as each atom can move in 3 dimensions.
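A quick check of this figure (assuming it refers to the constant-volume molar heat capacity of liquid water, about 74.5 J/(mol·K), a standard tabulated value not given in the text):

    # Heat capacity of water per mole, expressed in units of R.
    R = 8.314    # J/(mol K)
    C_v = 74.5   # J/(mol K), constant-volume molar heat capacity of
                 # liquid water (assumed tabulated value, not from the text)
    print(C_v / R)  # -> ~8.96 effective degrees of freedom, close to 9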
As long as the heat capacity C of an object is stable, the
variations of entropy, obtained by integrating dS = C·dT/T, are those of C·ln T.
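For instance (a sketch with an arbitrary constant heat capacity and arbitrary temperatures):

    from math import log

    # Entropy change when heating an object with constant heat capacity C:
    # dS = C dT / T  integrates to  S2 - S1 = C ln(T2/T1).
    C = 75.0              # J/K, arbitrary constant heat capacity
    T1, T2 = 300.0, 330.0 # K, arbitrary initial and final temperatures
    dS = C * log(T2 / T1)
    print(dS)             # -> ~7.15 J/K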
Such cases mainly occur only around familiar ranges of temperatures,
in successive temperature intervals, each with a different value of heat
capacity. But other situations may occur: the heat capacity may abruptly
change during phase transition (between solid, liquid and gas), and
smoothly change in some other cases.
The hotter an object is, the more its
matter is divided into smaller components able to move independently
of each other (from molecules to atoms to individual particles in plasmas),
and thus the higher its heat capacity. At higher temperatures, increasing
temperature by a given additive value requires more energy; thus,
increasing temperature by a given multiplicative factor involves a larger
increase of entropy.
Near the absolute zero of temperature, a logarithmic function (as entropy is when the heat capacity is constant) would decrease without bound. But entropy cannot decrease infinitely. Instead, the third law of thermodynamics states that for any system there exists an absolute zero of entropy, thus an absolute definition of the entropy of the state of any system, as a quantity that always remains positive. This zero entropy is often reached at the absolute zero of temperature by perfect crystals and some other systems kept in limited volumes, but not always: in addition to gases expanded in unlimited volumes, a few other substances keep a positive entropy near zero temperature, called residual entropy.
So, at very low temperatures, the heat capacity of an object must converge to zero too (this is not an exact logical consequence, but it happens in practice), thus becoming much less than its number of molecules: atoms and simple molecules no longer move individually, but only collectively or scarcely. This will be explained by the nature of entropy and its foundation on quantum physics. Let us introduce a first approach, with the case of thermal radiation.
In the case of an ideally black object (absorbing the light of all wavelengths), the radiation for every temperature T combines an energy flow proportional to T⁴ with an entropy flow proportional to T³. In a volume V, the energy E and entropy S of radiation are related by E = (3/4)ST, where the coefficient 3/4 < 1 comes from T = dE/dS = (E/S)·(d(T⁴)/T⁴)/(d(T³)/T³) = (4/3)(E/S).
The exact values are
E/V = (π²/15)(kT)⁴/(cℏ)³, where π²/15 ≈ 0.65797,
S/V = (4π²/45) k⁴T³/(cℏ)³, where 4π²/45 ≈ 0.8773.
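These densities can be evaluated numerically (a sketch with rounded physical constants; the temperature, near the solar surface value, is an arbitrary example), also checking the relation E = (3/4)ST:

    from math import pi

    # Energy and entropy densities of blackbody radiation at temperature T.
    k = 1.380649e-23     # J/K, Boltzmann constant
    hbar = 1.054572e-34  # J s, reduced Planck constant
    c = 2.99792458e8     # m/s, speed of light
    T = 5800.0           # K, arbitrary example (about the solar surface)
    E_V = (pi**2 / 15) * (k * T)**4 / (c * hbar)**3       # J/m^3
    S_V = (4 * pi**2 / 45) * k**4 * T**3 / (c * hbar)**3  # J/(K m^3)
    print(E_V, S_V)
    print(E_V / (S_V * T))   # -> 0.75, checking E = (3/4) S T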
In particular, sunlight contains more entropy than it took from the Sun: the further amount is created by the process of light emission at the Sun's surface, which is irreversible due to the contrast of the emitted light with the surrounding darkness.
The heat capacity of matter starts being dominated by the heat capacity of the radiation inside matter when the density of photon number becomes comparable to that of other particles. In the center of the Sun, T = 15 million K corresponds, through kT = cℏ/x, to x = 1.5 Å. This radiation has a heat capacity per volume similar to that of a medium at this "usual" density and thus inter-particle distance; its pressure is also similar to that of a plasma at this particle density and this temperature. However, a plasma at this temperature has no reason to stay close to this density anymore; the density at the Sun's center is 162.2 g/cm³, so that the heat capacity and pressure of electrons and nucleons still dominate there. Only in bigger stars, with higher temperatures in the core, can this radiation gain an importance comparable to ordinary particles, and thus play a crucial role in stellar stability.
In particular, the temperature above which space is filled with electrons and positrons is (from the Google calculator expression "electron mass/2*c^2/k=") 3 billion K. Such temperatures correspond to times until a few seconds after the Big Bang (3), are just touched as factors of pressure reduction in exceptional stars and supernovae, and are only exceeded afterwards, but too late to still matter ("a newly formed neutron core has an initial temperature of about 100 billion K").
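Both quoted figures can be checked directly (a minimal sketch with rounded physical constants):

    # Checking the two figures quoted above.
    k = 1.380649e-23     # J/K
    hbar = 1.054572e-34  # J s
    c = 2.99792458e8     # m/s
    m_e = 9.109384e-31   # kg, electron mass

    T_sun_core = 15e6    # K
    x = c * hbar / (k * T_sun_core)
    print(x)             # -> ~1.5e-10 m = 1.5 angstroms

    T_pairs = m_e / 2 * c**2 / k
    print(T_pairs)       # -> ~3e9 K, electron-positron pair threshold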
Two other useful quantities combine energy, entropy and temperature: F = E − T₀S (with T₀ a fixed reference temperature, such as that of the environment) and A = E − TS. Their differentials follow from dE = T dS − P dV:
F = E − T₀S
A = E − TS
dF = (T − T₀)dS − P dV
dA = −S dT − P dV
d(A/T) = d(E/T − S) = E d(1/T) + (dE − T dS)/T = E d(1/T) − P dV/T
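A symbolic check of dA = −S dT − P dV (a sketch using one mole of monatomic ideal gas, whose standard expressions for E, S and P are assumed; s0 is an arbitrary entropy offset):

    import sympy as sp

    # Check dA = -S dT - P dV for one mole of monatomic ideal gas.
    T, V, R, s0 = sp.symbols('T V R s0', positive=True)
    E = sp.Rational(3, 2) * R * T                    # internal energy
    S = R * (sp.log(V * T**sp.Rational(3, 2)) + s0)  # entropy, up to a constant
    P = R * T / V                                    # pressure
    A = E - T * S                                    # free energy
    print(sp.simplify(sp.diff(A, T) + S))  # -> 0, i.e. dA/dT = -S
    print(sp.simplify(sp.diff(A, V) + P))  # -> 0, i.e. dA/dV = -P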
The special interest of these quantities will appear when explaining the nature of entropy.

(²) The calculation comparing the kinetic energy and the pressure contributed in a 1-dimensional space by a particle at any speed between 0 and the speed of light goes this way (in units where c = 1):
Ec = E − m (kinetic energy)
v = p/E
P = pv (pressure contribution)
E² − p² = m²
Squaring m = E − Ec: m² = E² + Ec² − 2E·Ec
Comparing both expressions of m²: p² + Ec² = 2E·Ec
P = pv = p²/E, hence E = p²/P
Substituting: p² + Ec² = 2·Ec·p²/P
Ec/P = (p² + Ec²)/2p² = (1 + (Ec/p)²)/2
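A numeric check of this identity (arbitrary mass and momentum, in units where c = 1):

    from math import sqrt

    # Check Ec/P = (1 + (Ec/p)^2)/2 for a relativistic particle (c = 1).
    m, p = 1.0, 0.7        # arbitrary mass and momentum
    E = sqrt(p**2 + m**2)  # total energy
    Ec = E - m             # kinetic energy
    P = p**2 / E           # pressure contribution (1-dimensional)
    print(Ec / P, (1 + (Ec / p)**2) / 2)  # -> equal values, ~0.55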
(3) A time one can deduce from the energy density of that temperature's radiation and the gravitational constant, via the Google calculator expression
sqrt((hbar)^3/((electron mass/2)^4*(2.5*8*pi*G/3)*c^3))/2 = 13.45 seconds
where /2 is because t = 1/(2H) when energy density is dominated by radiation (particles with speeds near c), electron mass/2 comes from assuming the main electron/positron annihilation to happen when 2kT = mc², and 2.5 instead of 0.658 is a number I hazardously insert to reflect how much more the total energy density of the universe may have been than that of electromagnetic radiation, mainly because of electrons, positrons and neutrinos.
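The same estimate in code (a sketch; the 2.5 factor is the author's rough guess quoted above):

    from math import pi, sqrt

    # Reproducing the footnote's estimate of the time after the Big Bang
    # when kT ~ (electron mass) c^2 / 2.
    hbar = 1.054572e-34  # J s
    G = 6.674e-11        # m^3/(kg s^2), gravitational constant
    c = 2.99792458e8     # m/s
    m_e = 9.109384e-31   # kg, electron mass
    t = sqrt(hbar**3 / ((m_e / 2)**4 * (2.5 * 8 * pi * G / 3) * c**3)) / 2
    print(t)             # -> ~13.4 seconds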