Links on entropy
This page was created from the archive.org copy of www.engineeringdegree.net/resources/entropy/
with the same introduction text and updated links.
The term “entropy” was coined by the German physicist Rudolf Clausius to describe the
measure of energy in an isolated system that is unavailable to perform work (in other words,
wasted energy). Energy loss occurs in virtually all systems, as complete efficiency is practically
impossible to achieve. Entropy answers many questions important to physics, such as why processes
tend to progress spontaneously in one direction but not the other. The idea of entropy is also applied
to statistical models in physics to determine the distribution of particles in a given system.
Almost a century later, Claude Shannon developed a version of entropy to serve
information theory. His work on coding and compression defines entropy slightly
differently than his forerunners did. Entropy, for information science, measures how difficult
it is to predict the next character in a sequence of random variables. Shannon carried many of
the concepts of thermodynamic entropy over into his informational reappraisal, including
efficiency, additive properties, and the assertion that entropy only increases through system
state transformations. In fact, the formulas for thermodynamic and informational entropy
are almost identical.
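To make that similarity concrete (a minimal sketch, not drawn from any of the resources listed below): Shannon’s entropy of a source with symbol probabilities p_i is H = −Σ p_i log2 p_i, while the Gibbs form of thermodynamic entropy is S = −k_B Σ p_i ln p_i, so the two differ essentially only by Boltzmann’s constant and the base of the logarithm. The short Python sketch below (the function name shannon_entropy is ours, purely for illustration) estimates the Shannon entropy of a text, in bits per character, from its character frequencies.

```python
# Minimal illustration (not from the linked resources): empirical Shannon
# entropy of a text, H = -sum(p_i * log2(p_i)), in bits per character.
from collections import Counter
from math import log2

def shannon_entropy(text: str) -> float:
    counts = Counter(text)   # frequency of each character
    total = len(text)
    # p_i = n / total for each character; each term contributes -p_i * log2(p_i)
    return sum(-(n / total) * log2(n / total) for n in counts.values())

# A perfectly predictable text has zero entropy; more varied text has more.
print(shannon_entropy("aaaaaaaa"))   # 0.0  (one symbol, fully predictable)
print(shannon_entropy("abababab"))   # 1.0  (two equally likely symbols)
print(shannon_entropy("abcdefgh"))   # 3.0  (eight equally likely symbols)
```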
Shannon’s work on probabilistic entropy inspired mathematicians to take up the cause as well.
In mathematics, entropy can refer to a few different measurements of the complexity of a system. The
specifics of mathematical entropy are caught up in complex formulas and concepts, but they
generally resemble the earlier formulas describing entropy. With the flurry of research
surrounding entropy, researchers in the social sciences began using the term to
describe certain tendencies in their fields. Entropy has been used to describe urban
decay, voting tendencies, and even political campaigns; the informational sciences may have
more to offer the humanities in the fields of linguistics and literature. The use of the term entropy
in the social sciences, however, is not well defined and often describes completely different
phenomena depending on the researcher and the topic studied. For this reason, most scientists
view these less structured applications of entropy with skepticism.
Depending on the application of the term, disorder can mean many things, but in general
it denotes approaching a state of equilibrium. For entropy, order is a function of external
forces acting on a system, and disorder is the tendency of systems to resist those
changes. Research into entropy has helped us achieve previously unseen productivity in business
sectors and has promoted research into systems too complex to be broached before.
It is an exciting area of study, yet at the same time it raises unsettling questions, such as
the possibility of heat death, a state in which matter can no longer perform work.
Despite these fears, research into the nature and applications of entropy
has helped us understand our world and inspired discussions in science, art, religion,
and biology. To help start your own exploration of entropy, a compilation of related
resources is listed below.
[PS] = files in PostScript format -- you will need the appropriate program to read them.
Informational Entropy
- Entropy in information theory – a primer to the topic by Sylvain Poirier, in the present site
- Information Theory Primer – by Tom Schneider, with an Appendix on Logarithms
- Typical Sequences and All That: Entropy, Pattern Matching and Data Compression – Aaron D. Wyner’s lecture on entropic systems in computing and information theory. Wyner’s work sets out mathematical formulas for addressing common informational problems and classifying their entropy.
- A Mini-Introduction To Information Theory – a short introduction to classical and quantum information theory, by Edward Witten
- Achieving the Shannon Limit: A Progress Report – A report on research into maximizing the use of channel capacity and the effects of entropy on doing so, from Robert McEliece’s lecture at the Thirty-Eighth Allerton Conference of 2005.
Systems
- Symbolic Dynamics and Coding Applications – An investigation of binary systems and their tendency toward entropic states. Uses principles primarily found in coding to illustrate the uncertainty of outcomes in dynamic systems.
- An Entropy Primer – Introduces concepts related to system dynamics and integrates these with information entropy theory. The text offers an explanation of Claude Shannon’s Noiseless Coding and Equipartition Theorems.
- All Entropies Agree for an SFT – A research paper aimed at developing relationships between different conceptions of entropy, showing that they agree for a shift of finite type (SFT). This kind of relational approach is especially important for efforts to unify the various definitions of entropy.
Mathematics and Algorithms
- Lecture Notes on Descriptional Complexity and Randomness – Provides algorithmic examples of entropy with real-world applications. The notes stress the importance of entropy to understanding probability and complex algorithmic equations.
- Randomness and Mathematical Proof – Discusses conceptions of randomness as examples of entropic systems. This paper provides many examples to help the beginning reader understand complicated concepts.
- Jonathan Borwein – A mathematics professor at the University of Newcastle. His papers and books contain great information on mathematical entropy and information theory.
- Randomness in Arithmetic – An explanation of how concepts which we think of as stable, such as arithmetic operations, actually exhibit a high degree of randomness. The paper gives interesting examples of the usefulness of this theory to problems in biology and beyond.
- [PS] The Discovery of Algorithmic Probability – Describes the background and development of statistical representation in algorithms.
- Prediction and Information Theory – Describes the importance of prediction in constructing algorithmic equations. This paper goes into detail on many different types of prediction and the problems inherent to each.
- [PS] Probability Theory: The Logic of Science – An expository book on the nature of statistical probability and its use in deducing possible outcomes (a PostScript file for each chapter).
Thermodynamic Entropy
- The Page of Entropy – Describes the effects of entropy on physical systems and provides common examples and models of entropic processes (very simple and accessible site for anyone unfamiliar with entropy).
- Entropy in statistical physics – logical exposition of the nature of entropy in physics and the resulting laws of thermodynamics, in the present site
- A Brief Maxent Tutorial – A description of the importance of maximum entropy to statistical modeling and probability. The paper focuses on the application of entropic models to natural language processing but maintains their validity in a variety of situations.
- The Black Hole Information Loss Problem – A discussion of Hawking radiation and how a black hole transforms physical objects described by many variables into a mere change in temperature. This finding presents problems for physics since it posits that information, the description of the physical state of the object, is lost through the process.
Biological Entropy
Cultural
- Wikipedia: Social Entropy – A basic outline of how entropy relates to sociology and cultural structures. Since entropy’s application to social structures is a relatively new development, this page gives some good ideas on where it has the most to offer.
- A Test for Conformity in Voting Behavior by Stephen Coleman – An interesting paper designed to test whether voting tendencies subscribe to a probabilistic system. The document does a great job of explaining informational entropy and separating it from physical descriptions of uncertainty.
Research
More links