Links on entropy

This page was created from the archive.org copy of www.engineeringdegree.net/resources/entropy/ with the same introduction text and updated links.

The term “entropy” was coined by the German physicist Rudolf Clausius to describe the measure of a system’s energy that is unavailable for performing useful work. Some energy is degraded into this unusable form in virtually all real processes, since perfect efficiency is unattainable. Entropy answers many questions important to physics, such as why processes tend to proceed spontaneously in one direction but not the other. In statistical mechanics, the idea of entropy is applied to probabilistic models of the distribution of particles in a given system.
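The statistical-mechanical reading mentioned above is usually summarized by Boltzmann’s formula, a standard result included here only for illustration, which ties entropy to the number of microscopic arrangements compatible with a system’s macroscopic state:

```latex
% Boltzmann entropy: k_B is Boltzmann's constant and W counts the
% microstates consistent with the observed macrostate.
S = k_B \ln W
```

The more microstates a macrostate admits, the higher its entropy, which is why equilibrium (the macrostate admitting the most microstates) is overwhelmingly the most probable condition.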
Almost a century later, Claude Shannon formulated a version of entropy to serve information theory. His work on coding and data compression defines entropy slightly differently from that of his forerunners: for information science, entropy measures the uncertainty in a series of random variables (for example, how difficult it is to predict the next character in a message). Shannon carried many concepts of thermodynamic entropy over into his informational reappraisal, including efficiency, additive properties, and the assertion that entropy only increases through system state transformations. In fact, the formulas for determining thermodynamic and informational entropy are almost identical.
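To see how close the two formulas are, compare Gibbs’s statistical form of thermodynamic entropy with Shannon’s information entropy (both are standard expressions, not taken from the original page):

```latex
% Gibbs entropy over microstate probabilities p_i (joules per kelvin):
S = -k_B \sum_i p_i \ln p_i
% Shannon entropy of a random variable X with outcome probabilities p(x),
% measured in bits:
H(X) = -\sum_x p(x) \log_2 p(x)
```

Setting k_B = 1 and changing the base of the logarithm turns one expression into the other, which is the sense in which the two formulas are almost identical.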
Shannon’s work on probabilistic entropy inspired mathematicians to take up the concept as well. In mathematics, entropy can refer to several distinct measures of the complexity of a system, such as the topological and measure-theoretic entropies of dynamical systems; their definitions involve technical machinery, but they generally resemble the earlier formulas for entropy. With the flurry of research surrounding entropy, researchers in the social sciences began borrowing the term to describe certain tendencies in their own fields. Entropy has been invoked to describe urban decay, voting behavior, and even political campaigns. The information sciences may have still more to offer the humanities in linguistics and literature (see the sketch below). In the social sciences, however, the term is not well defined and often describes entirely different phenomena depending on the researcher and the topic studied; for this reason, most scientists view these looser applications of entropy with skepticism.
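As a small illustration of the linguistic application, the following minimal Python sketch (an added example, not part of the original resource list) estimates the character-level Shannon entropy of a text sample; the sample sentence is arbitrary:

```python
from collections import Counter
from math import log2

def char_entropy(text: str) -> float:
    """Estimate Shannon entropy in bits per character, treating each
    character as an independent draw from the empirical distribution."""
    counts = Counter(text)
    total = len(text)
    return -sum((n / total) * log2(n / total) for n in counts.values())

# Under this crude single-character model, English prose typically comes
# out near 4 bits per character; modeling context (the preceding
# characters) would drive the estimate lower.
print(char_entropy("the quick brown fox jumps over the lazy dog"))
```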
Depending on the application of the term, disorder can mean many things, but in general it denotes a system approaching a state of equilibrium. Roughly speaking, order is maintained by external forces doing work on a system, while disorder reflects the tendency of systems left to themselves to drift toward equilibrium. Research into entropy has driven efficiency gains across business sectors and has opened up the study of systems too complex to be broached before. It is an exciting area of study, yet it also raises unsettling questions, such as the possibility of heat death, a state of maximum entropy in which no energy remains available to perform work. Despite these fears, the research conducted into the nature and applications of entropy has helped us understand our world and inspired discussions in science, art, religion, and biology. To help start your own exploration of entropy, a compilation of related resources is listed below.

[PS] = files in PostScript format -- you will need the appropriate program to read them.

Informational Entropy

Systems

Mathematics and Algorithms

Thermodynamic Entropy

Biological Entropy

Cultural

Research

More links