What is Entropy?

entropy: 1) a measure of the degree of disorder in a substance or a system (entropy always increases and available energy diminishes in a closed system); 2) a thermodynamic measure of the amount of energy unavailable for useful work in a system undergoing change. See the Second Law of Thermodynamics.

The thermodynamic notion of entropy was introduced in 1854 by Rudolf Clausius, who built on the work of Carnot. His ideas were later extended and clarified by Helmholtz and others. In the 1870s, Ludwig Boltzmann found a "statistical" definition of entropy which, he claimed, reduced to the earlier notion of Clausius. Around the same time, Josiah Willard Gibbs introduced a slightly different statistical notion of entropy.

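For orientation, here are these definitions in the notation usually used today (a brief sketch added for reference; the formulas below are standard but are not part of the original text). Clausius defined entropy through its change in a reversible process, Boltzmann through the number W of microstates compatible with a given macrostate, and Gibbs through a probability distribution p_i over microstates:

    dS = \delta Q_{\mathrm{rev}} / T
    S_{\mathrm{Boltzmann}} = k_B \ln W
    S_{\mathrm{Gibbs}} = -k_B \sum_i p_i \ln p_i

When all W accessible microstates are equally probable (p_i = 1/W), the Gibbs formula reduces to Boltzmann's.
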
Around 1930, John von Neumann (also a pioneer of the programmable computer) introduced what is now called the von Neumann entropy, the quantum-mechanical analogue of the classical numerical entropy, defined in terms of the density operator. These days, various physicists are attempting to generalize von Neumann's entropy in the context of noncommutative C*-algebras (the mathematical setting for the rigorous theory of statistical mechanics introduced by Ruelle and others).
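
In the same modern notation (again a sketch, not part of the original text), von Neumann's entropy assigns a number to a quantum state described by a density operator ρ:

    S(\rho) = -k_B \, \mathrm{Tr}(\rho \ln \rho)

(often written with k_B set to 1). When ρ is diagonal with eigenvalues p_i, this reduces to the Gibbs expression above.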

(The preceding overview is by Chris Hillman.)


A quick search on Google will turn up many other sites discussing entropy and the Second Law of Thermodynamics.



Paul Hopkins.
Copyright © 2022 Paul Hopkins. All rights reserved.
Revised: 04/22/22.