How to Calculate Entropy

Meaning of Entropy in Physics

Entropy is a measure of the randomness or disorder of a system.

Entropy is defined as the quantitative measure of disorder or randomness in a system. The concept comes out of thermodynamics, which deals with the transfer of heat energy within a system. Instead of talking about some form of "absolute entropy," physicists generally talk about the change in entropy that takes place in a specific thermodynamic process.

Calculating Entropy

In an isothermal process, the change in entropy (delta-S) is the heat transferred (Q) divided by the absolute temperature (T):

delta-S = Q/T

For any reversible thermodynamic process, the change in entropy can be written in calculus terms as the integral of dQ/T from the process's initial state to its final state.
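As a quick worked illustration of delta-S = Q/T, consider melting a block of ice at its melting point, which is an isothermal process. The mass below is a made-up example value; the latent heat of fusion of water (about 334 kJ/kg) and the melting point (273.15 K) are standard textbook figures. A short Python sketch of the calculation:

# Entropy change for melting ice at 0 degrees C (an isothermal process):
# delta-S = Q / T, where Q is the heat absorbed and T is the absolute temperature.

mass_kg = 0.5                 # mass of ice being melted (illustrative example value)
latent_heat_fusion = 334e3    # J/kg, standard textbook value for water
T_melt = 273.15               # K, melting point of ice

Q = mass_kg * latent_heat_fusion   # total heat absorbed at constant temperature
delta_S = Q / T_melt               # entropy change of the ice

print(f"Heat absorbed Q = {Q:.0f} J")
print(f"Entropy change delta-S = {delta_S:.1f} J/K")   # roughly 611 J/K

The entropy of the ice increases as it melts, because the heat flows in at a fixed temperature.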

In a more general sense, entropy is a measure of probability and the molecular disorder of a macroscopic system. In a system described by a set of variables, those variables can take on a certain number of configurations. If each configuration is equally probable, then the entropy is the natural logarithm of the number of configurations, multiplied by Boltzmann's constant:

S = kB ln W

where S is entropy, kB is Boltzmann's constant, ln is the natural logarithm, and W represents the number of possible states. Boltzmann's constant is equal to 1.380649 × 10−23 J/K.
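To make the formula concrete, here is a small Python sketch for a toy system of N two-state particles (coins or spins), where every configuration is assumed to be equally probable, so W = 2^N. The number of particles is an arbitrary example value.

import math

kB = 1.380649e-23   # Boltzmann's constant in J/K

def boltzmann_entropy(num_states):
    """Entropy S = kB * ln(W) for a system with num_states equally probable configurations."""
    return kB * math.log(num_states)

N = 100             # number of two-state particles (illustrative example value)
W = 2 ** N          # total number of equally probable microstates in this toy model
print(f"W = {W:.3e} microstates")
print(f"S = {boltzmann_entropy(W):.3e} J/K")   # equals N * kB * ln(2)

Even for a modest number of particles, W is astronomically large, yet the entropy itself remains a tiny number of joules per kelvin because Boltzmann's constant is so small.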

Units of Entropy

Entropy is considered to be an extensive property of matter that is expressed in terms of energy divided by temperature. The SI unit of entropy is the joule per kelvin (J/K).

Entropy & The Second Law of Thermodynamics

One way of stating the second law of thermodynamics is:

In any closed system, the entropy of the system will either remain constant or increase.

One way to view this is that adding heat to a system causes the molecules and atoms to speed up. It may be possible (though tricky) to reverse the process in a closed system (i.e., without drawing energy from or releasing energy to somewhere else) and return to the initial state, but you can never make the entire system "less energetic" than when it started: the energy simply has nowhere to go. For irreversible processes, the combined entropy of the system and its environment always increases.
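A standard example of this bookkeeping is heat flowing directly from a hot reservoir to a cold one. The hot reservoir loses entropy Q/T-hot, the cold reservoir gains Q/T-cold, and because T-hot is larger than T-cold the combined change is always positive. The temperatures and heat below are illustrative example values; a minimal Python sketch:

# Entropy bookkeeping for an irreversible process: heat Q flows from a hot
# reservoir to a cold one. The total entropy change is positive when T_hot > T_cold.

Q = 1000.0       # J, heat transferred (example value)
T_hot = 400.0    # K, temperature of the hot reservoir (example value)
T_cold = 300.0   # K, temperature of the cold reservoir (example value)

dS_hot = -Q / T_hot          # hot reservoir gives up heat, so its entropy decreases
dS_cold = Q / T_cold         # cold reservoir absorbs heat, so its entropy increases
dS_total = dS_hot + dS_cold  # combined entropy of system plus surroundings

print(f"delta-S (hot)   = {dS_hot:+.2f} J/K")
print(f"delta-S (cold)  = {dS_cold:+.2f} J/K")
print(f"delta-S (total) = {dS_total:+.2f} J/K")  # about +0.83 J/K, consistent with the second law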

Misconceptions About Entropy

This view of the second law of thermodynamics is very popular, and it has been misused. Some argue that the second law of thermodynamics means that a system can never become more orderly. Not true. It just means that in order to become more orderly (for entropy to decrease), the system must receive energy from somewhere outside itself, such as when a pregnant woman draws energy from food to allow a fertilized egg to develop into a baby, completely in line with the second law's provisions.

Also Known As: Disorder, Chaos, Randomness (all three imprecise synonyms)

Absolute Entropy

A related term is "absolute entropy," which is denoted by S rather than ΔS. Absolute entropy is defined according to the third law of thermodynamics, which fixes the constant of integration so that the entropy at absolute zero is defined to be zero.
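In practice, with S(0) = 0 fixed, the absolute entropy at a temperature T can be obtained by integrating the heat capacity divided by temperature, C(T)/T, from absolute zero up to T, assuming no phase changes along the way. The sketch below uses the low-temperature Debye approximation C(T) = a·T^3 with a made-up coefficient a, purely to illustrate the integration; for that model the analytic result is a·T^3/3.

# Absolute entropy from the third law: S(T) = integral of C(T')/T' dT' from 0 to T,
# with S(0) = 0. Toy model: Debye low-temperature heat capacity C(T) = a * T**3.

a = 1.0e-4   # J/K^4, hypothetical heat-capacity coefficient (illustrative only)

def heat_capacity(T):
    return a * T ** 3            # Debye T-cubed approximation (valid at low temperatures)

def absolute_entropy(T, steps=100000):
    """Numerically integrate C(T')/T' dT' from 0 to T with a simple midpoint rule."""
    dT = T / steps
    total = 0.0
    for i in range(steps):
        Tm = (i + 0.5) * dT      # midpoint temperature of this slice
        total += heat_capacity(Tm) / Tm * dT
    return total

T = 20.0  # K, example temperature
print(f"Numerical S({T} K) = {absolute_entropy(T):.4f} J/K")
print(f"Analytic  S({T} K) = {a * T**3 / 3:.4f} J/K")   # a*T^3/3 for this toy model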

Edited by Anne Marie Helmenstine, Ph.D.