Entropy Definition in Science

Chemistry and Physics Glossary Definition of Entropy


Entropy is an important concept in physics and chemistry, plus it can be applied to other disciplines, including cosmology and economics. In physics, it is part of thermodynamics. In chemistry, it is a core concept in physical chemistry.

Key Takeaways: Entropy

  • Entropy is a measure of the randomness or disorder of a system.
  • The value of entropy depends on the amount of matter in a system. It is denoted by the letter S and has units of joules per kelvin.
  • Entropy can have a positive or negative value. According to the second law of thermodynamics, the entropy of a system can only decrease if the entropy of another system increases.

Entropy Definition

Entropy is the measure of the disorder of a system. It is an extensive property of a thermodynamic system, which means its value changes depending on the amount of matter that is present. In equations, entropy is usually denoted by the letter S and has units of joules per kelvin (J·K⁻¹, or equivalently kg·m²·s⁻²·K⁻¹). A highly ordered system has low entropy.

Entropy Equation and Calculation

There are multiple ways to calculate entropy, but the two most common equations are for reversible thermodynamic processes and isothermal (constant temperature) processes.

Entropy of a Reversible Process

Certain assumptions are made when calculating the entropy of a reversible process. Probably the most important assumption is that each microstate within the process is equally probable (which it may not actually be). Given equal probability of microstates, entropy equals Boltzmann's constant (kB) multiplied by the natural logarithm of the number of possible microstates (W):

S = kB ln W

Boltzmann's constant is 1.38065 × 10⁻²³ J/K.
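
As a minimal sketch, Boltzmann's formula can be evaluated directly. The function name and example values below are illustrative, not from the source:

```python
import math

K_B = 1.380649e-23  # Boltzmann's constant, J/K

def boltzmann_entropy(num_microstates):
    """S = kB * ln(W), assuming all W microstates are equally probable."""
    return K_B * math.log(num_microstates)

# A perfectly ordered system (only one possible microstate) has zero entropy
print(boltzmann_entropy(1))  # 0.0
```

Note that because ln(1) = 0, a system with a single possible microstate has exactly zero entropy, which is the idea behind the third law of thermodynamics for a perfect crystal at absolute zero.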

Entropy of an Isothermal Process

Calculus may be used to evaluate the integral of dQ/T from the initial state to the final state, where Q is the heat transferred and T is the absolute (Kelvin) temperature of the system.

Another way to state this is that when heat Q is transferred at a constant absolute temperature T, the change in entropy (ΔS) equals the heat transferred divided by that temperature:

ΔS = Q / T
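
The isothermal relation can be sketched as a small helper function; the function name and numbers are illustrative, not from the source:

```python
def isothermal_entropy_change(heat_j, temp_k):
    """Entropy change when heat (in joules) is transferred reversibly
    at a constant absolute temperature (in kelvin)."""
    if temp_k <= 0:
        raise ValueError("absolute temperature must be positive")
    return heat_j / temp_k

# 1500 J of heat absorbed at 300 K raises entropy by 5 J/K
print(isothermal_entropy_change(1500.0, 300.0))  # 5.0
```

Heat absorbed (positive Q) increases entropy; heat released (negative Q) decreases it, which matches the sign convention in the equation above.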

Entropy and Internal Energy

In physical chemistry and thermodynamics, one of the most useful equations relates entropy to the internal energy (U) of a system:

dU = T dS - p dV

Here, the change in internal energy (dU) equals the absolute temperature (T) multiplied by the change in entropy (dS), minus the pressure (p) multiplied by the change in volume (dV).
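
For small (approximately differential) changes, the relation can be applied numerically. This is a sketch with illustrative values, not a calculation from the source:

```python
def internal_energy_change(temp_k, d_entropy, pressure_pa, d_volume):
    """dU = T*dS - p*dV for a small change in a simple thermodynamic system."""
    return temp_k * d_entropy - pressure_pa * d_volume

# Hypothetical gas at 300 K and 1 atm (101325 Pa): entropy rises by
# 0.01 J/K while the volume expands by 1.0e-4 m^3
du = internal_energy_change(300.0, 0.01, 101325.0, 1.0e-4)
print(du)
```

In this example the T dS term adds 3 J while the p dV term removes about 10.1 J, so the internal energy falls: the expanding gas does more work on its surroundings than it gains from the entropy increase.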

Entropy and the Second Law of Thermodynamics

The second law of thermodynamics states that the total entropy of a closed system cannot decrease. However, the entropy of one part of a system can decrease, provided the entropy of another part increases by at least as much.

Entropy and Heat Death of the Universe

Some scientists predict the entropy of the universe will increase to the point where the randomness creates a system incapable of useful work. When only thermal energy remains, the universe would be said to have suffered heat death.

However, other scientists dispute the theory of heat death. Some say the universe as a system moves further away from maximum entropy even as regions within it increase in entropy. Others consider the universe as part of a larger system. Still others say the possible states are not equally likely, so the ordinary equations for calculating entropy do not hold.

Example of Entropy

A block of ice will increase in entropy as it melts. It's easy to visualize the increase in the disorder of the system. Ice consists of water molecules bonded to each other in a crystal lattice. As ice melts, molecules gain more energy, spread further apart, and lose structure to form a liquid. Similarly, the phase change from a liquid to a gas, as from water to steam, increases the entropy of the system.
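
The entropy gained on melting can be estimated as ΔS = ΔH_fus / T, the enthalpy of fusion divided by the melting temperature. The sketch below assumes the commonly tabulated values ΔH_fus ≈ 6010 J/mol and T = 273.15 K for ice:

```python
H_FUSION = 6010.0   # molar enthalpy of fusion of ice, J/mol (approximate)
T_MELT = 273.15     # melting point of ice at 1 atm, in kelvin

# Molar entropy of fusion: entropy gained per mole of ice melted
delta_s_fusion = H_FUSION / T_MELT
print(round(delta_s_fusion, 1))  # 22.0 (J per mol per K)
```

Because melting happens at a single constant temperature, this is just the isothermal relation ΔS = Q / T applied per mole of ice.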

On the flip side, entropy can decrease. This occurs as steam changes phase into water or as water changes to ice. The second law of thermodynamics is not violated because the matter is not in a closed system. While the entropy of the system being studied may decrease, that of the environment increases.

Entropy and Time

Entropy is often called the arrow of time because matter in isolated systems tends to move from order to disorder.

Sources

  • Atkins, Peter; Julio De Paula (2006). Physical Chemistry (8th ed.). Oxford University Press. ISBN 978-0-19-870072-2.
  • Chang, Raymond (1998). Chemistry (6th ed.). New York: McGraw Hill. ISBN 978-0-07-115221-1.
  • Clausius, Rudolf (1850). On the Motive Power of Heat, and on the Laws which can be deduced from it for the Theory of Heat. Poggendorff's Annalen der Physik, LXXIX (Dover Reprint). ISBN 978-0-486-59065-3.
  • Landsberg, P.T. (1984). "Can Entropy and "Order" Increase Together?". Physics Letters. 102A (4): 171–173. doi:10.1016/0375-9601(84)90934-4
  • Watson, J.R.; Carson, E.M. (May 2002). "Undergraduate students' understandings of entropy and Gibbs free energy." University Chemistry Education. 6 (1): 4. ISSN 1369-5614