Molar Heat Capacity Definition and Examples

What Is Molar Heat Capacity in Chemistry?

[Image: a test tube being heated with a flame. Caption: Molar heat capacity is the amount of heat required to raise the temperature of one mole of a substance by one degree (Celsius or kelvin). Credit: WLADIMIR BULGAR/Getty Images]

Molar Heat Capacity Definition

Molar specific heat capacity is the amount of heat energy required to raise the temperature of 1 mole of a substance by one degree.

In SI units, molar heat capacity (symbol: cn) is the amount of heat in joules required to raise the temperature of 1 mole of a substance by 1 kelvin.

cn = Q/(nΔT)

where Q is heat, n is the number of moles, and ΔT is the change in temperature. Molar heat capacity is an intensive property, meaning it is a characteristic of a specific substance rather than of how much of it is present.
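As a minimal sketch of the definition, the calculation can be written as a small Python function (the function name and sample numbers are illustrative, not from the article):

```python
def molar_heat_capacity(q_joules, moles, delta_t_kelvin):
    """Return cn = Q / (n * ΔT) in J/(mol·K)."""
    return q_joules / (moles * delta_t_kelvin)

# Example: 150.64 J of heat warms 1.0 mol of water by 2.0 K
c_n = molar_heat_capacity(150.64, 1.0, 2.0)
print(round(c_n, 2))  # 75.32 J/(mol·K), water's molar heat capacity
```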

Heat capacity is measured using a calorimeter. A bomb calorimeter is used for measurements at constant volume, while a coffee cup calorimeter is appropriate for finding heat capacity at constant pressure.

Units of Molar Heat Capacity

Molar heat capacity is expressed in units of J/(K·mol) or J/mol·K, where J is joules, K is kelvin, and mol is the number of moles. The value assumes no phase change occurs. Calculations often also involve the molar mass, which in SI units is expressed in kg/mol (though chemists more commonly use g/mol). A less common unit of heat is the kilogram-calorie (Cal) or its cgs variant, the gram-calorie (cal). It is also possible to express heat capacity in terms of pound-mass, using temperatures in degrees Rankine or Fahrenheit.
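Converting between joule-based and calorie-based units uses the factor 1 cal = 4.184 J. A short sketch, using water's molar heat capacity as the example value:

```python
CAL_TO_J = 4.184  # 1 thermochemical-style calorie in joules

# Water's molar heat capacity, converted from J/(mol·K) to cal/(mol·K)
c_water_joules = 75.32                     # J/(mol·K)
c_water_cal = c_water_joules / CAL_TO_J    # cal/(mol·K)
print(round(c_water_cal, 1))  # 18.0 cal/(mol·K)
```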

Molar Heat Capacity Examples

Water has a molar specific heat capacity of 75.32 J/mol·K. Copper has a molar specific heat capacity of 24.78 J/mol·K.
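Rearranging the definition gives Q = n × cn × ΔT, which lets you find the heat needed for a given temperature change. A brief sketch using the water value above (the moles and temperature change are made-up example inputs):

```python
c_water = 75.32   # J/(mol·K), molar heat capacity of water
n = 2.0           # mol of water (example value)
delta_t = 10.0    # K of temperature rise (example value)

q = n * c_water * delta_t  # heat required, Q = n * cn * ΔT
print(round(q, 1))  # 1506.4 J
```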

Molar Heat Capacity Versus Specific Heat Capacity

While molar heat capacity is the heat capacity per mole, the related term specific heat capacity is the heat capacity per unit mass; the two are related through the molar mass (molar heat capacity = specific heat capacity × molar mass).

Specific heat capacity is also known simply as specific heat. Sometimes engineering calculations apply volumetric heat capacity, rather than specific heat based on mass.
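The mass-to-mole relationship can be checked numerically: multiplying a specific heat in J/(g·K) by the molar mass in g/mol gives the molar heat capacity in J/(mol·K). A sketch using commonly tabulated values for water:

```python
c_specific = 4.184   # J/(g·K), specific heat of liquid water
molar_mass = 18.015  # g/mol, molar mass of water

c_molar = c_specific * molar_mass  # J/(mol·K)
print(round(c_molar, 1))  # 75.4, close to the 75.32 J/mol·K cited above
```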