Temperature Definition in Science

We think of temperature as a measure of hot or cold, but what we're really talking about is thermal energy.

Temperature Definition

Temperature is an objective measurement of how hot or cold an object is, and it can be measured with a thermometer. (A calorimeter, by contrast, measures heat, not temperature.) Temperature is one way of characterizing the internal energy contained within a system.

Because humans readily perceive heat and cold, temperature is a feature of reality that we grasp fairly intuitively.

Indeed, temperature is a crucial concept across a wide variety of scientific disciplines. Many of us first encounter a thermometer in a medical context, when a doctor (or a parent) takes our temperature as part of diagnosing an illness.

Heat versus Temperature

Note that temperature is different from heat, though the two concepts are linked. Temperature is a measure of the internal energy of a system, while heat is a measure of how energy is transferred from one system (or body) to another. Kinetic theory describes this relationship, at least for gases and liquids: the more heat a material absorbs, the more rapidly the atoms within it move, and thus the greater the rise in temperature. Things get a little more complicated for solids, of course, but that's the basic idea.
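The kinetic-theory link between temperature and molecular motion can be made concrete with a short sketch. For an ideal gas, the average translational kinetic energy per molecule is (3/2)·k_B·T, where k_B is the Boltzmann constant; the function name below is illustrative, not from any particular library.

```python
# Kinetic theory: for an ideal gas, the average translational kinetic
# energy per molecule is (3/2) * k_B * T.
K_B = 1.380649e-23  # Boltzmann constant, in joules per kelvin (exact SI value)

def avg_kinetic_energy(temperature_k):
    """Average translational kinetic energy (J) per ideal-gas molecule at T kelvins."""
    return 1.5 * K_B * temperature_k

# Around room temperature (~293 K) this works out to roughly 6e-21 J
# per molecule; doubling the absolute temperature doubles the energy.
print(avg_kinetic_energy(293))
```

This is why the Kelvin scale is natural in physics: the energy is directly proportional to absolute temperature, which would not hold if T were measured in Celsius or Fahrenheit.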

Temperature Scales

Several temperature scales exist.

In America, the Fahrenheit scale is most commonly used, while the Celsius scale (formerly called centigrade), used in the SI system, is standard in most of the rest of the world. The Kelvin scale is used often in physics; it is shifted so that 0 K (kelvins, not degrees) is absolute zero, in theory the coldest possible temperature, at which all kinetic motion ceases.

Measuring Temperature

A traditional thermometer measures temperature by containing a fluid that expands as it gets hotter and contracts as it gets cooler. As the temperature changes, the liquid within a contained tube moves along a scale on the device.

As with much of modern science, the ideas about how to measure temperature trace back to the ancients. Specifically, in the first century CE the philosopher Hero of Alexandria wrote in Pneumatics about the relationship between temperature and the expansion of air. This book was published in Europe in 1575, inspiring the creation of the earliest thermometers over the following century.

Galileo was one of the first scientists recorded to have actually used such a device, though it's unclear whether he actually built it himself or acquired the idea from someone else. He used a device, called a thermoscope, to measure the amount of heat and cold, at least as early as 1603.

Throughout the 1600s, various scientists tried to create thermometers that measured temperature by a change of pressure within a contained measurement device. Robert Fludd built a thermoscope in 1638 that had a temperature scale built into the physical structure of the device, resulting in the first thermometer.

Without any centralized system of measurement, each of these scientists developed their own measurement scales, and none of them really caught on until Daniel Gabriel Fahrenheit built his in the early 1700s. He built a thermometer with alcohol in 1709, but it was really his mercury-based thermometer of 1714 that became the gold standard of temperature measurement.

Edited by Anne Marie Helmenstine, Ph.D.