Variance and Standard Deviation: Definition and Examples

By Ashley Crossman. Updated April 22, 2019.

Variance and standard deviation are two closely related measures of variation that you will encounter often in studies, journals, and statistics classes. They are fundamental concepts in statistics that must be understood before most other statistical concepts or procedures can be. Below, we review what they are and how to find the variance and standard deviation.

Key Takeaways: Variance and Standard Deviation

- The variance and standard deviation show us how much the scores in a distribution vary from the average.
- The standard deviation is the square root of the variance.
- For small data sets, the variance can be calculated by hand, but statistical programs can be used for larger data sets.

Definition

By definition, variance and standard deviation are both measures of variation for interval-ratio variables. They describe how much variation or diversity there is in a distribution. Both the variance and standard deviation increase or decrease based on how closely the scores cluster around the mean.

Variance is defined as the average of the squared deviations from the mean. To calculate the variance, you first subtract the mean from each number and then square the results to find the squared differences. You then find the average of those squared differences. The result is the variance.

The standard deviation is a measure of how spread out the numbers in a distribution are. It indicates how much, on average, each of the values in the distribution deviates from the mean, or center, of the distribution.
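The definition above translates directly into code. The following sketch implements the population variance and standard deviation step by step using only Python's standard library; the data set is illustrative, not taken from the article.

```python
import math

def variance(values):
    """Population variance: the average of the squared deviations from the mean."""
    mean = sum(values) / len(values)
    squared_diffs = [(x - mean) ** 2 for x in values]
    return sum(squared_diffs) / len(values)

def std_dev(values):
    """Standard deviation: the square root of the variance."""
    return math.sqrt(variance(values))

scores = [2, 4, 4, 4, 5, 5, 7, 9]  # illustrative data set
print(variance(scores))  # 4.0
print(std_dev(scores))   # 2.0
```

Note the order of operations: subtract the mean, square, then average; squaring before averaging is what prevents positive and negative deviations from canceling out.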
It is calculated by taking the square root of the variance.

A Conceptual Example

The variance and standard deviation are important because they tell us things about a data set that we can't learn just by looking at the mean, or average. As an example, imagine that you have three younger siblings: one sibling who is 13, and twins who are 10. In this case, the average age of your siblings would be 11. Now imagine that you have three siblings, ages 17, 12, and 4. In this case, the average age of your siblings would still be 11, but the variance and standard deviation would be larger.

A Quantitative Example

Let's say we want to find the variance and standard deviation of age among your group of five close friends. The ages of you and your friends are 25, 26, 27, 30, and 32.

First, we must find the mean age: (25 + 26 + 27 + 30 + 32) / 5 = 28.

Then, we need to calculate each friend's difference from the mean:

25 – 28 = -3
26 – 28 = -2
27 – 28 = -1
30 – 28 = 2
32 – 28 = 4

Next, to calculate the variance, we square each difference from the mean and average the results:

Variance = ((-3)² + (-2)² + (-1)² + 2² + 4²) / 5 = (9 + 4 + 1 + 4 + 16) / 5 = 34 / 5 = 6.8

So, the variance is 6.8, and the standard deviation is the square root of the variance: √6.8 ≈ 2.61. This means that, on average, each friend's age differs from the group's mean age by about 2.61 years.

Although it's possible to calculate the variance by hand for smaller data sets such as this one, statistical software programs can also be used to calculate the variance and standard deviation.

Sample Versus Population

When conducting statistical tests, it's important to be aware of the difference between a population and a sample. To calculate the standard deviation (or variance) of a population, you would need to collect measurements for everyone in the group you're studying; for a sample, you would only collect measurements from a subset of the population.
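The worked example above can be checked with Python's standard-library `statistics` module, whose `pvariance` and `pstdev` functions compute the population variance and standard deviation used here:

```python
import statistics

ages = [25, 26, 27, 30, 32]

mean_age = statistics.mean(ages)           # 28
diffs = [age - mean_age for age in ages]   # [-3, -2, -1, 2, 4]
var = statistics.pvariance(ages)           # 6.8
sd = statistics.pstdev(ages)               # sqrt(6.8), about 2.61

print(mean_age, diffs, var, round(sd, 2))
```

The output reproduces every intermediate value from the hand calculation: the mean of 28, the five differences, the variance of 6.8, and the standard deviation of about 2.61.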
In the example above, we assumed that the group of five friends was a population. If we had treated it as a sample instead, calculating the sample standard deviation and sample variance would be slightly different: instead of dividing the sum of squared differences by the sample size to find the variance, we would first subtract one from the sample size and then divide by this smaller number.

Importance of the Variance and Standard Deviation

The variance and standard deviation are important in statistics because they serve as the basis for other types of statistical calculations. For example, the standard deviation is necessary for converting test scores into Z-scores. The variance and standard deviation also play an important role when conducting statistical tests such as t-tests.
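The population-versus-sample distinction, and one downstream use of the standard deviation, can be sketched with the same `statistics` module. The `pvariance` function divides by n (population), while `variance` divides by n − 1 (sample); the Z-score computation at the end is an illustrative example, not part of the original article's calculation.

```python
import statistics

ages = [25, 26, 27, 30, 32]

# Population: divide the sum of squared differences (34) by n = 5.
pop_var = statistics.pvariance(ages)   # 6.8
# Sample: divide by n - 1 = 4 instead, giving a slightly larger value.
samp_var = statistics.variance(ages)   # 8.5
print(pop_var, samp_var)

# One downstream use: converting a raw score into a Z-score, i.e. how many
# standard deviations a value lies from the mean of its distribution.
z = (32 - statistics.mean(ages)) / statistics.pstdev(ages)
print(round(z, 2))
```

Dividing by n − 1 (Bessel's correction) compensates for the fact that a sample's deviations are measured from the sample mean rather than the true population mean, which would otherwise bias the variance estimate downward.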