Chebyshev’s inequality says that at least 1 – 1/*K*^{2} of the data from a sample must fall within *K* standard deviations of the mean, where *K* is any real number greater than one.

Any data set that is normally distributed, or in the shape of a bell curve, has several features. One of them deals with the spread of the data relative to the number of standard deviations from the mean. In a normal distribution, approximately 68% of the data lies within one standard deviation of the mean, approximately 95% lies within two standard deviations, and approximately 99.7% lies within three standard deviations.

But if the data set is not distributed in the shape of a bell curve, then a different proportion of the data could lie within one standard deviation. Chebyshev’s inequality provides a way to know what fraction of the data falls within *K* standard deviations of the mean for *any* data set.

### Facts About the Inequality

We can also state the inequality above by replacing the phrase “data from a sample” with “probability distribution.” This is because Chebyshev’s inequality is a result from probability, which can then be applied to statistics.

It is important to note that this inequality is a result that has been proven mathematically. It is not like the empirical relationship between the mean and mode, or the rule of thumb that connects the range and standard deviation.

### Illustration of the Inequality

To illustrate the inequality, we will look at it for a few values of *K*:

- For *K* = 2 we have 1 – 1/*K*^{2} = 1 - 1/4 = 3/4 = 75%. So Chebyshev’s inequality says that at least 75% of the data values of any distribution must be within two standard deviations of the mean.
- For *K* = 3 we have 1 – 1/*K*^{2} = 1 - 1/9 = 8/9 ≈ 89%. So Chebyshev’s inequality says that at least 89% of the data values of any distribution must be within three standard deviations of the mean.
- For *K* = 4 we have 1 – 1/*K*^{2} = 1 - 1/16 = 15/16 = 93.75%. So Chebyshev’s inequality says that at least 93.75% of the data values of any distribution must be within four standard deviations of the mean.
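The calculations above follow a single formula, which can be sketched as a small Python helper (the function name `chebyshev_bound` is my own, chosen for illustration):

```python
def chebyshev_bound(k):
    """Minimum fraction of data within k standard deviations of the mean."""
    if k <= 1:
        raise ValueError("k must be greater than 1")
    return 1 - 1 / k**2

# Reproduce the three cases worked out above.
for k in (2, 3, 4):
    print(f"K = {k}: at least {chebyshev_bound(k):.2%} of the data")
# K = 2: at least 75.00% of the data
# K = 3: at least 88.89% of the data
# K = 4: at least 93.75% of the data
```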

### Example

Suppose we have sampled the weights of dogs in the local animal shelter and found that our sample has a mean of 20 pounds with a standard deviation of 3 pounds. Using Chebyshev’s inequality, we know that at least 75% of the dogs that we sampled have weights within two standard deviations of the mean. Two times the standard deviation gives us 2 x 3 = 6. Subtracting this from and adding it to the mean of 20 tells us that at least 75% of the dogs have weights between 14 pounds and 26 pounds.
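The arithmetic in this example can be checked directly; the sketch below just plugs the sample mean and standard deviation into the bound:

```python
mean, sd, k = 20.0, 3.0, 2      # sample statistics from the dog-weight example
lower = mean - k * sd            # 20 - 6 = 14
upper = mean + k * sd            # 20 + 6 = 26
fraction = 1 - 1 / k**2          # Chebyshev's guaranteed minimum fraction

print(f"At least {fraction:.0%} of the weights lie between "
      f"{lower:g} and {upper:g} pounds")
# At least 75% of the weights lie between 14 and 26 pounds
```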

### Use of the Inequality

If we know more about the distribution that we’re working with, then we can usually guarantee that more of the data lies within a certain number of standard deviations of the mean. For example, if we know that we have a normal distribution, then approximately 95% of the data is within two standard deviations of the mean. Chebyshev’s inequality says that in this situation we know that *at least* 75% of the data is within two standard deviations of the mean. As we can see in this case, it could be much more than this 75%.

The value of the inequality is that it gives us a “worst-case” scenario in which the only things we know about our sample data (or probability distribution) are the mean and standard deviation. When we know nothing else about our data, Chebyshev’s inequality provides some additional insight into how spread out the data set is.
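To see that the bound holds even far from the bell-curve setting, we can check it on a deliberately skewed sample. The sketch below draws exponentially distributed values (nothing like a normal distribution) and compares the observed coverage with the Chebyshev minimum:

```python
import random
import statistics

random.seed(1)
# A heavily right-skewed sample: exponential data is far from bell-shaped.
data = [random.expovariate(1.0) for _ in range(100_000)]

mu = statistics.fmean(data)
sigma = statistics.pstdev(data)

for k in (2, 3, 4):
    observed = sum(abs(x - mu) <= k * sigma for x in data) / len(data)
    bound = 1 - 1 / k**2
    # The observed fraction must never fall below the guaranteed minimum.
    print(f"K = {k}: observed {observed:.3f}, Chebyshev bound {bound:.3f}")
    assert observed >= bound
```

For skewed data the observed coverage typically differs from the normal-distribution figures, but it always meets the Chebyshev guarantee.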

### History of the Inequality

The inequality is named after the Russian mathematician Pafnuty Chebyshev, who first stated the inequality without proof in 1874. Ten years later the inequality was proved by Markov in his Ph.D. dissertation. Because of differences in how the Russian alphabet is transliterated into English, Chebyshev’s name is also spelled Tchebysheff.