One way to calculate the mean and variance of a probability distribution is to find the expected values of the random variables *X* and *X*^{2}. We use the notation *E*(*X*) and *E*(*X*^{2}) to denote these expected values. In general, it is difficult to calculate *E*(*X*) and *E*(*X*^{2}) directly. To get around this difficulty, we use some more advanced mathematical theory and calculus. The end result is something that makes our calculations easier.

The strategy is to define a new function, of a new variable *t*, called the moment generating function. This function allows us to calculate moments by simply taking derivatives.

### The Assumptions

Before we define the moment generating function, we begin by setting the stage with notation and definitions. We let *X* be a discrete random variable. This random variable has probability mass function *f*( *x*). The sample space that we are working with will be denoted by *S*.

Rather than calculating the expected value of *X*, we want to calculate the expected value of an exponential function related to *X*. If there is a positive real number *r* such that *E*(*e*^{tX}) exists and is finite for all *t* in the interval [-*r*, *r*], then we can define the moment generating function of *X*.

### Definition of the Moment Generating Function

The moment generating function is the expected value of the exponential function above. In other words, we say that the moment generating function of *X* is given by:

*M*(*t*) = *E*(*e*^{tX})

This expected value is given by the formula Σ *e*^{tx} *f*(*x*), where the summation is taken over all *x* in the sample space *S*. This can be a finite or an infinite sum, depending upon the sample space being used.
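To make the summation concrete, the sketch below evaluates *M*(*t*) directly for a hypothetical example, a fair six-sided die with *f*(*x*) = 1/6 for *x* = 1, ..., 6 (the die and its pmf are assumptions chosen for illustration, not part of the discussion above):

```python
import math

# Hypothetical example: probability mass function of a fair six-sided die,
# f(x) = 1/6 for x in the sample space {1, 2, 3, 4, 5, 6}.
pmf = {x: 1 / 6 for x in range(1, 7)}

def mgf(t):
    """M(t) = sum of e^(t*x) * f(x) over all x in the sample space."""
    return sum(math.exp(t * x) * f for x, f in pmf.items())

# At t = 0 every term is e^0 * f(x) = f(x), so M(0) is the total
# probability, which should be 1 (up to floating-point error).
print(mgf(0.0))
```

Because the sample space here is finite, the sum is finite; for a distribution with a countably infinite sample space, the same formula becomes an infinite series that must converge for the chosen *t*.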

### Properties of the Moment Generating Function

The moment generating function has many features that connect to other topics in probability and mathematical statistics. Some of its most important features include:

- The coefficient of *e*^{tb} is the probability that *X* = *b*.
- Moment generating functions possess a uniqueness property. If the moment generating functions for two random variables match one another, then the probability mass functions must be the same. In other words, the random variables describe the same probability distribution.
- Moment generating functions can be used to calculate moments of *X*.

### Calculating Moments

The last item in the list above explains the name of moment generating functions and also their usefulness. Some advanced mathematics says that under the conditions that we laid out, the derivative of any order of the function *M*(*t*) exists at *t* = 0. Furthermore, in this case, we can change the order of summation and differentiation with respect to *t* to obtain the following formulas (all summations are over the values of *x* in the sample space *S*):

*M*’(*t*) = Σ *x* *e*^{tx} *f*(*x*)

*M*’’(*t*) = Σ *x*^{2} *e*^{tx} *f*(*x*)

*M*’’’(*t*) = Σ *x*^{3} *e*^{tx} *f*(*x*)

*M*^{(n)}(*t*) = Σ *x*^{n} *e*^{tx} *f*(*x*)

If we set *t* = 0 in the above formulas, then the *e*^{tx} term becomes *e*^{0} = 1. Thus we obtain formulas for the moments of the random variable *X*:

*M*’(0) = *E*(*X*)

*M*’’(0) = *E*(*X*^{2})

*M*’’’(0) = *E*(*X*^{3})

*M*^{(n)}(0) = *E*(*X*^{n})

This means that if the moment generating function exists for a particular random variable, then we can find its mean and its variance in terms of derivatives of the moment generating function. The mean is *M*’(0), and the variance is *M*’’(0) – [ *M*’(0)]^{2}.
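As a sketch of this idea (reusing the hypothetical fair-die example, and approximating the derivatives numerically with central differences rather than differentiating by hand), the mean and variance can be recovered from the moment generating function alone:

```python
import math

# Hypothetical example: fair six-sided die, f(x) = 1/6 for x = 1..6.
pmf = {x: 1 / 6 for x in range(1, 7)}

def mgf(t):
    """M(t) = sum of e^(t*x) * f(x) over the sample space."""
    return sum(math.exp(t * x) * f for x, f in pmf.items())

h = 1e-4  # step size for the finite-difference approximations

# M'(0) ~ (M(h) - M(-h)) / (2h), which equals E(X), the mean.
mean = (mgf(h) - mgf(-h)) / (2 * h)

# M''(0) ~ (M(h) - 2 M(0) + M(-h)) / h^2, which equals E(X^2).
second_moment = (mgf(h) - 2 * mgf(0.0) + mgf(-h)) / h**2

# Variance = M''(0) - [M'(0)]^2.
variance = second_moment - mean**2

print(mean)      # close to 3.5, the mean of a fair die
print(variance)  # close to 35/12, the variance of a fair die
```

The numbers agree with the familiar values for a fair die, *E*(*X*) = 3.5 and Var(*X*) = 35/12, up to the small error introduced by the finite-difference approximation.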

### Summary

In summary, we had to wade into some fairly high-powered mathematics (some of which was glossed over). Although the approach above requires calculus, in the end our mathematical work is typically easier than calculating the moments directly from the definition.