Use of the Moment Generating Function for the Binomial Distribution

By Courtney Taylor, Ph.D., Professor of Mathematics, Anderson University. Updated January 05, 2019

[Figure: A histogram of a binomial distribution. C.K. Taylor]

The mean and the variance of a random variable X with a binomial probability distribution can be difficult to calculate directly. Although it can be clear what needs to be done in using the definition of the expected value of X and X^2, the actual execution of these steps is a tricky juggling of algebra and summations. An alternate way to determine the mean and variance of a binomial distribution is to use the moment generating function for X.

Binomial Random Variable

Start with the random variable X and describe the probability distribution more specifically. Perform n independent Bernoulli trials, each of which has probability of success p and probability of failure 1 – p. Thus the probability mass function is

f(x) = C(n, x) p^x (1 – p)^(n – x)

Here the term C(n, x) denotes the number of combinations of n elements taken x at a time, and x can take the values 0, 1, 2, 3, . . ., n.

Moment Generating Function

Use this probability mass function to obtain the moment generating function of X:

M(t) = Σ_{x = 0}^{n} e^{tx} C(n, x) p^x (1 – p)^(n – x).

It becomes clear that you can combine the terms with exponent of x:

M(t) = Σ_{x = 0}^{n} (pe^t)^x C(n, x) (1 – p)^(n – x).
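The combining step above can be checked numerically: summing e^{tx} against the binomial probability mass function gives the same value as the closed form derived next from the binomial theorem. Here is a minimal sketch with illustrative values of n, p, and t (these specific numbers are chosen for the example, not taken from the article):

```python
from math import comb, exp

# Illustrative parameters (not from the article): 10 trials,
# success probability 0.3, evaluate the mgf at t = 0.5.
n, p, t = 10, 0.3, 0.5

# M(t) computed term by term from the pmf:
# sum over x of e^{tx} * C(n, x) * p^x * (1 - p)^(n - x)
mgf_sum = sum(exp(t * x) * comb(n, x) * p**x * (1 - p)**(n - x)
              for x in range(n + 1))

# Closed form obtained via the binomial theorem: [(1 - p) + p e^t]^n
mgf_closed = ((1 - p) + p * exp(t))**n

print(mgf_sum, mgf_closed)  # the two values agree up to rounding
```

Trying other values of n, p, and t gives the same agreement, since the two expressions are algebraically identical.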
Furthermore, by use of the binomial formula, the above expression is simply:

M(t) = [(1 – p) + pe^t]^n.

Calculation of the Mean

In order to find the mean and variance, you'll need to know both M’(0) and M’’(0). Begin by calculating your derivatives, and then evaluate each of them at t = 0. You will see that the first derivative of the moment generating function is:

M’(t) = n(pe^t)[(1 – p) + pe^t]^(n – 1).

From this, you can calculate the mean of the probability distribution:

M’(0) = n(pe^0)[(1 – p) + pe^0]^(n – 1) = np.

This matches the expression that we obtained directly from the definition of the mean.

Calculation of the Variance

The calculation of the variance is performed in a similar manner. First, differentiate the moment generating function again, and then evaluate this derivative at t = 0. Here you'll see that

M’’(t) = n(n – 1)(pe^t)^2 [(1 – p) + pe^t]^(n – 2) + n(pe^t)[(1 – p) + pe^t]^(n – 1).

To calculate the variance of this random variable, you need to evaluate this at t = 0. Here you have

M’’(0) = n(n – 1)p^2 + np.

The variance σ^2 of your distribution is

σ^2 = M’’(0) – [M’(0)]^2 = n(n – 1)p^2 + np – (np)^2 = np(1 – p).

Although this method is somewhat involved, it is not as complicated as calculating the mean and variance directly from the probability mass function.
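The derivative results M’(0) = np and M’’(0) = n(n – 1)p^2 + np can be cross-checked against the mean and E[X^2] computed directly from the probability mass function. A short sketch, again with illustrative values of n and p chosen for the example:

```python
from math import comb

# Illustrative parameters (not from the article).
n, p = 10, 0.3

# The binomial pmf f(x) = C(n, x) p^x (1 - p)^(n - x) for x = 0..n.
pmf = [comb(n, x) * p**x * (1 - p)**(n - x) for x in range(n + 1)]

# Direct computation of E[X] and E[X^2] from the pmf.
mean_direct = sum(x * pmf[x] for x in range(n + 1))
ex2_direct = sum(x**2 * pmf[x] for x in range(n + 1))
var_direct = ex2_direct - mean_direct**2

# Closed forms from the moment generating function.
mean_mgf = n * p                     # M'(0) = np
ex2_mgf = n * (n - 1) * p**2 + n * p # M''(0) = n(n-1)p^2 + np
var_mgf = n * p * (1 - p)            # M''(0) - [M'(0)]^2 = np(1-p)

print(mean_direct, mean_mgf)  # both equal np
print(ex2_direct, ex2_mgf)    # both equal n(n-1)p^2 + np
print(var_direct, var_mgf)    # both equal np(1-p)
```

The direct sums and the moment-generating-function formulas agree (up to floating-point rounding), which is exactly the shortcut the article describes: two quick derivative evaluations replace the "tricky juggling of algebra and summations."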