The proof and intuition presented here come from an excellent writeup by Yuval Filmus, which in turn draws upon ideas in the book by Fumio Hiai and Dénes Petz. Suppose that we have a sequence of real-valued random variables

$$X_1, X_2, X_3, \dots \tag{1}$$

Define the random variable

$$S_n = \frac{X_1 + X_2 + \dots + X_n}{\sqrt{n}} \tag{2}$$

to be a scaled sum of the first $n$ variables in the sequence. Now, we would like to make interesting statements about the sequence

$$S_1, S_2, S_3, \dots \tag{3}$$

The central limit theorem is quite general. To simplify this exposition, I will make a number of assumptions. First, I will assume that the $X_i$ are independent and identically distributed. Second, I will assume that each $X_i$ has mean $0$ and variance $1$. Finally, I will assume that the moment generating function (to be defined below) converges (this condition requires all moments of the distribution to exist).

Under these conditions, the central limit theorem tells us that

$$S_n \Longrightarrow \mathcal{N}(0, 1) \tag{4}$$

where $\mathcal{N}(0, 1)$ is the normal distribution with density function

$$f(x) = \frac{1}{\sqrt{2\pi}}\, e^{-x^2/2} \tag{5}$$

and where $S_n \Longrightarrow \mathcal{N}(0,1)$ means that $\Pr[S_n \in I] \to \Pr[N \in I]$ for all intervals $I$, where $N$ is a standard normal random variable. It is not immediately obvious that the sequence should converge, but if it does converge, the normal distribution is the natural candidate. Suppose it converges to some distribution $D$. Now, consider the random variable

$$\frac{X_1 + \dots + X_{2n}}{\sqrt{2n}} = \frac{1}{\sqrt{2}} \left( \frac{X_1 + \dots + X_n}{\sqrt{n}} + \frac{X_{n+1} + \dots + X_{2n}}{\sqrt{n}} \right) \tag{6}$$

Of course, the left-hand side is just $S_{2n}$, so it also converges to $D$. But the first term inside the parentheses on the RHS is $S_n$, and the second term on the RHS has the same distribution as $S_n$, so we have

$$D \stackrel{d}{=} \frac{D_1 + D_2}{\sqrt{2}} \tag{7}$$

where the two $D$'s ($D_1$ and $D_2$) are independent random variables with the same distribution $D$. So $D$ and $(D_1 + D_2)/\sqrt{2}$ must have the same distribution. By grouping terms in different proportions, we can derive similar properties of $D$. Since the normal distribution satisfies

$$\frac{N_1 + N_2}{\sqrt{2}} \sim \mathcal{N}(0, 1) \quad \text{for independent } N_1, N_2 \sim \mathcal{N}(0, 1), \tag{8}$$

it is a natural candidate for the distribution $D$.
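The stability property (8) is easy to check numerically. Here is a quick simulation sketch (plain Python; the variable names such as `n_samples` are my own):

```python
import random
import statistics

# Check numerically that (N1 + N2) / sqrt(2) looks like a standard
# normal when N1, N2 are independent standard normals: its sample
# mean should be near 0 and its sample variance near 1.
random.seed(0)
n_samples = 100_000
z = [(random.gauss(0, 1) + random.gauss(0, 1)) / 2 ** 0.5
     for _ in range(n_samples)]

print(round(statistics.mean(z), 2))      # near 0
print(round(statistics.variance(z), 2))  # near 1
```

Sampling from the distribution of $(N_1 + N_2)/\sqrt{2}$ gives the same mean and variance as a single standard normal, which is what (8) asserts.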

To prove the central limit theorem, we will make use of the moment generating function

$$M_X(t) = \mathbb{E}\left[e^{tX}\right] = \sum_{k=0}^{\infty} \frac{m_k}{k!}\, t^k \tag{9}$$

and the cumulant generating function

$$K_X(t) = \log M_X(t) = \sum_{k=1}^{\infty} \frac{\kappa_k}{k!}\, t^k \tag{10}$$

The coefficients $m_k$ and $\kappa_k$ of $t^k/k!$ in the moment generating function and the cumulant generating function are referred to as “moments” and “cumulants” respectively. The cumulants and the moments are closely related, and the values of one determine the values of the other. Incidentally, the moment generating function and the cumulant generating function of the standard normal distribution are given by

$$M_N(t) = e^{t^2/2}, \qquad K_N(t) = \frac{t^2}{2} \tag{11}$$
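The first few moment–cumulant relations are standard: $\kappa_1 = m_1$, $\kappa_2 = m_2 - m_1^2$ (the variance), and $\kappa_3 = m_3 - 3 m_1 m_2 + 2 m_1^3$. As a sketch, we can check them exactly for a fair coin flip (the function names here are my own):

```python
# First three cumulants from raw moments, using the standard relations
# kappa1 = m1, kappa2 = m2 - m1^2, kappa3 = m3 - 3*m1*m2 + 2*m1^3.
# Checked exactly on a fair coin (X uniform on {0, 1}).

outcomes = [0, 1]  # fair coin, each outcome with probability 1/2

def raw_moment(k):
    # E[X^k] by direct enumeration over the outcome space
    return sum(x ** k for x in outcomes) / len(outcomes)

m1, m2, m3 = raw_moment(1), raw_moment(2), raw_moment(3)
kappa1 = m1
kappa2 = m2 - m1 ** 2
kappa3 = m3 - 3 * m1 * m2 + 2 * m1 ** 3

print(kappa1, kappa2, kappa3)  # 0.5 0.25 0.0
```

The third cumulant vanishes here because the fair coin is symmetric about its mean, which matches the known formula $\kappa_3 = p(1-p)(1-2p)$ for a Bernoulli($p$) variable at $p = 1/2$.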

Note that for independent $X$ and $Y$ and any constant $c$, the moment generating function satisfies

$$M_{X+Y}(t) = M_X(t)\, M_Y(t), \qquad M_{cX}(t) = M_X(ct). \tag{12}$$

It follows that the cumulant generating function satisfies

$$K_{X+Y}(t) = K_X(t) + K_Y(t), \qquad K_{cX}(t) = K_X(ct). \tag{13}$$
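The identities (12) and (13) can be verified exactly, without any sampling, for a small discrete example — say the sum of two independent fair dice (a sketch; the helper names are my own):

```python
import math

# Verify M_{X+Y}(t) = M_X(t) * M_Y(t), and hence
# K_{X+Y}(t) = K_X(t) + K_Y(t), for two independent fair dice,
# by exact enumeration of all 36 outcome pairs.

die = range(1, 7)

def mgf(values, t):
    # E[e^{tX}] for X uniform over `values`
    return sum(math.exp(t * x) for x in values) / len(values)

t = 0.3
m_x = mgf(die, t)
m_sum = sum(math.exp(t * (x + y)) for x in die for y in die) / 36

assert abs(m_sum - m_x * m_x) < 1e-12                     # MGFs multiply
assert abs(math.log(m_sum) - 2 * math.log(m_x)) < 1e-12   # CGFs add
```

The multiplicativity in (12) is just $\mathbb{E}[e^{t(X+Y)}] = \mathbb{E}[e^{tX}]\,\mathbb{E}[e^{tY}]$ for independent $X, Y$, and taking logarithms turns the product into the sum in (13).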

Now, we are going to use all of these tools. Let’s inspect the cumulant generating function of $S_n$. Using both parts of (13), we have

$$K_{S_n}(t) = \sum_{i=1}^{n} K_{X_i}\!\left(\frac{t}{\sqrt{n}}\right) = n\, K_X\!\left(\frac{t}{\sqrt{n}}\right). \tag{14}$$

Let $\kappa_k$ be the $k$th cumulant of $X$, i.e. the coefficient of $t^k/k!$ in $K_X(t)$. Equating powers of $t$ above, we get

$$\kappa_k(S_n) = n \cdot \frac{\kappa_k}{n^{k/2}} = \frac{\kappa_k}{n^{k/2 - 1}}. \tag{15}$$

From the case $k = 1$, we see that

$$\kappa_1(S_n) = \sqrt{n}\,\kappa_1 = 0 \tag{16}$$

as expected, since each $X_i$ has mean $0$. From the case $k = 2$, we see that

$$\kappa_2(S_n) = \kappa_2 = 1, \tag{17}$$

also as expected, since each $X_i$ has variance $1$. But what about higher cumulants? For $k \geq 3$, as $n \to \infty$, we have

$$\kappa_k(S_n) = \frac{\kappa_k}{n^{k/2 - 1}} \to 0. \tag{18}$$

Therefore, the higher cumulants all vanish in the limit. It follows that the cumulants of the sequence $S_1, S_2, S_3, \dots$ converge to the cumulants of $\mathcal{N}(0, 1)$. Therefore, the moments of the sequence converge to the moments of $\mathcal{N}(0, 1)$. It follows from Lévy’s continuity theorem that $S_n \Longrightarrow \mathcal{N}(0, 1)$, as desired.
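The decay rate in (15) can be seen in simulation. A standardized exponential variable ($X - 1$ for $X \sim \mathrm{Exp}(1)$) has mean $0$, variance $1$, and third cumulant $\kappa_3 = 2$, so (15) predicts that the third cumulant of $S_n$ is $2/\sqrt{n}$. Here is a Monte Carlo sketch of that prediction (function and variable names are my own):

```python
import random
import statistics

# Monte Carlo check of (15) for k = 3: for standardized Exp(1)
# variables (mean 0, variance 1, kappa_3 = 2), the third cumulant
# of S_n should be 2 / sqrt(n), shrinking toward 0 as n grows.
random.seed(0)

def third_cumulant_of_S(n, n_samples=50_000):
    samples = []
    for _ in range(n_samples):
        s = sum(random.expovariate(1) - 1 for _ in range(n)) / n ** 0.5
        samples.append(s)
    mu = statistics.mean(samples)
    # For any distribution, kappa_3 equals the third central moment.
    return statistics.mean((x - mu) ** 3 for x in samples)

print(round(third_cumulant_of_S(4), 2))   # near 2 / sqrt(4)  = 1.0
print(round(third_cumulant_of_S(64), 2))  # near 2 / sqrt(64) = 0.25
```

Quadrupling $n$ roughly halves the third cumulant, exactly the $n^{-1/2}$ decay that (15) predicts for $k = 3$; the higher the cumulant, the faster it dies off.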
