4 AM Math Post, Anyone?

Suppose two random variables X and Y have the discrete joint distribution p(x,y) = \frac{C}{x!(y - x)!} for y = 0, 1, 2, \ldots and x = 0, 1, 2, \ldots, y, where C is a real constant. Suppose we want to find the moment-generating function (MGF) of this distribution.

By definition, the MGF of this distribution is M(t_1, t_2) = E(e^{t_1 x} e^{t_2 y}), where E denotes the expected value. Since it's a discrete distribution, we write M as:

C \cdot \sum\limits_{y = 0}^\infty \sum\limits_{x = 0}^y \frac{e^{t_1 x} e^{t_2 y}}{x!(y - x)!} = C \cdot \sum\limits_{y = 0}^\infty \frac{1}{y!} \sum\limits_{x = 0}^y \frac{y! \, e^{t_1 x} e^{t_2 y}}{x!(y - x)!} \quad \text{(multiplying the inner terms by } \tfrac{y!}{y!} \text{)}

= C \cdot \sum\limits_{y = 0}^\infty \frac{1}{y!} \sum\limits_{x = 0}^y \binom{y}{x} e^{t_1 x} e^{t_2 y} … by the definition of the binomial coefficient.

Multiply through by \frac{e^{t_2 x}}{e^{t_2 x}} and rearrange terms to get:

C \cdot \sum\limits_{y = 0}^\infty \frac{1}{y!} \sum\limits_{x = 0}^y \binom{y}{x} \left( e^{t_1 + t_2} \right)^x \left( e^{t_2} \right)^{y - x}

The inner summation collapses by the binomial theorem, \sum\limits_{x = 0}^y \binom{y}{x} a^x b^{y - x} = (a + b)^y, with a = e^{t_1 + t_2} and b = e^{t_2}, and we get:

C \cdot \sum\limits_{y = 0}^\infty \frac{\left( e^{t_1} e^{t_2} + e^{t_2} \right)^y}{y!}

By the Taylor series expansion for the exponential function, e^z = \sum\limits_{y = 0}^\infty \frac{z^y}{y!}, we can rewrite this as:

C \cdot \exp \left( e^{t_1} e^{t_2} + e^{t_2} \right)
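And since the probabilities must sum to 1, setting t_1 = t_2 = 0 (so that M(0,0) = E(1) = 1) even pins down the constant:

C \cdot \exp \left( e^0 e^0 + e^0 \right) = C e^2 = 1 \quad \Rightarrow \quad C = e^{-2}

so the MGF is M(t_1, t_2) = \exp \left( e^{t_1} e^{t_2} + e^{t_2} - 2 \right).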

Isn’t multiplying by one such an awesome little trick?
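For the skeptical, here's a quick numerical sanity check of both key steps, sketched in Python with arbitrary sample values of t_1 and t_2 (the value C = e^{-2} comes from requiring the probabilities to sum to 1):

```python
import math

# Check the binomial-theorem collapse of the inner sum:
#   sum_x C(y, x) * a**x * b**(y - x) == (a + b)**y
# with a = e^(t1 + t2) and b = e^(t2), for sample values of t1, t2, y.
t1, t2, y = 0.2, -0.1, 7
a, b = math.exp(t1 + t2), math.exp(t2)
inner = sum(math.comb(y, x) * a**x * b**(y - x) for x in range(y + 1))
assert math.isclose(inner, (a + b) ** y)

# Check the closed form against a truncated version of the double sum.
# C = exp(-2) makes the probabilities sum to 1 (set t1 = t2 = 0 in the MGF).
C = math.exp(-2)
series = sum(
    C * math.exp(t1 * x + t2 * y) / (math.factorial(x) * math.factorial(y - x))
    for y in range(60)
    for x in range(y + 1)
)
closed = C * math.exp(math.exp(t1) * math.exp(t2) + math.exp(t2))
assert math.isclose(series, closed)
print("both checks pass")
```

Truncating the outer sum at y = 60 is more than enough here; the y! in the denominator kills the tail fast.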
