Not all random variables have moment-generating functions; when it exists, however, the moment-generating function provides an alternative specification of the probability distribution of a random variable.

In probability theory, the moment-generating function of a random variable X is defined as

 M_X(t) := E\left[e^{tX}\right], \quad t \in \mathbb{R},

wherever this expectation exists. In particular, M_X(0) = E[e^{0}] = 1 for every random variable.

For this definition to make sense, e^{tX} must have finite expectation (equivalently, when a density f exists, e^{tx}f(x) must be absolutely integrable). More generally, for an n-dimensional random vector \mathbf X = ( X_1, \ldots, X_n), the moment-generating function is defined as

 M_{\mathbf X}(\mathbf t) := E\left[e^{\mathbf t^\mathrm T\mathbf X}\right], \quad \mathbf t \in \mathbb{R}^n.
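As a small illustration of the vector definition (my own example, not from the text): when the components of \mathbf X are independent, the joint MGF factorizes into the product of the component MGFs. The check below, for two independent standard normals, is a Monte Carlo sketch.

```python
# Sketch (my illustration): for a vector with independent components the MGF
# factorizes, M_X(t) = prod_i M_{X_i}(t_i). Monte Carlo check for two
# independent standard normals, whose individual MGFs are exp(t_i^2 / 2).
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((200_000, 2))   # rows are samples of (X1, X2)
t = np.array([0.3, -0.5])

mc = np.mean(np.exp(X @ t))             # E[exp(t^T X)] by simulation
closed = np.exp(0.5 * (t**2).sum())     # prod_i exp(t_i^2 / 2)
print(mc, closed)
```

The two printed values agree up to Monte Carlo error.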

Expanding the exponential function as a power series,
e^{tX} = 1 + tX + \frac{t^2X^2}{2!} + \frac{t^3X^3}{3!} + \cdots +\frac{t^nX^n}{n!} + \cdots.

So:
M_X(t) = E(e^{tX}) = 1 + tm_1 + \frac{t^2m_2}{2!} + \frac{t^3m_3}{3!}+\cdots + \frac{t^nm_n}{n!}+\cdots,

where m_i = E[X^i] is the i-th moment of X.
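This is why the function "generates" moments: the n-th moment is n! times the coefficient of t^n in the series. A symbolic sketch (my own example, using the standard normal, whose MGF is exp(t^2/2)):

```python
# Sketch (my example): recover moments of the standard normal from the Taylor
# coefficients of its MGF, M(t) = exp(t^2 / 2), via m_n = n! * [t^n] M(t).
import sympy as sp

t = sp.symbols('t')
M = sp.exp(t**2 / 2)                       # MGF of N(0, 1)

poly = sp.series(M, t, 0, 7).removeO()     # Taylor polynomial up to t^6
moments = [sp.factorial(n) * poly.coeff(t, n) for n in range(7)]
print(moments)  # [1, 0, 1, 0, 3, 0, 15] -- the moments E[X^n] of N(0, 1)
```

Equivalently, m_n = M^{(n)}(0), the n-th derivative of the MGF at zero.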

 

If X has cumulative distribution function F (and, when it exists, probability density function f(x)), the moment-generating function can be written as the Riemann–Stieltjes integral

M_X(t) = \int_{-\infty}^\infty e^{tx}\,dF(x).
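The Riemann–Stieltjes form covers discrete distributions too: there the integral reduces to a sum over the jump points of F. A sketch (my own example, using Poisson(λ), whose closed-form MGF is exp(λ(e^t − 1))):

```python
# Sketch (my example): for a discrete X the Stieltjes integral is a sum,
# M(t) = sum_k e^{tk} P(X = k). Check against exp(lam * (e^t - 1)) for Poisson.
import math

def poisson_mgf(t, lam, terms=60):
    total = 0.0
    term = math.exp(-lam)              # P(X = 0)
    for k in range(terms):
        total += math.exp(t * k) * term
        term *= lam / (k + 1)          # P(X = k+1) from P(X = k)
    return total

lam, t = 3.0, 0.4
print(poisson_mgf(t, lam), math.exp(lam * (math.exp(t) - 1)))
```

Both printed values agree; the truncated sum converges quickly because the Poisson tail decays factorially.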

 

When a density f exists, M_X(-t) is the two-sided Laplace transform of f(x). Expanding the exponential inside the integral,

\begin{align}
M_X(t) & = \int_{-\infty}^\infty e^{tx} f(x)\,dx \\
& = \int_{-\infty}^\infty \left( 1+ tx + \frac{t^2x^2}{2!} + \cdots + \frac{t^nx^n}{n!} + \cdots\right) f(x)\,dx \\
& = 1 + tm_1 + \frac{t^2m_2}{2!} +\cdots + \frac{t^nm_n}{n!} +\cdots,
\end{align}

which recovers the moment expansion above.

 

For the normal (Gaussian) distribution with mean \mu and variance \sigma^2, the MGF is M_X(t) = \exp\left( \mu t + \frac{1}{2}\sigma^2 t^2 \right).

 

Finally, the characteristic function of a random variable X is

    \varphi_X(t) = \operatorname{E}\left[e^{itX}\right],

which is related to the MGF by

    \varphi_X(-it) = M_X(t).

Unlike the MGF, the characteristic function exists for every random variable, because |e^{itX}| = 1 is bounded.
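The relation φ_X(−it) = M_X(t) can be verified symbolically. A sketch (my own example, for X ~ N(μ, σ²), whose characteristic function is exp(iμt − σ²t²/2)):

```python
# Sketch (my example): check phi_X(-i t) = M_X(t) symbolically for a normal
# random variable X ~ N(mu, sigma^2).
import sympy as sp

t = sp.symbols('t', real=True)
mu = sp.symbols('mu', real=True)
sigma = sp.symbols('sigma', positive=True)

phi = sp.exp(sp.I * mu * t - sigma**2 * t**2 / 2)   # characteristic function
mgf = sp.exp(mu * t + sigma**2 * t**2 / 2)          # moment-generating function

# substituting t -> -i*t in the CF recovers the MGF
assert sp.simplify(phi.subs(t, -sp.I * t) - mgf) == 0
```

Note the converse does not hold: the Cauchy distribution has a characteristic function (e^{−|t|}) but no MGF.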

 

 

 
