# Aside

He will post the last assignment today

# Continuous uniform distribution

\(X \sim U(a,b)\)

\(E(X) = \int\limits_a^b x \frac 1 {b-a} dx = \frac {a+b} 2\)

In similar fashion, we can find \(E(X^2) = \int\limits_a^b x^2 \frac 1 {b-a} dx = \frac {a^2+ab+b^2} 3\), and hence \(Var(X) = E(X^2) - (E(X))^2 = \frac {(b-a)^2} {12}\)
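As a quick sanity check, both integrals above can be evaluated numerically; a minimal Python sketch using Simpson's rule (the endpoints \(a=2\), \(b=5\) are an arbitrary choice for the check):

```python
def simpson(f, a, b, n=1000):
    """Composite Simpson's rule on [a, b] with an even number n of subintervals."""
    h = (b - a) / n
    s = f(a) + f(b)
    s += 4 * sum(f(a + (2 * i - 1) * h) for i in range(1, n // 2 + 1))
    s += 2 * sum(f(a + 2 * i * h) for i in range(1, n // 2))
    return s * h / 3

a, b = 2.0, 5.0                       # arbitrary endpoints for the check
pdf = lambda x: 1 / (b - a)           # Uniform(a, b) density
EX = simpson(lambda x: x * pdf(x), a, b)
EX2 = simpson(lambda x: x ** 2 * pdf(x), a, b)
var = EX2 - EX ** 2

print(EX)    # matches (a + b) / 2 = 3.5
print(var)   # matches (b - a)**2 / 12 = 0.75
```

Simpson's rule is exact for polynomials of degree up to three, so both results agree with the closed forms to floating-point precision.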

# Gamma distribution

\(E(X) = \int\limits_0^\infty x \frac 1 {\Gamma(\alpha)\beta^\alpha} x^{\alpha-1} e ^{-\frac x \beta}dx\)

The idea is to reduce the integral to a gamma function.

\(E(X) = \frac 1 {\Gamma(\alpha)\beta^\alpha} \int\limits_0^\infty x^{(\alpha+1)-1}e^{-\frac x \beta}dx\)

Let \(y=\frac x \beta\), so that \(x=\beta y\) and \(dx=\beta\, dy\). As \(x\) goes from \(0\) to \(\infty\), so does \(y\). Then we get:

\[\begin{aligned} E(X) &= \frac 1 {\Gamma(\alpha)\beta^\alpha} \int\limits_0^\infty y^{(\alpha+1)-1} \beta^\alpha \beta e^{-y} dy \\ &= \frac \beta {\Gamma(\alpha)} \Gamma(\alpha+1) \\ &= \frac \beta {\Gamma(\alpha)} \alpha\Gamma(\alpha) \\ &= \alpha\beta \end{aligned}\]

In similar fashion, we find \(E(X^2) = \alpha(\alpha+1)\beta^2\) and hence \(Var(X) = \alpha\beta^2\). One special case follows immediately.

For the exponential distribution ( \(\alpha = 1\) )

\(E(X) = \beta\), \(Var(X) = \beta^2\)

If \(X\sim\chi^2_k\), then setting \(\alpha = \frac k 2\) and \(\beta = 2\) gives \(E(X) = \frac k 2 \cdot 2 = k\) and \(Var(X) = \frac k 2 \cdot 4 = 2k\)
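The gamma moments can be checked numerically too; a sketch with arbitrarily chosen \(\alpha=3\), \(\beta=2\), truncating the integral at an upper limit where the tail is negligible:

```python
import math

alpha, beta = 3.0, 2.0                 # arbitrary shape/scale for the check

def gamma_pdf(x):
    """Gamma(alpha, beta) density in the scale parameterisation used above."""
    return x ** (alpha - 1) * math.exp(-x / beta) / (math.gamma(alpha) * beta ** alpha)

# crude Riemann sum on (0, 200]; the tail beyond 200 is negligible here
n, upper = 200000, 200.0
h = upper / n
xs = [i * h for i in range(1, n + 1)]
EX = h * sum(x * gamma_pdf(x) for x in xs)
EX2 = h * sum(x ** 2 * gamma_pdf(x) for x in xs)

print(EX)            # ≈ alpha * beta = 6
print(EX2 - EX**2)   # ≈ alpha * beta**2 = 12
```

Rerunning with \(\alpha = \frac k 2\), \(\beta = 2\) reproduces the \(\chi^2_k\) values \(k\) and \(2k\).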

# The normal distribution

By definition,

\(E(X) = \int\limits_{-\infty}^\infty x \frac{1}{\sqrt{2\pi\sigma^2}}\exp\left\{-\frac{\left(x-\mu\right)^2}{2\sigma^2}\right\}dx = \mu\)

This is calculated using integration by parts, which he didn’t show and will not ask on the exam. :D

Similarly for \(E(X^2)\), and hence \(Var(X) = \sigma^2\)
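Both normal moments can be verified numerically as well; a sketch with arbitrary \(\mu = 1.5\), \(\sigma = 2\), integrating over \(\pm 10\sigma\) where essentially all the mass lies:

```python
import math

mu, sigma = 1.5, 2.0                   # arbitrary parameters for the check
pdf = lambda x: math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / math.sqrt(2 * math.pi * sigma ** 2)

# Riemann sum over [mu - 10*sigma, mu + 10*sigma]; the tails beyond are negligible
n = 100000
lo, hi = mu - 10 * sigma, mu + 10 * sigma
h = (hi - lo) / n
xs = [lo + i * h for i in range(n + 1)]
EX = h * sum(x * pdf(x) for x in xs)
EX2 = h * sum(x ** 2 * pdf(x) for x in xs)

print(EX)             # ≈ mu = 1.5
print(EX2 - EX ** 2)  # ≈ sigma**2 = 4
```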

# Geometric distribution

\(E(X) = \sum\limits_{x=1}^\infty x (1-p)^{x-1}p\)

This is rather difficult to evaluate directly. However, note that \(-\frac d {dp} (1-p)^x = x(1-p)^{x-1}\)

Therefore

\[\begin{aligned} E(X) &= -p \sum\limits_{x=1}^\infty \frac d {dp} (1-p)^x\\ &= -p \frac d {dp} \sum\limits_{x=1}^\infty (1-p)^x\\ &= -p \frac d {dp} \frac {(1-p)^1} {1-(1-p)}\\ &= -p \frac d {dp} \frac {1-p} {p} \\ &= -p\left(-\frac 1 {p^2}\right) = \frac 1 p \end{aligned}\]

To find \(E(X^2)\), we first find \(E(X(X-1))\) so that the differentiation trick works. Thus we find \(Var(X) = \frac {1-p} {p^2}\)
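The series converges geometrically, so both moments can be checked by truncating the sum; a sketch with an arbitrary \(p = 0.3\):

```python
p = 0.3                                # arbitrary success probability for the check
N = 2000                               # terms decay geometrically, so this is plenty
EX = sum(x * (1 - p) ** (x - 1) * p for x in range(1, N + 1))
EX2 = sum(x ** 2 * (1 - p) ** (x - 1) * p for x in range(1, N + 1))

print(EX)             # ≈ 1 / p
print(EX2 - EX ** 2)  # ≈ (1 - p) / p**2
```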

# Moment Generating Function

Let \(X\) be a r.v.; then we define the moment generating function (mgf) of \(X\) to be \(E(e^{tX})\), denoted by \(M_X(t)\)

# Note

\(M_X(t)\) is only well defined if \(E(e^{tX})\) is finite in an interval around \(t=0\)

So there are indeed distributions that do not have an mgf

We have \(M_X(t) = \int\limits_{-\infty}^{\infty} e^{tx} f_X(x)dx\) in the continuous case

In the discrete case, \(M_X(t) = \sum\limits_{all\, x} e^{tx} P(X=x)\)
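Both formulas can be exercised directly; a sketch computing an exponential mgf numerically (its known closed form is \(\frac 1 {1-\beta t}\) for \(t < \frac 1 \beta\)) and a Bernoulli mgf from the two-term sum, with all parameter values chosen arbitrarily:

```python
import math

# Continuous case: X ~ Exponential(beta); known mgf is 1/(1 - beta*t) for t < 1/beta
beta, t = 2.0, 0.25
n, upper = 100000, 400.0               # truncate the integral where the tail is negligible
h = upper / n
M_exp = h * sum(math.exp(t * x) * math.exp(-x / beta) / beta
                for x in (i * h for i in range(1, n + 1)))
print(M_exp)                           # ≈ 1 / (1 - beta * t) = 2

# Discrete case: X ~ Bernoulli(p); the sum has just two terms
p = 0.4
M_bern = (1 - p) * math.exp(t * 0) + p * math.exp(t * 1)
print(M_bern)                          # equals 1 - p + p*e^t exactly
```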

It can be shown (by the Uniqueness Theorem) that there is a one-to-one correspondence between an mgf and its probability distribution, i.e., once you know \(M_X(t)\), there can be only one \(f_X\) or \(P(X=x)\) that corresponds to \(M_X\)

Because of this correspondence it is sometimes useful to work with the mgf rather than with the distribution, and vice versa.

It is easy to see that if \(a\) and \(b\) are constants, then

\(M_{aX+b}(t) = e^{tb} M_X(at)\)

Indeed, expanding the left-hand side:

\[\begin{aligned} E\left[e^{t(aX+b)}\right] &= E\left[e^{atX}e^{bt}\right]\\ &= e^{bt} E\left[ e^{atX} \right] \\ &= e^{bt}M_X(at) \end{aligned}\]
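A numeric spot-check of this identity, using the exponential distribution (whose mgf is \(\frac 1 {1-\beta t}\)); the values of \(\beta\), \(a\), \(b\), \(t\) are arbitrary as long as \(at < \frac 1 \beta\):

```python
import math

beta = 2.0
a, b, t = 3.0, 1.0, 0.1                # arbitrary, chosen so that a*t < 1/beta
pdf = lambda x: math.exp(-x / beta) / beta   # Exponential(beta) density

# left-hand side: E[e^{t(aX+b)}] by numeric integration (tail truncated)
n, upper = 100000, 400.0
h = upper / n
lhs = h * sum(math.exp(t * (a * x + b)) * pdf(x)
              for x in (i * h for i in range(1, n + 1)))

# right-hand side: e^{tb} M_X(at), using the exponential mgf 1/(1 - beta*t)
rhs = math.exp(t * b) / (1 - beta * a * t)

print(lhs, rhs)                        # both ≈ 2.76
```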

Reason for the name

THM: \(M_X^{(k)}(0) = E\left[X^k\right]\)

We need only carry out one integration (summation) to find \(M_X(t)\), and then we can obtain the moments by differentiation.

Proof (continous case)

\(M_X(t) = \int\limits_{-\infty}^\infty e^{tx} f_X(x)dx\)

Differentiating \(k\) times under the integral sign,

\(M_X^{(k)}(t) = \int\limits_{-\infty}^\infty x^k e^{tx} f_X(x)dx\)

and setting \(t=0\) gives \(M_X^{(k)}(0) = \int\limits_{-\infty}^\infty x^k f_X(x)dx = E\left[X^k\right]\)
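Numerically, the derivatives at \(0\) can be approximated by finite differences; a sketch using the exponential mgf \(M(t) = \frac 1 {1-\beta t}\), for which \(E(X)=\beta\) and \(E(X^2)=2\beta^2\) (the value \(\beta = 2\) is arbitrary):

```python
beta = 2.0
M = lambda t: 1 / (1 - beta * t)       # exponential mgf, valid for |t| < 1/beta
h = 1e-5

M1 = (M(h) - M(-h)) / (2 * h)              # central difference ≈ M'(0) = E(X)
M2 = (M(h) - 2 * M(0) + M(-h)) / h ** 2    # ≈ M''(0) = E(X^2)

print(M1)   # ≈ beta = 2
print(M2)   # ≈ 2 * beta**2 = 8
```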

# Application

- Find the mgf of a Binomial r.v. and use it to find \(E(X)\) and \(Var(X)\)

# Solution

\[\begin{aligned} M_X(t) &= \sum\limits_{x=0}^n e^{tx} {n \choose x} p^x (1-p)^{n-x}\\ &= \sum\limits_{x=0}^n {n \choose x} (e^t p)^x (1-p)^{n-x} \\ &= (pe^t + (1-p))^n \end{aligned}\]

We have

\(E(X) = M^\prime (0) = \left. n(pe^t + 1-p)^{n-1} pe^t \right|_{t=0} = np\)

Now, to find \(E(X^2)\), we compute \(M^{\prime\prime} (0)\)

We get

[…]
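The same finite-difference check on the derived binomial mgf confirms both moments, and with it \(Var(X) = np(1-p)\) (the values \(n=10\), \(p=0.3\) are arbitrary):

```python
import math

n_trials, p = 10, 0.3
M = lambda t: (p * math.exp(t) + 1 - p) ** n_trials   # the binomial mgf derived above
h = 1e-5

EX = (M(h) - M(-h)) / (2 * h)              # ≈ M'(0)  = np
EX2 = (M(h) - 2 * M(0) + M(-h)) / h ** 2   # ≈ M''(0) = E(X^2)

print(EX)             # ≈ 3.0 = n*p
print(EX2 - EX ** 2)  # ≈ 2.1 = n*p*(1-p)
```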