Convergence in Probability Implies Convergence in Distribution

A sequence of random variables \left< X_n \right> is convergent in probability to a random variable X if, for every \epsilon > 0,

\lim_{n\rightarrow \infty} P[ |X_n - X| > \epsilon] = 0
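As a concrete illustration (a hypothetical example, not part of the proof), take X standard normal and X_n = X + Z/n with Z standard normal and independent of X. Then |X_n - X| = |Z|/n, and the tail probability in the definition can be estimated by Monte Carlo:

```python
import random

random.seed(0)

def tail_prob(n, eps=0.1, trials=20000):
    """Monte Carlo estimate of P(|X_n - X| > eps) for the
    hypothetical sequence X_n = X + Z/n, with X, Z iid N(0, 1)."""
    count = 0
    for _ in range(trials):
        x = random.gauss(0, 1)
        z = random.gauss(0, 1)
        if abs((x + z / n) - x) > eps:
            count += 1
    return count / trials

p2, p10, p50 = tail_prob(2), tail_prob(10), tail_prob(50)
```

Since |X_n - X| = |Z|/n, the estimates fall towards 0 as n grows, matching the definition.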

It is convergent in distribution if, at every point x at which F is continuous,

\lim_{n \rightarrow \infty} F_n(x) = F(x)

where F_n is the distribution function of X_n and F is the distribution function of X.
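Continuing the hypothetical example X_n = X + Z/n with X, Z iid standard normal: here X_n \sim N(0, 1 + 1/n^2), so F_n(x) should approach the standard normal distribution function \Phi(x) at every x. A Monte Carlo sketch of F_n at x = 0.5:

```python
import random
import math

random.seed(1)

def phi(x):
    # standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def ecdf_at(n, x=0.5, trials=50000):
    """Empirical estimate of F_n(x) = P(X_n <= x) for the
    hypothetical sequence X_n = X + Z/n, X, Z iid N(0, 1)."""
    hits = sum(1 for _ in range(trials)
               if random.gauss(0, 1) + random.gauss(0, 1) / n <= x)
    return hits / trials

# both estimates sit close to the limiting value Phi(0.5)
err5 = abs(ecdf_at(5) - phi(0.5))
err50 = abs(ecdf_at(50) - phi(0.5))
```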

Convergence in probability implies convergence in distribution.

To see this, fix \epsilon > 0 and a point x. Since

X \le x - \epsilon \leftrightarrow (X \le x - \epsilon \wedge X_n \le x) \vee (X \le x - \epsilon \wedge X_n > x) \\ \rightarrow X_n \le x \vee X_n - X > \epsilon

X_n \le x \leftrightarrow (X_n \le x \wedge X \le x + \epsilon) \vee (X_n \le x \wedge X > x + \epsilon) \\ \rightarrow X \le x + \epsilon \vee X - X_n > \epsilon

taking probabilities and using subadditivity gives

F(x - \epsilon) \le F_n (x) + P(|X_n - X| > \epsilon)

F_n(x) \le F(x + \epsilon) + P(|X_n - X| > \epsilon)

Letting n \rightarrow \infty and using P[|X_n - X| > \epsilon] \rightarrow 0 (convergence in probability),

F(x - \epsilon) \le \liminf_n F_n(x)

\limsup_n F_n(x) \le F(x + \epsilon)

Let F be continuous at x. Then \lim_{\epsilon \downarrow 0} F(x \pm \epsilon) = F(x). Since \liminf_n F_n(x) \le \limsup_n F_n(x) it follows that

F(x) \le \liminf_n F_n(x) \le \limsup_n F_n(x) \le F(x)

and so \lim_{n\rightarrow \infty} F_n(x) = F(x). That is, the sequence of random variables is convergent in distribution.
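The two key inequalities in the proof can be checked exactly for the hypothetical sequence X_n = X + Z/n (X, Z iid standard normal): here F = \Phi, F_n is the N(0, 1 + 1/n^2) distribution function, and P(|X_n - X| > \epsilon) = P(|Z| > n\epsilon).

```python
import math

def norm_cdf(x, sd=1.0):
    # CDF of N(0, sd^2) via the error function
    return 0.5 * (1.0 + math.erf(x / (sd * math.sqrt(2.0))))

def inequalities_hold(n, x=0.3, eps=0.2):
    """Check F(x - eps) <= F_n(x) + tail and F_n(x) <= F(x + eps) + tail
    for the hypothetical sequence X_n = X + Z/n, X, Z iid N(0, 1)."""
    sd_n = math.sqrt(1.0 + 1.0 / n**2)       # X_n ~ N(0, 1 + 1/n^2)
    tail = 2.0 * (1.0 - norm_cdf(eps * n))   # P(|Z|/n > eps) = P(|Z| > n*eps)
    f_n_x = norm_cdf(x, sd_n)
    return (norm_cdf(x - eps) <= f_n_x + tail
            and f_n_x <= norm_cdf(x + eps) + tail)

ok = all(inequalities_hold(n) for n in (1, 2, 5, 10, 100))
```

As n grows the tail term vanishes, squeezing F_n(x) between F(x - \epsilon) and F(x + \epsilon), exactly as in the proof.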
