A sequence of random variables $X_1, X_2, \ldots$ is convergent in probability to a random variable $X$ if for any $\varepsilon > 0$,
$\lim_{n \to \infty} P(|X_n - X| > \varepsilon) = 0$.
It is convergent in distribution if for all points $x$ at which $F$ is continuous,
$\lim_{n \to \infty} F_n(x) = F(x)$,
where $F_n$ is the distribution function of $X_n$ and $F$ is the distribution function of $X$.
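As a numerical sketch of the first definition (not from the source): assume, hypothetically, $X \sim N(0,1)$ and $X_n = X + Z/n$ with $Z \sim N(0,1)$ independent of $X$. Then $|X_n - X| = |Z|/n$, so $P(|X_n - X| > \varepsilon) \to 0$ and $X_n \to X$ in probability. A Monte Carlo estimate of that probability should shrink as $n$ grows:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000   # Monte Carlo sample size
eps = 0.1     # the epsilon in the definition

# Hypothetical example: X ~ N(0,1), X_n = X + Z/n with Z ~ N(0,1),
# so |X_n - X| = |Z|/n and P(|X_n - X| > eps) -> 0 as n grows.
x = rng.standard_normal(N)
z = rng.standard_normal(N)

for n in (1, 10, 100):
    x_n = x + z / n
    p = np.mean(np.abs(x_n - x) > eps)   # estimate of P(|X_n - X| > eps)
    print(f"n={n:3d}  P(|X_n - X| > {eps}) ~ {p:.4f}")
```

For $n = 100$ the event $|Z| > n\varepsilon = 10$ is vanishingly rare, so the estimated probability is essentially zero.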
Convergence in probability implies convergence in distribution.
Since, for any $\varepsilon > 0$, $\{X_n \le x\} \subseteq \{X \le x + \varepsilon\} \cup \{|X_n - X| > \varepsilon\}$ and $\{X \le x - \varepsilon\} \subseteq \{X_n \le x\} \cup \{|X_n - X| > \varepsilon\}$,
so
$F(x - \varepsilon) - P(|X_n - X| > \varepsilon) \le F_n(x) \le F(x + \varepsilon) + P(|X_n - X| > \varepsilon)$.
Let $F$ be continuous at $x$. Then, letting $n \to \infty$,
$F(x - \varepsilon) \le \liminf_{n} F_n(x) \le \limsup_{n} F_n(x) \le F(x + \varepsilon)$.
Since $\varepsilon > 0$ is arbitrary and $F$ is continuous at $x$, it follows that
$\liminf_{n} F_n(x) = \limsup_{n} F_n(x) = F(x)$,
and so $F_n(x) \to F(x)$. That is, the sequence of random variables is convergent in distribution.
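The conclusion $F_n(x) \to F(x)$ can also be checked numerically. A sketch under an assumed setup (not from the source): take $X \sim N(0,1)$ and $X_n = X + Z/n$ with $Z \sim N(0,1)$ independent, so $X_n \to X$ in probability; the empirical value of $F_n(x)$ at a continuity point $x$ should then approach $F(x) = \Phi(x)$:

```python
import math
import numpy as np

rng = np.random.default_rng(1)
N = 200_000
x0 = 0.5   # a continuity point of F (the normal CDF is continuous everywhere)

def phi(x):
    # standard normal CDF, via the error function
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# Assumed setup: X ~ N(0,1), X_n = X + Z/n, so X_n -> X in probability.
x = rng.standard_normal(N)
z = rng.standard_normal(N)

for n in (1, 10, 100):
    x_n = x + z / n
    f_n = np.mean(x_n <= x0)   # empirical estimate of F_n(x0)
    print(f"n={n:3d}  F_n({x0}) ~ {f_n:.4f}   F({x0}) = {phi(x0):.4f}")
```

Here $X_1 \sim N(0,2)$, so $F_1(0.5)$ is visibly below $\Phi(0.5) \approx 0.6915$, while by $n = 100$ the gap is within Monte Carlo error, matching the theorem.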
