Arrival Times

An exponential random variable with rate \lambda > 0 is a continuous random variable X whose density is f(x)=\lambda e^{-\lambda x} \mathbb{I}(x\ge 0). Its distribution function is F(x) = \left(1 - e^{-\lambda x} \right) \mathbb{I}(x \ge 0). It can be used to model the arrival time of a single event. Note that P[X > t] = e^{-\lambda t} for t \ge 0.
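As a quick numerical check, here is a minimal Python sketch (the rate \lambda = 2 and the sample size are arbitrary illustrative choices) that samples X by inverse-transform sampling, using F^{-1}(u) = -\log(1-u)/\lambda, and compares the empirical tail probability with e^{-\lambda t}:

import numpy as np

rng = np.random.default_rng(0)
lam = 2.0          # rate parameter (illustrative choice)
n = 100_000

# Inverse-transform sampling: F^{-1}(u) = -log(1 - u) / lam
u = rng.uniform(size=n)
x = -np.log(1.0 - u) / lam

# Empirical tail probability P[X > t] versus the closed form e^{-lam t}
t = 0.7
print(np.mean(x > t), np.exp(-lam * t))   # both approximately 0.247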

An important property is the so-called memoryless property:

P[X > s + t | X > s] = P[X > t].

which states that given the event has not occurred in the interval [0,s], the probability that it does not occur in the interval [0,s+t] is identical to the probability that it does not occur in the interval [0,t]. The proof is simple:

P[X > s + t | X > s] = \frac{P[X > s+t]}{P[X>s]} = e^{-\lambda(s+t)}/e^{-\lambda s} = e^{-\lambda t} = P[X > t].
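The memoryless property is easy to check numerically; the following sketch (the rate and the thresholds s, t are arbitrary choices) estimates both sides of the identity from the same samples:

import numpy as np

rng = np.random.default_rng(0)
lam, s, t = 1.5, 0.4, 0.8                  # illustrative values
x = rng.exponential(1.0 / lam, size=1_000_000)

# Left side: P[X > s + t | X > s], estimated from the samples exceeding s
survivors = x[x > s]
lhs = np.mean(survivors > s + t)

# Right side: P[X > t], estimated directly
rhs = np.mean(x > t)
print(lhs, rhs)                            # both approximately e^{-lam t} = 0.301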

The expected arrival time is \mathbb{E} X = \lambda^{-1}. The moment generating function is \phi_X(t) = \mathbb{E} e^{tX} = \frac{\lambda}{\lambda-t}, defined for t < \lambda.
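Both facts can be verified by Monte Carlo; a small sketch with arbitrary parameter choices (note the expectation \mathbb{E} e^{tX} is finite only for t < \lambda):

import numpy as np

rng = np.random.default_rng(0)
lam, t = 2.0, 0.5                               # requires t < lam
x = rng.exponential(1.0 / lam, size=1_000_000)

print(x.mean(), 1.0 / lam)                      # expected arrival time 1/lam
print(np.mean(np.exp(t * x)), lam / (lam - t))  # MGF at t: both ~ 1.333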

Suppose events arrive with iid exponential(\lambda) inter-arrival times X_1, X_2, \ldots. The arrival time of the nth event is T_n = X_1 + X_2 + \ldots + X_n. The joint density of the first k arrival times is

f_{T_1,\ldots,T_k}(t_1,\ldots,t_k) = f_{X_1}(t_1) f_{X_2}(t_2 - t_1) \cdots f_{X_k}(t_k - t_{k-1}) = \lambda^k e^{-\lambda t_k} \mathbb{I}(0 < t_1 < \ldots < t_k)

The density of T_k could be calculated by integrating out t_1,\ldots,t_{k-1} from the joint density above, but it is more convenient to use the fact that T_k is a sum of k iid exponential random variables: its moment generating function is the k-fold product of the moment generating function of an exponential, which can then be identified as the moment generating function of a gamma(k,\lambda) random variable:

\phi_{T_k}(t) = \mathbb{E}e^{tT_k} = \prod_{i=1..k} \mathbb{E} e^{tX_i} = \left( \frac{\lambda}{\lambda - t}\right)^k

Hence

f_{T_k}(t) = \frac{\lambda^k t^{k-1}}{(k-1)!} e^{-\lambda t} \mathbb{I}(0 < t)
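A quick simulation check of this identification, assuming scipy is available (scipy parameterizes the gamma distribution by shape a = k and scale = 1/\lambda; the Kolmogorov-Smirnov test is used as a convenient goodness-of-fit check):

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
lam, k = 1.0, 5                 # illustrative rate and event index
n = 100_000

# T_k as the sum of k iid exponential inter-arrival times
tk = rng.exponential(1.0 / lam, size=(n, k)).sum(axis=1)

# Compare the samples against the gamma(k, lam) distribution
print(stats.kstest(tk, stats.gamma(a=k, scale=1.0 / lam).cdf))

A large p-value indicates the samples are consistent with the gamma(k,\lambda) density above.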

Order Statistics

Suppose it is known that n events occur in the time interval [0,t] and that the arrival times are independent and uniformly distributed, the ith event arriving at time U_i \in [0,t]. The joint density function is

f_{U_1,\ldots,U_n}(t_1,\ldots,t_n) = \frac{1}{t^n} \prod_{i=1..n} \mathbb{I}(0 < t_i < t).

However the sequence U_1,\ldots,U_n is not temporally ordered. Inductively defining i(k) = \arg \min \{ U_i : i \neq i(j) \text{ for all } j < k \}, the sequence Y_k = U_{i(k)} is ordered so that the index k is now associated with the arrival time of the kth event.

Since there are n! outcomes of U_1,\ldots,U_n that lead to the same ordering Y_1,\ldots,Y_n, and each of these outcomes has the same density, the ordered density is

f_{Y_1,\ldots,Y_n}(t_1,\ldots,t_n) = n! \frac{1}{t^n} \mathbb{I}(0 < t_1 < \ldots < t_n < t)
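A short sketch of the ordered sequence: it sorts the uniforms row-wise and checks the standard consequence \mathbb{E} Y_k = kt/(n+1), which is not derived above but follows directly from this density (parameter choices are arbitrary):

import numpy as np

rng = np.random.default_rng(0)
n, t = 4, 2.0                             # illustrative number of events and horizon
u = rng.uniform(0.0, t, size=(100_000, n))
y = np.sort(u, axis=1)                    # order statistics Y_1 < ... < Y_n per row

print(y.mean(axis=0))                     # approximately [0.4, 0.8, 1.2, 1.6]
print([k * t / (n + 1) for k in range(1, n + 1)])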

Relation of Order Statistics and Joint Arrivals

The joint density of T_1,\ldots,T_k can be re-written as

f_{T_1,\ldots,T_k}(t_1,\ldots,t_k) = \frac{\lambda^k t_k^{k-1}}{(k-1)!} e^{-\lambda t_k} \cdot \frac{(k-1)!}{t_k^{k-1}} \mathbb{I}(0 < t_1 < \ldots < t_k)

which is the product of a gamma(k,\lambda) density evaluated at t_k with the order-statistic density of k-1 uniform variables over the interval [0,t_k]. Hence the conditional distribution of T_1,\ldots,T_{k-1} given T_k = t is that of the order statistics of k-1 iid uniform([0,t]) random variables; equivalently, the joint distribution of T_1,\ldots,T_k can be produced by drawing t \sim \text{gamma}(k,\lambda) and distributing the remaining k-1 arrival times uniformly at random on [0,t].

Conversely, given U_1,\ldots,U_{k-1} iid uniform([0,1]) random variables and an independent t \sim \text{gamma}(k,\lambda), the sequence

tY_1, t(Y_2-Y_1), \ldots, t(Y_{k-1}-Y_{k-2}), t(1-Y_{k-1})

is a sequence of k iid exponential(\lambda) random variables, where Y_1 < \ldots < Y_{k-1} are the order statistics of the U_i.
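This converse can also be checked by simulation; the sketch below (assuming scipy for the goodness-of-fit test) builds the k spacings from k-1 sorted uniforms and an independent gamma(k,\lambda) draw, then tests one spacing against the exponential distribution:

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
lam, k = 1.0, 6                 # illustrative values
n = 100_000

t = rng.gamma(shape=k, scale=1.0 / lam, size=n)   # t ~ gamma(k, lam)
u = rng.uniform(size=(n, k - 1))
y = np.sort(u, axis=1)                            # order statistics of k-1 uniforms

# Spacings of (0, Y_1, ..., Y_{k-1}, 1), scaled by t: k spacings per row
edges = np.concatenate([np.zeros((n, 1)), y, np.ones((n, 1))], axis=1)
spacings = t[:, None] * np.diff(edges, axis=1)

# Each spacing should be exponential(lam); test the first column
print(stats.kstest(spacings[:, 0], stats.expon(scale=1.0 / lam).cdf))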

Poisson Distribution

Given a convergent series of non-negative terms \sum_{i\ge 0} a_i with a positive sum, a probability mass function on the non-negative integers can always be defined:

P[X=i] = \frac{a_i}{\sum_{j \ge 0} a_j}

The simplest series to use is the series representation of the exponential function: e^\lambda= \sum_{i \ge 0} \frac{1}{i!} \lambda^{i}.

The probability mass function that results is called the Poisson distribution with parameter \lambda:

P[X = i] = e^{-\lambda} \frac{1}{i!} \lambda^i

The moment generating function of a Poisson random variable X is

\phi_X(t) = \mathbb{E} e^{tX} = e^{-\lambda} \sum_{i \ge 0} \frac{1}{i!} (\lambda e^t)^i = e^{\lambda(e^t - 1)}

An interesting property is that a finite sum of independent Poisson random variables is again Poisson, with parameter equal to the sum of the individual parameters; if X_i is Poisson with parameter \lambda_i,

\phi_{\sum_{i=1..n} X_i}(t) = \prod_{i=1..n} e^{\lambda_i(e^t - 1)} = e^{\left(\sum_{i=1..n} \lambda_i\right)(e^t - 1)}
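A simulation sketch of this closure property, comparing the empirical mass function of a sum of independent Poisson samples against the Poisson distribution with the summed rates (the rates are arbitrary choices, and scipy is assumed for the reference pmf):

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
lams = [0.5, 1.2, 2.3]          # illustrative rates
n = 100_000

total = sum(rng.poisson(l, size=n) for l in lams)

# Empirical pmf of the sum versus Poisson(sum of rates)
for i in range(6):
    print(i, np.mean(total == i), stats.poisson(sum(lams)).pmf(i))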

Relation between Poisson Distribution and Arrival Times

Let T_k be the kth arrival time of events whose inter-arrival times are exponentially distributed with rate \lambda. Given a time t \ge 0, let N(t) count the number of events that have occurred by time t; with the convention T_0 = 0, N(t) is the unique k such that T_k \le t < T_{k+1}. Since the event N(t) = n is identical to the event T_n \le t < T_{n+1}, it follows that

P[N(t) = n] = P[T_n \le t < T_{n+1}] = \int_{0 < t_1 < \ldots < t_n \le t < t_{n+1}} \lambda^{n+1} e^{-\lambda t_{n+1}} \, dt_1 \ldots dt_n \, dt_{n+1}

and so

P[N(t) = n] = \lambda^{n} e^{-\lambda t} \text{measure}(0 < t_1 < \ldots < t_n \le t) = \lambda^{n} e^{-\lambda t} \frac{t^n}{n!}

So the number of arrivals in the interval [0,t] is Poisson distributed with parameter \lambda t.
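The following sketch simulates exponential inter-arrival times, counts the arrivals by time t as a sum of indicators, and compares the empirical counts with the Poisson(\lambda t) mass function (the cap of 50 inter-arrival times per path is an arbitrary bound, chosen so that more than 50 arrivals by time t is overwhelmingly unlikely; scipy is assumed for the reference pmf):

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
lam, t = 1.0, 3.0               # illustrative rate and horizon
n, m = 100_000, 50

# Cumulative sums of inter-arrival times give the arrival times T_1, ..., T_m
arrivals = np.cumsum(rng.exponential(1.0 / lam, size=(n, m)), axis=1)
counts = np.sum(arrivals <= t, axis=1)    # N(t) as a sum of indicators I(T_i <= t)

for i in range(6):
    print(i, np.mean(counts == i), stats.poisson(lam * t).pmf(i))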

Poisson Process

The Poisson process is a counting process that counts the number of events that occur in an interval [0,t], subject to the requirement that the events have iid inter-arrival times that are exponential(\lambda) distributed. It can be written in terms of the indicator functions for individual event times

N(t) = \sum_{i \ge 1} \mathbb{I}(T_i \le t)

to be continued…
