# Law of series/Poisson process


### The definition

A **Poisson process** is a homogeneous signal process with continuous time characterized by two properties:

1. The probability of two or more signals arriving at the same time (i.e., of trajectories with jumps of more than one unit) is zero, and

2. The increments \(X_{t_1}-X_{t_0}, X_{t_2}-X_{t_1}, \dots, X_{t_k}-X_{t_{k-1}}\) are stochastically independent, for any natural \(k\) and every \(0\le t_0<t_1<\dots<t_k\).

These properties imply that for every \(t\) the distribution of \(X_t\) is the Poisson distribution with parameter \(\lambda t\) for some real parameter \(\lambda\ge 0\) and so: \[ P\{X_t = n\} = e^{-\lambda t}\frac{(\lambda t)^n}{n!} \ \ \ \ (n\ge 0). \]
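The Poisson probabilities above can be evaluated directly; the following minimal Python sketch (the parameter values \(\lambda=2\), \(t=3\) are arbitrary) checks that they sum to one and that the mean is \(\lambda t\):

```python
import math

def poisson_pmf(n, lam, t):
    """P{X_t = n} for a Poisson process of intensity lam, evaluated at time t."""
    mean = lam * t
    return math.exp(-mean) * mean**n / math.factorial(n)

lam, t = 2.0, 3.0  # arbitrary illustration values
probs = [poisson_pmf(n, lam, t) for n in range(100)]
total = sum(probs)                       # should be (numerically) 1
mean = sum(n * p for n, p in enumerate(probs))  # should be lam * t = 6
```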

The expected value of \(X_t\) is \(\lambda t\), thus \(\lambda\) is interpreted as the *average number of signals per unit of time* and called the **intensity**.

If \(\lambda= 0\), then the process is the trivial zero process, so it is natural to assume that \(\lambda> 0\). In that case the distribution of the waiting time \(V\) (the time until the first signal) is easily computed: \[ F(t) = P\{V\le t\} = P\{X_t\ge 1\} = 1 - P\{X_t = 0\} = 1 - e^{-\lambda t}. \]

The expected value of \(V\) is \(\frac 1 \lambda\).
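The value \(\frac 1\lambda\) follows from \(E[V]=\int_0^\infty(1-F(t))\,dt=\int_0^\infty e^{-\lambda t}\,dt\); a short numerical sketch (the value \(\lambda=2\) is arbitrary) confirms it:

```python
import math

lam = 2.0  # arbitrary illustration value

# Midpoint-rule approximation of E[V] = integral of exp(-lam*t) over [0, 20];
# the tail beyond t = 20 is negligible for lam = 2.
dt = 1e-4
ev = sum(math.exp(-lam * (k + 0.5) * dt) * dt for k in range(200000))
# ev should be close to 1/lam = 0.5
```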

### Interpretation

The Poisson process models signals arriving *by pure chance*, independently of each other, yet maintaining a constant intensity (expected number of signals per unit of time). It is widely used to model real-world phenomena such as incoming telephone calls or malfunctions of a device. The interpretation is best understood via approximation, as described below.

### Approximating the Poisson process

The Poisson process can be obtained as a limit process in at least two ways: via Bernoulli schemes, and via independent signals on a bounded interval.

#### Bernoulli schemes

Consider an infinite sequence of 0-1 Bernoulli trials (i.e., i.i.d. 0-1 valued random variables) performed at equal (small) time distances \(s\), where the probability of success (i.e., of 1) is \(\lambda s\). The generated signal process is homogeneous and satisfies both postulates in the definition of the Poisson process (of intensity \(\lambda\)). The only difference is that it has discrete time with increment \(s\) rather than continuous time. The probability of exactly \(n\) successes observed in time \([0,t]\) (for simplicity, take \(t\) to be a multiple of \(s\)) is, by the elementary binomial formula, \[ P\{X_t = n\} = \binom{t/s}{n} (\lambda s)^n (1-\lambda s)^{\frac ts-n} \ \ \ (n\le \tfrac ts). \] Letting \(s\) go to zero we arrive at a process with continuous time, and elementary calculus shows that the above formula converges to the formula for the Poisson distribution.
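This convergence can be observed numerically; the sketch below (with arbitrary values \(\lambda=2\), \(t=3\), \(n=4\)) compares the binomial probability to the Poisson one as the time step \(s\) shrinks:

```python
import math

lam, t, n = 2.0, 3.0, 4  # arbitrary illustration values

def binom_pmf(trials, k, p):
    """Probability of exactly k successes in `trials` Bernoulli trials."""
    return math.comb(trials, k) * p**k * (1 - p)**(trials - k)

# Target: the Poisson probability P{X_t = n} with parameter lam * t.
poisson = math.exp(-lam * t) * (lam * t)**n / math.factorial(n)

# As s shrinks, t/s trials with success probability lam*s approximate
# the Poisson distribution ever more closely.
errors = []
for s in (0.1, 0.01, 0.001):
    trials = round(t / s)
    errors.append(abs(binom_pmf(trials, n, lam * s) - poisson))
# errors should decrease toward 0
```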

#### Independent signals in a bounded interval

Now consider the interval of time \([0,T]\) where \(T\) is very large. Suppose for simplicity that \(\lambda T\) is an integer and consider \(\lambda T\) independent variables with the uniform distribution on \([0,T]\). These variables, interpreted as times of arrivals of signals, form a process satisfying both conditions postulated in the definition of the Poisson process (of intensity \(\lambda\)), but only for \(t\le T\). Now, for \(t<T\) and \(n<\lambda T\), the probability of exactly \(n\) signals in time \([0,t]\) equals \[ P\{X_t = n\} = \binom{\lambda T}{n} \left(\frac tT\right)^n \left(1-\frac tT\right)^{\lambda T-n}. \] This time we let \(T\) grow to infinity, and once again we obtain the formula for the Poisson distribution.
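As with the Bernoulli-scheme approximation, this limit can be checked numerically; the sketch below (arbitrary values \(\lambda=1\), \(t=2\), \(n=3\)) compares the binomial probability above to the Poisson one as \(T\) grows:

```python
import math

lam, t, n = 1.0, 2.0, 3  # arbitrary illustration values

# Target: the Poisson probability P{X_t = n} with parameter lam * t.
poisson = math.exp(-lam * t) * (lam * t)**n / math.factorial(n)

# With lam*T uniform arrival times on [0, T], the count in [0, t] is
# Binomial(lam*T, t/T); as T grows this tends to the Poisson value.
diffs = []
for T in (10, 100, 1000):
    trials = round(lam * T)
    p = t / T
    diffs.append(abs(math.comb(trials, n) * p**n * (1 - p)**(trials - n) - poisson))
# diffs should decrease toward 0
```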