Law of series/Stochastic process

    Basic definitions

    A stochastic process is a family of real random variables \((X_t)_{t\in T}\) defined on a probability space \((\Omega,\Sigma,P)\), where the set \(T\) is interpreted as time. Time should have at least a semigroup structure; usually it is either \(\mathbb R\) or \(\mathbb R_+\cup\{0\}\) (continuous time), or \(\mathbb Z\) or \(\mathbb N\) (discrete time). Each \(\omega\in \Omega\) determines a trajectory (or realization) of the process, i.e., the function \(t\mapsto X_t(\omega)\). A process is considered to be defined by its joint distributions, so the particular space on which the variables \( X_t \) are defined is not important; all that really matters is the distribution of the process. Thus one is free to choose the underlying space \((\Omega,\Sigma,P)\) as long as the joint distribution is left unchanged. The joint distribution of the process is determined by the finite-dimensional distributions, i.e., by the joint distributions of the tuples \((X_{t_1}, X_{t_2},\dots,X_{t_k})\) for all \(k\ge 1\) and all vectors \((t_1, t_2,\dots,t_k)\) of times.
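
    The following minimal sketch (Python with NumPy; the random-walk model and the function names are illustrative choices, not part of the article) shows these notions for a discrete-time process: each call to trajectory plays the role of drawing one \(\omega\) and returning the map \(t\mapsto X_t(\omega)\), while finite_dimensional_sample draws one observation of a tuple \((X_{t_1},\dots,X_{t_k})\), whose distribution over many draws is a finite-dimensional distribution of the process.

        # Illustrative sketch only: a simple random walk as a discrete-time process.
        import numpy as np

        rng = np.random.default_rng(0)

        def trajectory(n_steps):
            """One realization t -> X_t(omega), t = 0, ..., n_steps."""
            steps = rng.choice([-1, 1], size=n_steps)
            return np.concatenate(([0], np.cumsum(steps)))

        def finite_dimensional_sample(times):
            """Draw one observation of the tuple (X_{t_1}, ..., X_{t_k})."""
            path = trajectory(max(times))
            return path[times]

        print(trajectory(10))                        # one trajectory
        print(finite_dimensional_sample([2, 5, 9]))  # one draw of (X_2, X_5, X_9)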

    Special types of processes are defined by additional properties. Some of them are listed below:

    Stationary and homogeneous processes

    A stochastic process whose time is a semigroup is stationary if for every \(s\in T\) the process \(Y_t=X_{t+s}\) has the same finite-dimensional distributions as \(X_t\).

    A similar condition defines homogeneous processes: for every \(s\in T\) the process \(Y_t=X_{t+s}-X_s\) has the same finite-dimensional distributions as \(X_t\). For example, if \(X_t\) is stationary and has trajectories that are integrable with respect to time, then the integral process \(Z_t\) defined for each \(\omega\) and \(t\) by \(Z_t(\omega) = \int_0^t X_s(\omega)\, ds\) is homogeneous.
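
    As a hedged illustration (discrete time and i.i.d. normal variables are modelling choices of this sketch, not of the article): an i.i.d. sequence is stationary, and its partial-sum process, a discrete analogue of the integral process \(Z_t\) above, is homogeneous, since \(Z_{t+s}-Z_s\) has the same distribution as \(Z_t\). The sketch compares empirical moments of the two.

        # Illustrative sketch: stationary i.i.d. sequence X and its partial sums Z.
        import numpy as np

        rng = np.random.default_rng(1)
        n_paths, horizon, t, s = 100_000, 30, 10, 15

        X = rng.normal(size=(n_paths, horizon))          # stationary (i.i.d.) process
        Z = np.concatenate([np.zeros((n_paths, 1)), np.cumsum(X, axis=1)], axis=1)

        increment = Z[:, t + s] - Z[:, s]                # Z_{t+s} - Z_s
        print("Z_t          mean/var:", Z[:, t].mean(), Z[:, t].var())
        print("Z_{t+s}-Z_s  mean/var:", increment.mean(), increment.var())
        # Both are approximately N(0, t), consistent with homogeneity.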

    Signal processes

    A signal process is a continuous-time, integer-valued stochastic process with the following two properties: 1. \(X_0 = 0\) almost surely, 2. the trajectories \(t\mapsto X_t(\omega)\) are almost surely nondecreasing in \(t\). Clearly, the trajectories must have discontinuities (jumps from one integer to a higher one). These jumps are interpreted as signals, and then \(X_t\) counts the number of signals in the time interval \([0,t]\).
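
    A minimal sketch (the choice of uniformly scattered signal times is purely illustrative, not from the article): a signal process realized as a counting process, where \(X_t(\omega)\) counts the signals in \([0,t]\); the trajectory starts at 0 and is a nondecreasing step function jumping at the signal times.

        # Illustrative sketch: a counting (signal) process built from given signal times.
        import numpy as np

        rng = np.random.default_rng(2)

        def signal_counting_process(signal_times):
            """Return X, where X(t) = number of signal times <= t."""
            times = np.sort(signal_times)
            return lambda t: int(np.searchsorted(times, t, side="right"))

        signals = rng.uniform(0.0, 10.0, size=5)   # one omega: five signal times in [0, 10]
        X = signal_counting_process(signals)
        print(sorted(signals.round(2)))
        print([X(t) for t in range(11)])           # X(0) = 0, nondecreasing in t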

    Homogeneous signal processes form an important class; an example of such a process is the Poisson process. For a homogeneous signal process, the waiting time is the random variable defined on \(\Omega\) as the time of the first signal after time 0: \[V(\omega) = \inf\{t: X_t(\omega)\ge 1\}.\] The distribution function \(F\) of \(V\) depends only on the one-dimensional distributions of the process; namely, we have \[ F(t) = P\{V\le t\} = P\{X_t\ge 1\} = 1 - P\{X_t = 0\}. \]
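
    A hedged numerical check (the intensity \(\lambda=2\) and the sample size are choices made for this sketch): for a Poisson process the inter-arrival times are i.i.d. exponential, so the waiting time \(V\) is the first inter-arrival time, and its empirical distribution function should match \(1 - P\{X_t = 0\} = 1 - e^{-\lambda t}\).

        # Illustrative sketch: waiting-time distribution of a Poisson process.
        import numpy as np

        rng = np.random.default_rng(3)
        lam, n_paths = 2.0, 100_000

        # Inter-arrival times of a Poisson process are i.i.d. Exponential(lam),
        # so V = inf{t : X_t >= 1} is simply the first inter-arrival time.
        V = rng.exponential(scale=1.0 / lam, size=n_paths)

        for t in (0.25, 0.5, 1.0):
            empirical = (V <= t).mean()
            theoretical = 1.0 - np.exp(-lam * t)   # 1 - P{X_t = 0}
            print(f"t={t}: empirical F(t)={empirical:.4f}, 1 - P(X_t=0)={theoretical:.4f}")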

    See also

    Wikipedia: Stochastic process
