
Chapter 2 Stochastic processes

2.1 Definition

Definition 2.1
1. A stochastic process is a collection of random variables (X_t, t ∈ T) = (X_t(ω), t ∈ T, ω ∈ Ω), defined on some probability space (Ω, F, P).
2. We call X a continuous-time process if T is an interval, such as [a, b], [a, b), or [a, ∞) with a < b. Stochastic calculus deals with continuous-time processes.
3. We call X a discrete-time process if T is a finite or countably infinite set, such as T = {0, 1, 2, ...}. Such processes are also called time series.
4. A stochastic process is a function of the two variables t and ω:
(i) for a fixed time t, it is a random variable X_t = X_t(ω), ω ∈ Ω;
(ii) for a fixed ω ∈ Ω, it is a function of time X_t = X_t(ω), t ∈ T, which is called a realization, a trajectory, or a sample path of the process X.

It may be useful for the intuition to think of t as time and each ω as an individual particle, stock price, or experiment. Then X_t(ω) would represent the position (or the result) at time t of the particle (stock price, or experiment) ω. The idea is that if we run an experiment and observe the random values of X_t(ω) as time evolves, we are in fact looking at a sample path {X_t(ω) : t ≥ 0} for some fixed ω. If we rerun the experiment, we will in general observe a different sample path.

Sometimes it is convenient to write X(t, ω) instead of X_t(ω). Then we may regard the process as a function (t, ω) → X(t, ω) from T × Ω into R. This is often a natural point of view in stochastic analysis, because it is crucial to have X(t, ω) jointly measurable in (t, ω). Here are some plots of continuous-time and discrete-time processes.
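The two views in part 4 above (fixed t gives a random variable, fixed ω gives a sample path) can be illustrated with a short simulation. The symmetric random walk below is my own example, not taken from the text:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate 5 sample paths of a discrete-time process: a symmetric random
# walk, X_t = sum of i.i.d. +/-1 steps.
# Rows index omega (the path), columns index time t.
n_paths, n_steps = 5, 100
steps = rng.choice([-1, 1], size=(n_paths, n_steps))
X = np.cumsum(steps, axis=1)

# View (i): fix t -- a column is a random variable (one value per omega).
X_at_t50 = X[:, 50]          # 5 realizations of the r.v. X_50

# View (ii): fix omega -- a row is a sample path (a function of t).
path_omega0 = X[0, :]        # one trajectory of length 100
```

Rerunning with a different seed corresponds to rerunning the experiment: one observes a different sample path.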

An important concept for stochastic processes is the following.

Definition 2.2 The finite-dimensional distributions (fidis or fdds) of the stochastic process X are the distributions of the finite-dimensional vectors

(X_{t_1}, ..., X_{t_n}), t_1, ..., t_n ∈ T,

for all possible choices of times t_1, ..., t_n ∈ T and every n ≥ 1.

The family of fdds determines many (but not all) important properties of the stochastic process, which is an infinite-dimensional object. Conversely, given a family of probability measures {μ_{t_1,...,t_n}, n ≥ 1, t_i ∈ T}, it is important to be able to construct a stochastic process X_t having these measures as its fdds. This can be done if {μ_{t_1,...,t_n}} satisfies two natural consistency conditions, as given by the famous Kolmogorov extension theorem. The details are omitted here.

Another important concept is stationarity.

Definition 2.3
1. The process {X_t, t ∈ T} is strictly stationary if the fdds are invariant under shifts of the index t:

(X_{t_1}, ..., X_{t_n}) =_d (X_{t_1+h}, ..., X_{t_n+h}),

for all possible choices of times t_1, ..., t_n ∈ T, n ≥ 1, and h such that t_1 + h, ..., t_n + h ∈ T.
2. The process {X_t, t ∈ T} is weakly stationary if its mean and covariance functions are invariant under shifts of the index t:

EX_{t+h} = EX_t,   Cov(X_{s+h}, X_{t+h}) = Cov(X_s, X_t),

for all possible choices of times s, t ∈ T and h such that s + h, t + h ∈ T.

The two modes of stationarity do not imply each other. Clearly, weak stationarity does not imply strict stationarity. On the other hand, if {X_t, t ∈ T} is strictly stationary, it may not have first and/or second moments, which are required for weak stationarity. Of course, if the second moments of X_t exist, then strict stationarity does imply weak stationarity. One example in which the two modes are equivalent is the stationary Gaussian process. We leave this as an exercise.
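To make weak stationarity concrete, here is an illustrative simulation (the AR(1) model and its parameters are my own example, not from the text): a Gaussian AR(1) process started from its stationary law has mean and covariances invariant under shifts of the time index.

```python
import numpy as np

rng = np.random.default_rng(1)

# A Gaussian AR(1) process: X_{t+1} = phi * X_t + eps_t, eps_t ~ N(0, 1).
# Started from its stationary law N(0, 1/(1 - phi^2)), it is stationary,
# so Cov(X_s, X_t) should depend only on the lag t - s, not on s itself.
phi, n_paths, n_steps = 0.7, 200_000, 20
X = np.empty((n_paths, n_steps))
X[:, 0] = rng.normal(0.0, 1.0 / np.sqrt(1 - phi**2), size=n_paths)
for t in range(n_steps - 1):
    X[:, t + 1] = phi * X[:, t] + rng.normal(size=n_paths)

# Shift invariance of the covariance (lag t - s = 3, shift h = 8):
cov_0 = np.cov(X[:, 2], X[:, 5])[0, 1]     # Cov(X_2, X_5)
cov_h = np.cov(X[:, 10], X[:, 13])[0, 1]   # Cov(X_10, X_13)
```

Both estimates should be close to the theoretical value phi^3 / (1 - phi^2).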

2.2 Classes of processes

There are many different types of stochastic processes. We shall list some of them below. Suppose that {X_t, t ≥ 0} is a stochastic process on a probability space (Ω, F, P).

1. Martingales

Definition 2.4 Let X = {X(t) : t ≥ 0} be a right-continuous stochastic process with left-hand limits and {F_t : t ≥ 0} a filtration, defined on a common probability space. X is called a martingale w.r.t. {F_t : t ≥ 0} if
(i) X is adapted to {F_t : t ≥ 0}, i.e., X_t is F_t-measurable,
(ii) E|X(t)| < ∞ for all t < ∞,
(iii) E[X(t) | F_s] = X(s) for all 0 ≤ s ≤ t.

One can also define upper-martingales (super-martingales), sub-martingales, semi-martingales, local martingales, etc.

2. Gaussian processes

Definition 2.5 A vector X ∈ R^n has the multivariate normal or Gaussian distribution in n dimensions if all linear combinations a′X = Σ_{i=1}^n a_i X_i of its components are normally distributed (in one dimension). Write X ~ N(μ, Σ), where μ = EX is the mean vector and Σ = E(X − μ)(X − μ)′ is the covariance matrix. Note that Σ above is symmetric and non-negative definite, but it may not be positive definite.

Theorem 2.1 An n-dimensional vector X ~ N(μ, Σ) if and only if it has characteristic function (c.f.)

φ(t) := E e^{i t′X} = exp{ i t′μ − (1/2) t′Σt }.

Further, if Σ is positive definite (so non-singular), X has pdf

f_X(x) = (2π)^{−n/2} (det Σ)^{−1/2} exp{ −(1/2) (x − μ)′ Σ^{−1} (x − μ) },   x ∈ R^n,

with parameters μ ∈ R^n and Σ.

Definition 2.6 A stochastic process {X_t, t ∈ T} is called a Gaussian process if all its fdds are multivariate Gaussian.

A Gaussian process can be specified by (1) its mean function μ(t) = EX_t and (2) its covariance function γ(s, t) = Cov(X_s, X_t).
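For instance, one can sample a Gaussian process on a finite time grid directly from its mean and covariance functions. The sketch below assumes the particular choice μ(t) = 0 and the Brownian covariance γ(s, t) = min(s, t); these are my own choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

# Sample a Gaussian process on a grid from mu(t) = 0 and
# gamma(s, t) = min(s, t) (the Brownian-motion covariance).
ts = np.linspace(0.01, 1.0, 50)        # avoid t = 0 (degenerate variance)
mean = np.zeros_like(ts)               # mu(t) = 0
cov = np.minimum.outer(ts, ts)         # gamma(s, t) = min(s, t)

# Each draw is one sample path evaluated on the grid;
# its fdds are multivariate Gaussian by construction.
paths = rng.multivariate_normal(mean, cov, size=10_000)
var_at_end = paths[:, -1].var()        # should be about gamma(1, 1) = 1
```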

Gaussian processes have many interesting properties. Among these, we quote Belyaev's dichotomy: with probability one, the paths of a Gaussian process are either continuous, or extremely pathological: for example, unbounded above and below on any time interval, however short. Naturally, we shall confine attention in this course to continuous Gaussian processes.
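Returning to martingales, the defining property (iii) of Definition 2.4 can be checked by simulation. The symmetric random walk below is a standard example of a martingale; the check is my own illustration, not from the text.

```python
import numpy as np

rng = np.random.default_rng(2)

# The symmetric random walk S_n (partial sums of i.i.d. +/-1 steps) is a
# martingale w.r.t. its own filtration: E[S_t | F_s] = S_s for s <= t.
# Empirical check: condition on the value of S_s and average S_t.
n_paths, s, t = 500_000, 4, 10
steps = rng.choice([-1, 1], size=(n_paths, t))
S = np.cumsum(steps, axis=1)
S_s, S_t = S[:, s - 1], S[:, t - 1]

# Given S_4 = 2, the conditional mean of S_10 should be close to 2.
cond_mean = S_t[S_s == 2].mean()
```

The future increments have mean zero, so averaging S_10 over the paths with S_4 = 2 recovers the value 2, as the martingale property requires.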

3. Markov processes

A continuous-time Markov process is a stochastic process {X_t : t ≥ 0} that satisfies the Markov property: for any times t > s ≥ 0, the conditional probability distribution of the process at time t given the whole history of the process up to and including time s, F_s = σ(X_r, 0 ≤ r ≤ s), depends only on the state of the process at time s:

P(X_t ≤ x | F_s) = P(X_t ≤ x | X_s).

In effect, the state of the process at time t is conditionally independent of the history of the process before time s, given the state of the process at time s. The process X is said to be strong Markov if the above holds with the fixed time s replaced by a stopping time T (a random variable). This is a real restriction of the Markov property in the continuous-time case (though not in discrete time).

Clearly, the strong Markov property implies the Markov property, but not vice versa. Here are two examples where X is Markov but not strong Markov.

(i) Perhaps the simplest example is

X_t = 0 for t ≤ T,   X_t = t − T for t > T,

where T is an exponentially distributed r.v. Then X is Markov (from the lack-of-memory property of the exponential distribution), but not strong Markov (the Markov property fails at the stopping time T). One must expect the Markov property to fail in cases, as here, where all the action is at random times.

(ii) The second example is a left-continuous Poisson process, obtained by taking a Poisson process and modifying its paths to be left-continuous rather than right-continuous.
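A quick simulation of example (i) shows its behavior (the time grid and rate are my own choices): each path stays at zero until the random time T and then increases linearly, so the fraction of paths still at zero at time 1 should match P(T > 1) = e^{-1} for a rate-1 exponential T.

```python
import numpy as np

rng = np.random.default_rng(4)

# Example (i): X_t = 0 for t <= T and X_t = t - T for t > T,
# with T ~ Exponential(1).  Simulate many paths on a time grid.
n_paths = 50_000
T = rng.exponential(scale=1.0, size=n_paths)   # one random time per path
ts = np.linspace(0.0, 5.0, 101)

# X[i, j] = value of path i at time ts[j]: zero until T[i], then t - T[i].
X = np.maximum(ts[None, :] - T[:, None], 0.0)

# Fraction of paths still at zero at time 1 -- about P(T > 1) = e^{-1}.
idx1 = np.argmin(np.abs(ts - 1.0))
frac_zero_at_1 = np.mean(X[:, idx1] == 0.0)
```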

4. Diffusions

A diffusion is a path-continuous strong Markov process such that for each time t and state x the following limits exist:

μ(t, x) := lim_{h→0+} (1/h) E[(X_{t+h} − X_t) | X_t = x],

σ²(t, x) := lim_{h→0+} (1/h) E[(X_{t+h} − X_t)² | X_t = x].

Then μ(t, x) is called the drift and σ²(t, x) the diffusion coefficient. The term diffusion derives from physical situations involving Brownian motion. The mathematics of heat diffusing through a conducting medium (which goes back to Fourier in the early 19th century) is intimately linked with Brownian motion (the mathematics of which is 20th century). The theory of diffusions can be split according to dimension.
(a) For one-dimensional diffusions, there are a number of ways of treating the theory; see Breiman, L. (Probability, 1968) or Doob (Stochastic Processes, 1953).
(b) For higher-dimensional diffusions, there is basically one way: via the SDE methodology (or its reformulation in terms of a martingale problem). This approach is preferred since it generalizes. It also shows that Markov processes and martingales are intimately linked technically.
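The two limits above suggest a numerical estimator: over a small step h, the mean increment divided by h approximates the drift, and the mean squared increment divided by h approximates the diffusion coefficient. A sketch for Brownian motion with drift (the values μ = 1.5, σ = 0.8 are assumed for illustration):

```python
import numpy as np

rng = np.random.default_rng(5)

# For Brownian motion with drift, X_{t+h} - X_t = mu*h + sigma*(W_{t+h} - W_t),
# so increments are N(mu*h, sigma^2 * h) regardless of the current state.
mu, sigma, h, n = 1.5, 0.8, 1e-3, 1_000_000
dX = mu * h + sigma * np.sqrt(h) * rng.normal(size=n)

drift_hat = dX.mean() / h        # estimates mu(t, x)      = 1.5
diff_hat = (dX**2).mean() / h    # estimates sigma^2(t, x) = 0.64
```

Note that without the factor 1/h both averages would vanish as h → 0; the normalization is what isolates the drift and diffusion coefficient.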

5. Lévy processes

Many important processes are characterized by properties of their increments.

Definition 2.7 Let {X_t, t ∈ T} be a stochastic process, where T ⊆ R is an interval. X_t is said to have stationary increments if

X_t − X_s =_d X_{t+h} − X_{s+h},

for all t, s ∈ T and h with t + h, s + h ∈ T.

X_t is said to have independent increments if for all t_i ∈ T with t_1 < t_2 < ... < t_n and n ≥ 1,

X_{t_2} − X_{t_1}, X_{t_3} − X_{t_2}, ..., X_{t_n} − X_{t_{n−1}}

are independent random variables.

Independence and stationarity of increments are clearly two distinct concepts. Processes whose increments satisfy both requirements form an important class of processes in stochastic calculus.
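Both properties can be checked empirically for simulated Brownian motion (the grid and path count are my own choices): disjoint increments of equal length should be uncorrelated, and each should have variance equal to the length of its time interval.

```python
import numpy as np

rng = np.random.default_rng(6)

# Brownian motion has stationary, independent increments.  Simulate paths
# on a grid of step dt; W[:, k] holds W at time (k + 1) * dt.
n_paths, dt = 50_000, 0.01
dW = np.sqrt(dt) * rng.normal(size=(n_paths, 100))
W = np.cumsum(dW, axis=1)

inc1 = W[:, 29]              # W_{0.3} - W_0   (length-0.3 increment)
inc2 = W[:, 89] - W[:, 59]   # W_{0.9} - W_{0.6} (disjoint, same length)

corr = np.corrcoef(inc1, inc2)[0, 1]   # ~ 0 by independence
```

Stationarity shows up as both increments having the same distribution N(0, 0.3), even though they sit at different places on the time axis.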

Definition 2.8 (Lévy processes) We say that a stochastic process {X_t, t ≥ 0} is a Lévy process if
(a) it starts at zero: X_0 = 0 a.s.;
(b) it has stationary and independent increments;
(c) X is stochastically continuous, i.e., for all a > 0 and for all s ≥ 0,

lim_{t→s} P(|X_t − X_s| > a) = 0,

or equivalently X_t →_p X_s as t → s.

The third condition does not imply in any way that the sample paths are continuous: e.g., the Poisson process is stochastically continuous but not path-continuous. It serves to exclude processes with jumps at fixed (non-random) times, which can be regarded as calendar effects and are not interesting for our purposes. It means that for a given time t, the probability of seeing a jump at t is zero: discontinuities occur at random times.

The two most important examples of Lévy processes are: (1) Brownian motion: X_t ~ N(0, t); (2) the Poisson process: X_t ~ Poisson(λt).
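As a sketch (the rate λ = 2 and horizon t = 3 are assumed values), the Poisson process can be simulated from i.i.d. exponential inter-arrival times; the count at time t is then Poisson distributed, with mean and variance both equal to λt.

```python
import numpy as np

rng = np.random.default_rng(7)

# The Poisson process N_t with rate lam counts arrivals whose inter-arrival
# times are i.i.d. Exponential(lam); then N_t ~ Poisson(lam * t).
lam, t, n_paths = 2.0, 3.0, 100_000
gaps = rng.exponential(scale=1.0 / lam, size=(n_paths, 50))
arrival_times = np.cumsum(gaps, axis=1)
N_t = (arrival_times <= t).sum(axis=1)   # number of arrivals by time t

mean_hat = N_t.mean()                    # ~ lam * t = 6.0
var_hat = N_t.var()                      # ~ lam * t = 6.0 (Poisson)
```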

6. Self-similar processes

Definition 2.9 A stochastic process {X_t, t ≥ 0} is H-self-similar for some H > 0 if its fdds satisfy

(X_{ct_1}, ..., X_{ct_n}) =_d (c^H X_{t_1}, ..., c^H X_{t_n}),   (2.1)

for every c > 0 and any choice of t_i ≥ 0, i = 1, ..., n, and n ≥ 1. For simplicity, we sometimes write (2.1) as

{X_{ct}, t ≥ 0} =_d {c^H X_t, t ≥ 0}.

We should note that self-similarity is a distributional, not a pathwise, property. One cannot replace =_d by =.
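Self-similarity can be illustrated with Brownian motion, which is self-similar with H = 1/2, so X_{ct} =_d c^{1/2} X_t. The numerical sketch below is my own (c = 4 is an arbitrary scaling factor).

```python
import numpy as np

rng = np.random.default_rng(8)

# Brownian motion: X_t ~ N(0, t), so X_{ct} ~ N(0, ct) = c^{1/2} X_t in
# distribution (H = 1/2).  Compare sample variances at times t and ct.
c, t, n = 4.0, 1.0, 200_000
X_t = np.sqrt(t) * rng.normal(size=n)        # X_t  ~ N(0, t)
X_ct = np.sqrt(c * t) * rng.normal(size=n)   # X_ct ~ N(0, ct)

ratio = X_ct.var() / X_t.var()               # ~ c^{2H} = c = 4.0
```

The variances match the c^{2H} scaling, but individual paths of X_{ct} and c^{1/2} X_t differ: the identity is in distribution only.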
