
Covariance Stationary Time Series

Stochastic process: a sequence of random variables ordered by time, $\{Y_t\} = \{\dots, Y_{-1}, Y_0, Y_1, \dots\}$.

Definition: $\{Y_t\}$ is covariance stationary if
$$E[Y_t] = \mu \ \text{for all } t, \qquad \mathrm{cov}(Y_t, Y_{t-j}) = E[(Y_t-\mu)(Y_{t-j}-\mu)] = \gamma_j \ \text{for all } t \text{ and any } j$$

Example: Independent White Noise ($IWN(0,\sigma^2)$): $Y_t = \varepsilon_t$, $\varepsilon_t \sim \text{iid}\,(0,\sigma^2)$, so that $E[Y_t]=0$, $\mathrm{var}(Y_t)=\sigma^2$, $\gamma_j = 0$ for $j \neq 0$.

Example: Gaussian White Noise ($GWN(0,\sigma^2)$): $Y_t = \varepsilon_t$, $\varepsilon_t \sim \text{iid}\ N(0,\sigma^2)$.

Example: White Noise ($WN(0,\sigma^2)$): $Y_t = \varepsilon_t$ with $E[\varepsilon_t]=0$, $\mathrm{var}(\varepsilon_t)=\sigma^2$, $\mathrm{cov}(\varepsilon_t,\varepsilon_{t-j})=0$.

Remarks

$\gamma_j$ = $j$th lag autocovariance; $\gamma_0 = \mathrm{var}(Y_t)$
$\rho_j = \gamma_j/\gamma_0$ = $j$th lag autocorrelation

Nonstationary Processes

Example: Deterministically trending process
$$Y_t = \beta_0 + \beta_1 t + \varepsilon_t, \quad \varepsilon_t \sim WN(0,\sigma^2)$$
$$E[Y_t] = \beta_0 + \beta_1 t \ \text{depends on } t$$
Note: a simple detrending transformation yields a stationary process:
$$X_t = Y_t - \beta_0 - \beta_1 t = \varepsilon_t$$

Example: Random walk
$$Y_t = Y_{t-1} + \varepsilon_t, \quad \varepsilon_t \sim WN(0,\sigma^2), \quad Y_0 \text{ is fixed}$$
$$Y_t = Y_0 + \sum_{j=1}^{t}\varepsilon_j \implies \mathrm{var}(Y_t) = \sigma^2 t \ \text{depends on } t$$
Note: a simple differencing transformation yields a stationary process:
$$\Delta Y_t = Y_t - Y_{t-1} = \varepsilon_t$$

Wold's Decomposition Theorem

Any covariance stationary time series $\{Y_t\}$ can be represented in the form
$$Y_t = \mu + \sum_{j=0}^{\infty}\psi_j\varepsilon_{t-j}, \quad \varepsilon_t \sim WN(0,\sigma^2), \quad \psi_0 = 1, \quad \sum_{j=0}^{\infty}\psi_j^2 < \infty$$

Properties:
$$E[Y_t] = \mu, \qquad \gamma_0 = \mathrm{var}(Y_t) = \sigma^2\sum_{j=0}^{\infty}\psi_j^2$$
$$\gamma_j = E[(Y_t-\mu)(Y_{t-j}-\mu)] = E[(\varepsilon_t + \psi_1\varepsilon_{t-1} + \dots + \psi_j\varepsilon_{t-j} + \dots)(\varepsilon_{t-j} + \psi_1\varepsilon_{t-j-1} + \dots)] = \sigma^2(\psi_j + \psi_{j+1}\psi_1 + \dots) = \sigma^2\sum_{k=0}^{\infty}\psi_k\psi_{k+j}$$
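The autocovariance formula can be checked numerically by truncating the Wold sum. A minimal numpy sketch, using the illustrative (not from the text) choice $\psi_j = 0.5^j$, for which $\gamma_0$ should be close to $1/(1-0.25)$:

```python
import numpy as np

sigma2 = 1.0                      # assumed innovation variance
psi = 0.5 ** np.arange(200)       # truncated Wold weights; psi_j = 0.5^j is an arbitrary illustration

def gamma(j):
    """Autocovariance gamma_j = sigma^2 * sum_k psi_k * psi_{k+j} (truncated at 200 terms)."""
    return sigma2 * np.sum(psi[:len(psi) - j] * psi[j:])

print(gamma(0))   # ~ 1.333 = 1 / (1 - 0.25)
print(gamma(1))   # ~ 0.667 = 0.5 * gamma(0)
```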

Autoregressive Moving Average (ARMA) Models (Box-Jenkins 1976)

Idea: approximate the Wold form of a stationary time series by parsimonious parametric models (stochastic difference equations).

ARMA(p,q) model:
$$Y_t - \mu = \phi_1(Y_{t-1}-\mu) + \dots + \phi_p(Y_{t-p}-\mu) + \varepsilon_t + \theta_1\varepsilon_{t-1} + \dots + \theta_q\varepsilon_{t-q}, \quad \varepsilon_t \sim WN(0,\sigma^2)$$
Lag operator notation:
$$\phi(L)(Y_t - \mu) = \theta(L)\varepsilon_t, \qquad \phi(L) = 1 - \phi_1 L - \dots - \phi_p L^p, \qquad \theta(L) = 1 + \theta_1 L + \dots + \theta_q L^q$$
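For concreteness, the sketch below simulates one low-order ARMA process with statsmodels; the orders and coefficient values ($\phi_1 = 0.6$, $\theta_1 = 0.3$) are assumptions chosen only for illustration, and the `ArmaProcess` usage assumes a reasonably recent statsmodels version.

```python
import numpy as np
from statsmodels.tsa.arima_process import ArmaProcess

# ARMA(1,1): (1 - 0.6 L)(Y_t - mu) = (1 + 0.3 L) eps_t, with mu = 0 and sigma = 1
ar = np.array([1.0, -0.6])    # coefficients of phi(L) = 1 - 0.6 L
ma = np.array([1.0, 0.3])     # coefficients of theta(L) = 1 + 0.3 L

proc = ArmaProcess(ar, ma)
print(proc.isstationary, proc.isinvertible)   # True True
y = proc.generate_sample(nsample=500)         # one simulated sample path
```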

Stochastic difference equation form: the ARMA model can be written as
$$\phi(L)X_t = w_t, \qquad X_t = Y_t - \mu, \qquad w_t = \theta(L)\varepsilon_t$$

ARMA(1,0) Model (1st order SDE)
$$Y_t = \phi Y_{t-1} + \varepsilon_t, \quad \varepsilon_t \sim WN(0,\sigma^2)$$
Solution by recursive substitution:
$$Y_t = \phi^{t+1}Y_{-1} + \phi^t\varepsilon_0 + \dots + \phi\varepsilon_{t-1} + \varepsilon_t = \phi^{t+1}Y_{-1} + \sum_{i=0}^{t}\phi^i\varepsilon_{t-i} = \phi^{t+1}Y_{-1} + \sum_{i=0}^{t}\psi_i\varepsilon_{t-i}, \quad \psi_i = \phi^i$$
Alternatively, solving forward $j$ periods from time $t$:
$$Y_{t+j} = \phi^{j+1}Y_{t-1} + \phi^j\varepsilon_t + \dots + \phi\varepsilon_{t+j-1} + \varepsilon_{t+j} = \phi^{j+1}Y_{t-1} + \sum_{i=0}^{j}\phi^i\varepsilon_{t+j-i}$$
Dynamic multiplier:
$$\frac{\partial Y_j}{\partial\varepsilon_0} = \frac{\partial Y_{t+j}}{\partial\varepsilon_t} = \phi^j = \psi_j$$

Impulse Response Function (IRF): plot $\psi_j$ vs. $j$.

Cumulative impact (up to horizon $j$): $\sum_{i=1}^{j}\psi_i$

Long-run cumulative impact: $\sum_{j=0}^{\infty}\psi_j = \psi(1) = \psi(L)$ evaluated at $L = 1$.

Stability and Stationarity Conditions

If $|\phi| < 1$ then
$$\lim_{j\to\infty}\psi_j = \lim_{j\to\infty}\phi^j = 0$$
and the stationary solution (Wold form) for the AR(1) becomes
$$Y_t = \sum_{j=0}^{\infty}\psi_j\varepsilon_{t-j} = \sum_{j=0}^{\infty}\phi^j\varepsilon_{t-j}$$
This is a stable (non-explosive) solution. Note that
$$\psi(1) = \sum_{j=0}^{\infty}\phi^j = \frac{1}{1-\phi} < \infty$$

If $\phi = 1$ then
$$Y_t = Y_0 + \sum_{j=1}^{t}\varepsilon_j, \qquad \psi_j = 1, \qquad \psi(1) = \infty$$
which is not stationary or stable.
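A short numpy check of these statements for an assumed value $\phi = 0.8$: the IRF $\psi_j = \phi^j$ dies out and its cumulative sum approaches $\psi(1) = 1/(1-\phi) = 5$.

```python
import numpy as np

phi = 0.8                     # assumed AR(1) coefficient with |phi| < 1
j = np.arange(50)
psi = phi ** j                # IRF: psi_j = phi^j

print(psi[-1])                # ~ 1.4e-05, consistent with lim psi_j = 0
print(psi.cumsum()[-1])       # ~ 5.0, approaching 1 / (1 - phi)
```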

AR(1) in Lag Operator Notation
$$(1 - \phi L)Y_t = \varepsilon_t$$
If $|\phi| < 1$ then
$$(1 - \phi L)^{-1} = \sum_{j=0}^{\infty}\phi^j L^j = 1 + \phi L + \phi^2 L^2 + \dots$$
such that $(1 - \phi L)^{-1}(1 - \phi L) = 1$.
Trick to find the Wold form:
$$Y_t = (1 - \phi L)^{-1}(1 - \phi L)Y_t = (1 - \phi L)^{-1}\varepsilon_t = \sum_{j=0}^{\infty}\phi^j L^j\varepsilon_t = \sum_{j=0}^{\infty}\phi^j\varepsilon_{t-j} = \sum_{j=0}^{\infty}\psi_j\varepsilon_{t-j}, \quad \psi_j = \phi^j$$

Moments of Stationary AR(1)

Mean-adjusted form: $Y_t - \mu = \phi(Y_{t-1}-\mu) + \varepsilon_t$, $\varepsilon_t \sim WN(0,\sigma^2)$, $|\phi| < 1$
Regression form: $Y_t = c + \phi Y_{t-1} + \varepsilon_t$, $c = \mu(1-\phi)$
Trick for calculating moments: use the stationarity properties
$$E[Y_t] = E[Y_{t-j}] \ \text{for all } j, \qquad \mathrm{cov}(Y_t, Y_{t-j}) = \mathrm{cov}(Y_{t-k}, Y_{t-k-j}) \ \text{for all } k, j$$
Mean of AR(1):
$$E[Y_t] = c + \phi E[Y_{t-1}] + E[\varepsilon_t] = c + \phi E[Y_t] \implies E[Y_t] = \frac{c}{1-\phi} = \mu$$

Variance of AR(1):
$$\gamma_0 = \mathrm{var}(Y_t) = E[(Y_t-\mu)^2] = E[(\phi(Y_{t-1}-\mu) + \varepsilon_t)^2] = \phi^2 E[(Y_{t-1}-\mu)^2] + 2\phi E[(Y_{t-1}-\mu)\varepsilon_t] + E[\varepsilon_t^2] = \phi^2\gamma_0 + \sigma^2 \ \text{(by stationarity)}$$
$$\implies \gamma_0 = \frac{\sigma^2}{1-\phi^2}$$
Note: the same result follows from the Wold representation:
$$\gamma_0 = \mathrm{var}\left(\sum_{j=0}^{\infty}\phi^j\varepsilon_{t-j}\right) = \sigma^2\sum_{j=0}^{\infty}\phi^{2j} = \frac{\sigma^2}{1-\phi^2}$$

Autocovariances and Autocorrelations

Trick: multiply $(Y_t-\mu)$ by $(Y_{t-j}-\mu)$ and take expectations:
$$\gamma_j = E[(Y_t-\mu)(Y_{t-j}-\mu)] = \phi E[(Y_{t-1}-\mu)(Y_{t-j}-\mu)] + E[\varepsilon_t(Y_{t-j}-\mu)] = \phi\gamma_{j-1} \ \text{(by stationarity)}$$
$$\implies \gamma_j = \phi^j\gamma_0 = \phi^j\frac{\sigma^2}{1-\phi^2}$$
Autocorrelations:
$$\rho_j = \frac{\gamma_j}{\gamma_0} = \frac{\phi^j\gamma_0}{\gamma_0} = \phi^j$$
Note: for the AR(1), $\rho_j = \psi_j$. However, this is not true for general ARMA processes.
Autocorrelation Function (ACF): plot $\rho_j$ vs. $j$.
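These moment formulas are easy to check by simulation; the parameter values below ($\mu = 2$, $\phi = 0.7$, $\sigma = 1$) are assumptions made only for this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)
mu, phi, sigma = 2.0, 0.7, 1.0        # assumed AR(1) parameters
T = 100_000

y = np.empty(T)
y[0] = mu
for t in range(1, T):
    y[t] = mu + phi * (y[t-1] - mu) + sigma * rng.standard_normal()

print(y.mean())   # ~ mu = 2
print(y.var())    # ~ sigma^2 / (1 - phi^2) = 1.96

# sample first-order autocorrelation vs. rho_1 = phi = 0.7
x = y - y.mean()
print((x[:-1] * x[1:]).sum() / (x ** 2).sum())
```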

Half-Life of AR(1): the lag at which the IRF decreases by one half:
$$\psi_j = \phi^j = 0.5 \implies j\ln\phi = \ln(0.5) \implies j = \frac{\ln(0.5)}{\ln\phi}$$
The half-life is a measure of the speed of mean reversion.

Table 1: Half-lives for AR(1)
  phi:        0.99    0.90    0.75    0.50    0.25
  half-life:  68.97   6.58    2.41    1.00    0.50

Application: Half-Life of Real Exchange Rates

The real exchange rate is defined as
$$z_t = s_t - p_t + p_t^*$$
where
$s_t$ = log nominal exchange rate
$p_t$ = log of domestic price level
$p_t^*$ = log of foreign price level
Purchasing power parity (PPP) suggests that $z_t$ should be stationary.
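The half-life formula $j = \ln(0.5)/\ln\phi$ is straightforward to tabulate; this short sketch reproduces Table 1.

```python
import numpy as np

for phi in [0.99, 0.9, 0.75, 0.5, 0.25]:
    half_life = np.log(0.5) / np.log(phi)   # lag at which phi^j = 0.5
    print(f"phi = {phi:4}: half-life = {half_life:.2f}")
```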

ARMA(p, 0) Model

Mean-adjusted form:
$$Y_t - \mu = \phi_1(Y_{t-1}-\mu) + \dots + \phi_p(Y_{t-p}-\mu) + \varepsilon_t, \quad \varepsilon_t \sim WN(0,\sigma^2), \quad E[Y_t] = \mu$$
Lag operator notation:
$$\phi(L)(Y_t - \mu) = \varepsilon_t, \qquad \phi(L) = 1 - \phi_1 L - \dots - \phi_p L^p$$
Unobserved components representation:
$$Y_t = \mu + X_t, \qquad \phi(L)X_t = \varepsilon_t$$
Regression model formulation:
$$Y_t = c + \phi_1 Y_{t-1} + \dots + \phi_p Y_{t-p} + \varepsilon_t$$
or
$$\phi(L)Y_t = c + \varepsilon_t, \qquad c = \mu\,\phi(1)$$

Stability and Stationarity Conditions

Trick: write the $p$th order SDE as a 1st order vector SDE:
$$\begin{bmatrix} X_t \\ X_{t-1} \\ X_{t-2} \\ \vdots \\ X_{t-p+1} \end{bmatrix} = \begin{bmatrix} \phi_1 & \phi_2 & \cdots & \cdots & \phi_p \\ 1 & 0 & \cdots & \cdots & 0 \\ 0 & 1 & \cdots & \cdots & 0 \\ \vdots & & \ddots & & \vdots \\ 0 & 0 & \cdots & 1 & 0 \end{bmatrix}\begin{bmatrix} X_{t-1} \\ X_{t-2} \\ X_{t-3} \\ \vdots \\ X_{t-p} \end{bmatrix} + \begin{bmatrix} \varepsilon_t \\ 0 \\ 0 \\ \vdots \\ 0 \end{bmatrix}$$
or
$$\underset{(p\times 1)}{\xi_t} = \underset{(p\times p)}{\mathbf{F}}\ \underset{(p\times 1)}{\xi_{t-1}} + \underset{(p\times 1)}{v_t}$$
Use insights from the AR(1) to study the behavior of the VAR(1):
$$\xi_{t+j} = \mathbf{F}^{j+1}\xi_{t-1} + \mathbf{F}^j v_t + \dots + \mathbf{F}v_{t+j-1} + v_{t+j}, \qquad \mathbf{F}^j = \mathbf{F}\cdot\mathbf{F}\cdots\mathbf{F} \ (j \text{ times})$$
Intuition: stability and stationarity require
$$\lim_{j\to\infty}\mathbf{F}^j = 0$$
so that the initial value has no impact on the eventual level of the series.
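A minimal sketch of this stationarity check via the companion matrix, using the AR(2) coefficients $\phi_1 = 0.6$, $\phi_2 = 0.2$ from the example later in these notes:

```python
import numpy as np

def companion(phis):
    """Companion matrix F for an AR(p) with coefficients phis = [phi_1, ..., phi_p]."""
    p = len(phis)
    F = np.zeros((p, p))
    F[0, :] = phis
    F[1:, :-1] = np.eye(p - 1)
    return F

F = companion([0.6, 0.2])
eigvals = np.linalg.eigvals(F)
print(eigvals)                          # approx [0.84, -0.24]
print(np.all(np.abs(eigvals) < 1))      # True -> stationary, so F^j -> 0
```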

Example: AR(2)
$$X_t = \phi_1 X_{t-1} + \phi_2 X_{t-2} + \varepsilon_t$$
or
$$\begin{bmatrix} X_t \\ X_{t-1} \end{bmatrix} = \begin{bmatrix} \phi_1 & \phi_2 \\ 1 & 0 \end{bmatrix}\begin{bmatrix} X_{t-1} \\ X_{t-2} \end{bmatrix} + \begin{bmatrix} \varepsilon_t \\ 0 \end{bmatrix}$$
Iterating $j$ periods out gives
$$\begin{bmatrix} X_{t+j} \\ X_{t+j-1} \end{bmatrix} = \begin{bmatrix} \phi_1 & \phi_2 \\ 1 & 0 \end{bmatrix}^{j+1}\begin{bmatrix} X_{t-1} \\ X_{t-2} \end{bmatrix} + \begin{bmatrix} \phi_1 & \phi_2 \\ 1 & 0 \end{bmatrix}^{j}\begin{bmatrix} \varepsilon_t \\ 0 \end{bmatrix} + \dots + \begin{bmatrix} \phi_1 & \phi_2 \\ 1 & 0 \end{bmatrix}\begin{bmatrix} \varepsilon_{t+j-1} \\ 0 \end{bmatrix} + \begin{bmatrix} \varepsilon_{t+j} \\ 0 \end{bmatrix}$$
The first row gives
$$X_{t+j} = \left[f_{11}^{(j+1)}X_{t-1} + f_{12}^{(j+1)}X_{t-2}\right] + f_{11}^{(j)}\varepsilon_t + \dots + f_{11}^{(1)}\varepsilon_{t+j-1} + \varepsilon_{t+j}$$
where $f_{11}^{(j)}$ = $(1,1)$ element of $\mathbf{F}^j$.

Result: The ARMA(p, 0) model is covariance stationary and has Wold representation
$$Y_t = \mu + \sum_{j=0}^{\infty}\psi_j\varepsilon_{t-j}, \qquad \psi_0 = 1$$
with $\psi_j = (1,1)$ element of $\mathbf{F}^j$, provided all of the eigenvalues of $\mathbf{F}$ have modulus less than 1.

Note: for the AR(2), $\mathbf{F}^j = \begin{bmatrix}\phi_1 & \phi_2 \\ 1 & 0\end{bmatrix}^j$ and
$$\mathbf{F}^2 = \begin{bmatrix}\phi_1 & \phi_2 \\ 1 & 0\end{bmatrix}\begin{bmatrix}\phi_1 & \phi_2 \\ 1 & 0\end{bmatrix} = \begin{bmatrix}\phi_1^2 + \phi_2 & \phi_1\phi_2 \\ \phi_1 & \phi_2\end{bmatrix}$$

Finding Eigenvalues

$\lambda$ is an eigenvalue of $\mathbf{F}$ and $\mathbf{x}$ is an eigenvector if
$$\mathbf{F}\mathbf{x} = \lambda\mathbf{x} \iff (\mathbf{F} - \lambda I_p)\mathbf{x} = 0 \iff \mathbf{F} - \lambda I_p \ \text{is singular} \iff \det(\mathbf{F} - \lambda I_p) = 0$$
Example: AR(2)
$$\det(\mathbf{F} - \lambda I_2) = \det\left(\begin{bmatrix}\phi_1 & \phi_2 \\ 1 & 0\end{bmatrix} - \begin{bmatrix}\lambda & 0 \\ 0 & \lambda\end{bmatrix}\right) = \det\begin{bmatrix}\phi_1 - \lambda & \phi_2 \\ 1 & -\lambda\end{bmatrix} = \lambda^2 - \phi_1\lambda - \phi_2$$
The eigenvalues of $\mathbf{F}$ solve the reverse characteristic equation
$$\lambda^2 - \phi_1\lambda - \phi_2 = 0$$
Using the quadratic formula, the roots satisfy
$$\lambda_i = \frac{\phi_1 \pm \sqrt{\phi_1^2 + 4\phi_2}}{2}, \quad i = 1, 2$$
These roots may be real or complex. Complex roots induce periodic behavior in $Y_t$. Recall, if $\lambda_i$ is complex then
$$\lambda_i = a + bi, \qquad a = R\cos(\theta), \qquad b = R\sin(\theta), \qquad R = \sqrt{a^2 + b^2} = \text{modulus}$$

To see why $|\lambda_i| < 1$ implies $\lim_{j\to\infty}\mathbf{F}^j = 0$, consider the AR(2) with real-valued eigenvalues. By the spectral decomposition theorem
$$\mathbf{F} = \mathbf{T}\Lambda\mathbf{T}^{-1}, \qquad \Lambda = \begin{bmatrix}\lambda_1 & 0 \\ 0 & \lambda_2\end{bmatrix}, \qquad \mathbf{T} = \begin{bmatrix}t_{11} & t_{12} \\ t_{21} & t_{22}\end{bmatrix}, \qquad \mathbf{T}^{-1} = \begin{bmatrix}t^{11} & t^{12} \\ t^{21} & t^{22}\end{bmatrix}$$
Then
$$\mathbf{F}^j = (\mathbf{T}\Lambda\mathbf{T}^{-1})\cdots(\mathbf{T}\Lambda\mathbf{T}^{-1}) = \mathbf{T}\Lambda^j\mathbf{T}^{-1}$$
and
$$\lim_{j\to\infty}\mathbf{F}^j = \mathbf{T}\left(\lim_{j\to\infty}\Lambda^j\right)\mathbf{T}^{-1} = 0$$
provided $|\lambda_1| < 1$ and $|\lambda_2| < 1$.

Note:
$$\mathbf{F}^j = \mathbf{T}\Lambda^j\mathbf{T}^{-1} = \begin{bmatrix}t_{11} & t_{12} \\ t_{21} & t_{22}\end{bmatrix}\begin{bmatrix}\lambda_1^j & 0 \\ 0 & \lambda_2^j\end{bmatrix}\begin{bmatrix}t^{11} & t^{12} \\ t^{21} & t^{22}\end{bmatrix}$$
so that
$$f_{11}^{(j)} = (t_{11}t^{11})\lambda_1^j + (t_{12}t^{21})\lambda_2^j = c_1\lambda_1^j + c_2\lambda_2^j = \psi_j, \qquad \text{where } c_1 + c_2 = 1$$
Then
$$\lim_{j\to\infty}\psi_j = \lim_{j\to\infty}(c_1\lambda_1^j + c_2\lambda_2^j) = 0$$

Examples of AR(2) Processes

Example 1: real eigenvalues
$$Y_t = 0.6Y_{t-1} + 0.2Y_{t-2} + \varepsilon_t$$
$$\phi_1 + \phi_2 = 0.8 < 1, \qquad \mathbf{F} = \begin{bmatrix}0.6 & 0.2 \\ 1 & 0\end{bmatrix}$$
The eigenvalues are found using
$$\lambda_i = \frac{\phi_1 \pm \sqrt{\phi_1^2 + 4\phi_2}}{2}$$
$$\lambda_1 = \frac{0.6 + \sqrt{(0.6)^2 + 4(0.2)}}{2} = 0.84, \qquad \lambda_2 = \frac{0.6 - \sqrt{(0.6)^2 + 4(0.2)}}{2} = -0.24$$
$$\psi_j = c_1(0.84)^j + c_2(-0.24)^j$$

Example 2: complex eigenvalues
$$Y_t = 0.5Y_{t-1} - 0.8Y_{t-2} + \varepsilon_t$$
$$\phi_1 + \phi_2 = -0.3 < 1, \qquad \mathbf{F} = \begin{bmatrix}0.5 & -0.8 \\ 1 & 0\end{bmatrix}$$
Note:
$$\phi_1^2 + 4\phi_2 = (0.5)^2 - 4(0.8) = -2.95 \implies \text{complex eigenvalues}$$
Then $\lambda_i = a \pm bi$, $i = \sqrt{-1}$, with
$$a = \frac{\phi_1}{2} = \frac{0.5}{2} = 0.25, \qquad b = \frac{\sqrt{-(\phi_1^2 + 4\phi_2)}}{2} = \frac{\sqrt{2.95}}{2} = 0.86$$
$$\lambda_i = 0.25 \pm 0.86i$$
$$\text{modulus} = R = \sqrt{a^2 + b^2} = \sqrt{(0.25)^2 + (0.86)^2} = 0.895$$
Polar co-ordinate representation:
$$\lambda_i = a + bi \ \text{ s.t. } \ a = R\cos(\theta),\ b = R\sin(\theta)$$
$$\lambda_i = R\cos(\theta) + R\sin(\theta)i = R\,e^{i\theta}$$
The frequency $\theta$ satisfies
$$\cos(\theta) = \frac{a}{R} \implies \theta = \cos^{-1}\left(\frac{a}{R}\right), \qquad \text{period} = \frac{2\pi}{\theta}$$
Here, $R = 0.895$,
$$\theta = \cos^{-1}\left(\frac{0.25}{0.895}\right) = 1.29, \qquad \text{period} = \frac{2\pi}{1.29} = 4.9$$
Note: the period is the length of time required for the process to repeat a full cycle.
Note: the IRF has the form
$$\psi_j = c_1\lambda_1^j + c_2\lambda_2^j = R^j\left[\cos(\theta j) + \sin(\theta j)\right]$$
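The complex-root calculations in this example can be verified directly; a minimal numpy sketch:

```python
import numpy as np

phi1, phi2 = 0.5, -0.8
F = np.array([[phi1, phi2], [1.0, 0.0]])

lam = np.linalg.eigvals(F)[0]       # one of the conjugate pair, approx 0.25 + 0.86i
R = abs(lam)                        # modulus, approx 0.895
theta = np.arccos(lam.real / R)     # frequency, approx 1.29
print(R, theta, 2 * np.pi / theta)  # period, approx 4.9
```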

Stationarity Conditions on the Lag Polynomial $\phi(L)$

Consider the AR(2) model in lag operator notation
$$(1 - \phi_1 L - \phi_2 L^2)X_t = \phi(L)X_t = \varepsilon_t$$
For any variable $z$, consider the characteristic equation
$$\phi(z) = 1 - \phi_1 z - \phi_2 z^2 = 0$$
By the fundamental theorem of algebra
$$1 - \phi_1 z - \phi_2 z^2 = (1 - \lambda_1 z)(1 - \lambda_2 z)$$
so that
$$z_1 = \frac{1}{\lambda_1}, \qquad z_2 = \frac{1}{\lambda_2}$$
are the roots of the characteristic equation. The values $\lambda_1$ and $\lambda_2$ are the eigenvalues of $\mathbf{F}$.
Note: if $\phi_1 + \phi_2 = 1$ then $\phi(z=1) = 1 - (\phi_1 + \phi_2) = 0$ and $z = 1$ is a root of $\phi(z) = 0$.

Result: The inverses of the roots of the characteristic equation
$$\phi(z) = 1 - \phi_1 z - \phi_2 z^2 - \dots - \phi_p z^p = 0$$
are the eigenvalues of the companion matrix $\mathbf{F}$. Therefore, the AR(p) model is stable and stationary provided the roots of $\phi(z) = 0$ have modulus greater than unity (the roots lie outside the complex unit circle).

Remarks:

1. The reverse characteristic equation for the AR(p) is
$$\lambda^p - \phi_1\lambda^{p-1} - \phi_2\lambda^{p-2} - \dots - \phi_{p-1}\lambda - \phi_p = 0$$
This is the same polynomial equation used to find the eigenvalues of $\mathbf{F}$.
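The relationship between the roots of $\phi(z)$ and the eigenvalues of $\mathbf{F}$ can be checked numerically; a sketch for the stationary AR(2) with $\phi_1 = 0.6$, $\phi_2 = 0.2$:

```python
import numpy as np

phi1, phi2 = 0.6, 0.2

# Roots of phi(z) = 1 - phi1 z - phi2 z^2 (np.roots expects the highest power first).
z_roots = np.roots([-phi2, -phi1, 1.0])
print(z_roots)                  # approx -4.19 and 1.19, both outside the unit circle
print(np.abs(z_roots) > 1)      # [True, True] -> stationary

# Their inverses are the eigenvalues of the companion matrix F.
F = np.array([[phi1, phi2], [1.0, 0.0]])
print(np.sort(1 / z_roots), np.sort(np.linalg.eigvals(F)))   # both approx [-0.24, 0.84]
```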

2. If the AR(p) is stationary, then
$$\phi(L) = 1 - \phi_1 L - \dots - \phi_p L^p = (1 - \lambda_1 L)(1 - \lambda_2 L)\cdots(1 - \lambda_p L)$$
where $|\lambda_i| < 1$. Suppose all of the $\lambda_i$ are real. Then
$$(1 - \lambda_i L)^{-1} = \sum_{j=0}^{\infty}\lambda_i^j L^j$$
and
$$\phi(L)^{-1} = (1 - \lambda_1 L)^{-1}\cdots(1 - \lambda_p L)^{-1} = \left(\sum_{j=0}^{\infty}\lambda_1^j L^j\right)\left(\sum_{j=0}^{\infty}\lambda_2^j L^j\right)\cdots\left(\sum_{j=0}^{\infty}\lambda_p^j L^j\right)$$
The Wold solution for $X_t$ may be found using
$$X_t = \phi(L)^{-1}\varepsilon_t = \left(\sum_{j=0}^{\infty}\lambda_1^j L^j\right)\left(\sum_{j=0}^{\infty}\lambda_2^j L^j\right)\cdots\left(\sum_{j=0}^{\infty}\lambda_p^j L^j\right)\varepsilon_t$$

3. A simple algorithm exists to determine the Wold form. To illustrate, consider the AR(2) model. By definition
$$\phi(L)^{-1} = (1 - \phi_1 L - \phi_2 L^2)^{-1} = \psi(L), \qquad \psi(L) = \sum_{j=0}^{\infty}\psi_j L^j$$
so that
$$1 = \phi(L)\psi(L) = (1 - \phi_1 L - \phi_2 L^2)(1 + \psi_1 L + \psi_2 L^2 + \dots)$$
Collecting coefficients of powers of $L$ gives
$$1 = 1 + (\psi_1 - \phi_1)L + (\psi_2 - \phi_1\psi_1 - \phi_2)L^2 + \dots$$
Since all coefficients on powers of $L$ must equal zero, it follows that
$$\psi_1 = \phi_1, \quad \psi_2 = \phi_1\psi_1 + \phi_2, \quad \psi_3 = \phi_1\psi_2 + \phi_2\psi_1, \quad \dots, \quad \psi_j = \phi_1\psi_{j-1} + \phi_2\psi_{j-2}$$
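The recursion is straightforward to implement. The sketch below computes the first Wold coefficients for the AR(2) with $\phi_1 = 0.6$, $\phi_2 = 0.2$ and, as an optional cross-check, compares them with statsmodels' `arma2ma` (the availability and signature of that function are an assumption about your statsmodels version).

```python
import numpy as np

phi1, phi2 = 0.6, 0.2
J = 10

psi = np.zeros(J)
psi[0] = 1.0
psi[1] = phi1                                 # psi_1 = phi_1
for j in range(2, J):
    psi[j] = phi1 * psi[j-1] + phi2 * psi[j-2]

print(psi[:5])                                # [1.0, 0.6, 0.56, 0.456, 0.3856]

# Optional cross-check (assumed statsmodels API):
from statsmodels.tsa.arima_process import arma2ma
print(arma2ma(np.array([1.0, -phi1, -phi2]), np.array([1.0]), lags=5))
```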

Moments of Stationary AR(p) Model
$$Y_t - \mu = \phi_1(Y_{t-1}-\mu) + \dots + \phi_p(Y_{t-p}-\mu) + \varepsilon_t, \quad \varepsilon_t \sim WN(0,\sigma^2)$$
or
$$Y_t = c + \phi_1 Y_{t-1} + \dots + \phi_p Y_{t-p} + \varepsilon_t, \qquad c = \mu(1 - \phi), \qquad \phi = \phi_1 + \phi_2 + \dots + \phi_p$$
Note: if $\phi = 1$ then $\phi(1) = 1 - \phi = 0$ and $z = 1$ is a root of $\phi(z) = 0$. In this case we say that the AR(p) process has a unit root and the process is nonstationary.

Straightforward algebra shows that
$$E[Y_t] = \mu$$
$$\gamma_0 = \phi_1\gamma_1 + \phi_2\gamma_2 + \dots + \phi_p\gamma_p + \sigma^2$$
$$\gamma_j = \phi_1\gamma_{j-1} + \phi_2\gamma_{j-2} + \dots + \phi_p\gamma_{j-p}$$
$$\rho_j = \phi_1\rho_{j-1} + \phi_2\rho_{j-2} + \dots + \phi_p\rho_{j-p}$$
The recursive equations for $\rho_j$ are called the Yule-Walker equations.

Result: $(\gamma_0, \gamma_1, \dots, \gamma_{p-1})$ is determined from the first $p$ elements of the first column of the $(p^2 \times p^2)$ matrix $\sigma^2\left[I_{p^2} - (\mathbf{F}\otimes\mathbf{F})\right]^{-1}$, where $\mathbf{F}$ is the state space companion matrix for the AR(p) model.
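This result can be implemented directly; a minimal sketch for the AR(2) with $\phi_1 = 0.6$, $\phi_2 = 0.2$, $\sigma^2 = 1$ (parameter values assumed for illustration):

```python
import numpy as np

phi1, phi2, sigma2 = 0.6, 0.2, 1.0
p = 2
F = np.array([[phi1, phi2], [1.0, 0.0]])

# First column of sigma^2 * [I_{p^2} - (F kron F)]^{-1};
# its first p elements are gamma_0, ..., gamma_{p-1}.
A = sigma2 * np.linalg.inv(np.eye(p * p) - np.kron(F, F))
gamma = A[:p, 0]

print(gamma)                  # approx [2.381, 1.786]
print(gamma[1] / gamma[0])    # rho_1 = phi1 / (1 - phi2) = 0.75
```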

ARMA(0,1) Process (MA(1) Process)
$$Y_t = \mu + \varepsilon_t + \theta\varepsilon_{t-1} = \mu + \theta(L)\varepsilon_t, \qquad \theta(L) = 1 + \theta L, \qquad \varepsilon_t \sim WN(0,\sigma^2)$$
Moments:
$$E[Y_t] = \mu$$
$$\mathrm{var}(Y_t) = \gamma_0 = E[(Y_t-\mu)^2] = E[(\varepsilon_t + \theta\varepsilon_{t-1})^2] = \sigma^2(1 + \theta^2)$$
$$\gamma_1 = E[(Y_t-\mu)(Y_{t-1}-\mu)] = E[(\varepsilon_t + \theta\varepsilon_{t-1})(\varepsilon_{t-1} + \theta\varepsilon_{t-2})] = \sigma^2\theta$$
$$\rho_1 = \frac{\gamma_1}{\gamma_0} = \frac{\theta}{1 + \theta^2}, \qquad \gamma_j = 0, \ j > 1$$

Remark: There is an identification problem, since $-0.5 < \rho_1 < 0.5$. The values $\theta$ and $1/\theta$ produce the same value of $\rho_1$. For example, $\theta = 0.5$ and $1/\theta = 2$ both produce $\rho_1 = 0.4$.

Invertibility Condition: the MA(1) is invertible if $|\theta| < 1$.

Result: Invertible MA models have infinite order AR representations:
$$(Y_t - \mu) = (1 + \theta L)\varepsilon_t = (1 - \pi L)\varepsilon_t, \qquad \pi = -\theta, \qquad |\theta| < 1$$
$$(1 + \theta L)^{-1}(Y_t - \mu) = \sum_{j=0}^{\infty}(-\theta)^j L^j (Y_t - \mu) = \varepsilon_t$$
so that
$$\varepsilon_t = (Y_t - \mu) + \pi(Y_{t-1} - \mu) + \pi^2(Y_{t-2} - \mu) + \dots$$
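Both the identification remark and the AR($\infty$) representation are easy to check numerically; a short sketch using $\theta = 0.5$ as in the example above:

```python
import numpy as np

def rho1(theta):
    """First-order autocorrelation of an MA(1): rho_1 = theta / (1 + theta^2)."""
    return theta / (1 + theta ** 2)

print(rho1(0.5), rho1(2.0))   # both 0.4: theta and 1/theta are observationally equivalent

# AR(infinity) weights of the invertible MA(1): eps_t = sum_j (-theta)^j (Y_{t-j} - mu)
theta = 0.5
pi = (-theta) ** np.arange(10)
print(pi[:4])                 # [1.0, -0.5, 0.25, -0.125]
```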
