
AUTOREGRESSIVE INTEGRATED MOVING AVERAGE (ARIMA)

PRESENTED BY: RAYCHELL SANTOS-EMBILE

What ARIMA stands for

A series which needs to be differenced to be made stationary is an integrated (I) series.

Lags of the stationarized series are called autoregressive (AR) terms.

Lags of the forecast errors are called moving average (MA) terms.

ARIMA Model
Building Process

Table 2. Characteristics of a good forecasting model
1. It fits the past data well.
   Plots of actual versus fitted are good.
   R² is high.
   RSE is low relative to other models.
   The MAPE is good.
2. The model has intuitive appeal.
3. It forecasts future and withheld (i.e., out-of-sample) data well.
4. It is parsimonious: simple but effective, not having too many coefficients.

Table 2. Characteristics of a good forecasting model (concluded)
5. The estimated coefficients are statistically significant and not redundant or unnecessary.
6. The model is stationary and invertible.
7. No patterns are left in the ACFs and PACFs.
8. The residuals are white noise; they have no patterns denoting model deficiencies.
9. The Schwarz Bayesian or Akaike information criteria are lower than those of other models.

ACFs-PACFs Table

Process                    ACFs                                      PACFs
ARIMA (0,0,0)              No significant lags                       No significant lags
ARIMA (0,1,0)              Linear decline at lag 1, with many        Single significant peak at lag 1
                           lags significant
ARIMA (1,0,0), φ1 > 0      Exponential decline, with first 2 or      Single significant peak at lag 1
                           more lags significant
ARIMA (1,0,0), φ1 < 0      Alternating exponential decline           Single significant negative peak at
                           starting with a negative ACF(1)           lag 1
ARIMA (0,0,1), θ1 > 0      Single significant negative peak at       Exponential decline of negative
                           lag 1                                     values, with first 2 or 3 lags
                                                                     significant
ARIMA (0,0,1), θ1 < 0      Single significant positive peak at       Alternating exponential decline
                           lag 1                                     starting with a positive PACF(1)

The Q-Statistic and White Noise Diagnosis

In extracting information from a time series, we use the patterns in the ACFs to guide identification of ARIMA models.

If a model has a high R² (i.e., a low sum of squares), statistically significant coefficients, nonredundant coefficients, and no patterns left in the ACFs, PACFs, and residual plots, we conclude that a good ARIMA model has been identified.

However, while individual t-tests can be performed on specific lags of the ACFs using the standard error of the ACFs, the definition of "no pattern" is subjective until it can be quantified.

The Q-Statistic and White Noise Diagnosis

The Q-statistic is used as an objective diagnostic measure of white noise for a time series, assessing whether there are patterns in a group of autocorrelations.

The Q-statistic is:

    Q = n(n + 2) · Σ[i = 1 to k] ACF(i)² / (n − i)

When the ACFs are from a white noise series, this statistic is chi-square distributed with k − p − q degrees of freedom, where p and q are the numbers of AR and MA coefficients of the model. Thus, Q is proportional to the sum of the squared ACFs through lag k, where typically k is selected to be two seasonal cycles, or in general about 20 when two seasonal cycles is much different than 20.

The Q-Statistic and White Noise Diagnosis

The Q-statistic is used in the following hypothesis test:
H0: The residual ACFs are consistent with white noise ACFs.
If Q ≤ the χ² table value, where df = k − p − q and α = 0.05, then infer that the ACF patterns are not statistically significantly different than those of white noise.
If Q > the χ² table value, where df = k − p − q and α = 0.05, then infer that the ACF patterns are statistically significantly different than those of white noise.

Autoregressive Process
ARIMA(1,0,0)

Autoregression is an extension of simple linear regression (it is a simple linear relationship between Yt and Yt−1).

An ARIMA(1,0,0) model, commonly called an AR(1), is written as:

    Yt = φ0 + φ1·Yt−1 + et

where φ0 and φ1 are coefficients chosen to minimize the sum of squared errors.

Autoregressive Process
ARIMA(1,0,0)

Example:
[Figure: Time Series Plot of sales_dairy, index 1-100]

The figure illustrates the daily sales of a dairy product during a 100-day period. This product is known to be nonseasonal by the day of the week and week of the year. The plot of Yt is somewhat random. However, there is some wandering of Yt about its mean of 199.02.

Autoregressive Process
ARIMA(1,0,0)

Example: The ACFs and PACFs
[Figure: Autocorrelation Function and Partial Autocorrelation Function for sales_dairy, with 5% significance limits]

The ACFs (left) show an exponential decline from about 0.75 at lag 1. Meanwhile, the PACFs (right) have a significant spike at lag 1. Both the ACFs and the PACFs suggest an autoregressive model, ARIMA(1,0,0).

Autoregressive Process
ARIMA(1,0,0)

Example:
[Figure: Time Series Plot of sales_dairy versus FITS1 (actual versus fitted values), and Residuals Versus Observation Order plot (no pattern left)]

Autoregressive Process
ARIMA(1,0,0)

Example: The ACFs and PACFs of residuals
[Figure: ACF and PACF of Residuals for sales_dairy, with 5% significance limits]

As shown, no low-order or seasonal ACFs and PACFs of the residuals are statistically significant; thus, this seems to be a statistically defensible model.

Autoregressive Process
ARIMA(1,0,0)

Example:

Final Estimates of Parameters
Type        Coef      SE Coef   T        P
AR 1        0.7539    0.0668    11.29    0.000
Constant    49.0086   0.1923    254.83   0.000
Mean        199.121   0.781

Modified Box-Pierce (Ljung-Box) Chi-Square statistic
Lag         12      24      36      48
Chi-Square  12.4    26.5    36.9    53.9
DF          10      22      34      46
P-Value     0.257   0.231   0.335   0.197

Based on the results, the coefficients are significant, with p-values less than 0.05.

The final model is:

    Yt = 49.0086 + 0.7539·Yt−1 + et

Autoregressive Process
ARIMA(p,0,0)

An AR(1) process is a first-order process, meaning that the current value is based on the immediately preceding value. An AR(2) process has the current value based on the previous two values.

Integrated Process ARIMA(0,1,0)

Integrated processes are level-nonstationary series.

Trends and random walks are level-nonstationary because their means (i.e., levels) are not constant (i.e., not stationary).

A random walk model, which is equal to ARIMA(0,1,0), is written as:

    Yt = Yt−1 + et
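A minimal numpy sketch (numpy is an assumption, not the slides' software) of why differencing works here: a random walk Yt = Yt−1 + et is level-nonstationary, but its first difference is exactly et, i.e., white noise.

```python
import numpy as np

rng = np.random.default_rng(7)
e = rng.normal(size=1000)
y = np.cumsum(e)   # random walk: Y_t = Y_{t-1} + e_t

d = np.diff(y)     # first difference recovers e_t exactly

def r1(x):
    """Sample lag-1 autocorrelation."""
    x = x - x.mean()
    return np.dot(x[:-1], x[1:]) / np.dot(x, x)

# Lag-1 autocorrelation: very high for the walk, near zero for the differences.
print(round(r1(y), 2), round(r1(d), 2))
```

This is the pattern the stock-price example below exhibits: strongly persistent ACFs before differencing, white noise after.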

Integrated Process ARIMA(0,1,0)

Example:
[Figure: Time Series Plot of daily_stock_prices (index 1-100), with its Autocorrelation Function and Partial Autocorrelation Function, 5% significance limits]

The time series plot shows the daily price of a stock traded on a major exchange, together with its ACFs and PACFs. As shown in the plot of the original series, this series drifts and therefore has a nonstationary (i.e., nonconstant) mean. The linearly declining ACFs, with at least 3 significant spikes, and the single significant spike at lag 1 in the PACFs are indicative of a random walk series; thus differencing is necessary.

Integrated Process ARIMA(0,1,0)

Example:
[Figure: Time Series Plot of first_diff (daily stock prices), with its Autocorrelation Function and Partial Autocorrelation Function, 5% significance limits]

As shown, the first differences have a mean of zero and are distributed as white noise, having no statistically significant ACFs and PACFs. Thus, the modeling process yields the random walk model.

Integrated Process ARIMA(0,1,0)

Example:
The Expert Modeler in IBM SPSS also selected ARIMA(0,1,0) as the best ARIMA model, with a nonsignificant Ljung-Box Q statistic (p-value = 0.929), which infers that the ACF patterns are not statistically different than those of white noise.

Moving Average Process
ARIMA(0,1,1)

Example:
[Figure: Time Series Plot of fad (weekly demand, index 1-100)]

The figure illustrates the weekly demand of a fad product, which is known to be nonseasonal. The plot of Yt has a nonstationary mean. The ACFs and PACFs are also indicative of a nonstationary series; they remain very high and significant for many lags. Because of the level nonstationarity, first differences of Yt are taken.

Moving Average Process
ARIMA(0,1,1)

Example:
[Figure: Autocorrelation Function and Partial Autocorrelation Function for fad, with 5% significance limits]

Moving Average Process
ARIMA(0,1,1)

Example: First Difference
[Figure: Time Series Plot of first_diff_fad, with its Autocorrelation Function and Partial Autocorrelation Function, 5% significance limits]

As shown in the figure, the first differences have a stationary mean; however, the ACFs and PACFs show a pattern at the low-order lags. The single peak in the ACFs and the negative exponential decline in the PACFs starting at lag 1 are indicative of an MA(1) model.

Moving Average Process
ARIMA(0,1,1)

Example: Fitting an ARIMA(0,1,1) model

The graph of the residuals appears quite random. Also, the ACFs and PACFs are both indicative of white noise.

Moving Average Process
ARIMA(0,1,1)

Example: Final estimates of the parameters

Final Estimates of Parameters
Type    Coef      SE Coef   T       P
MA 1    0.3955    0.0930    4.25    0.000

Differencing: 1 regular difference
Number of observations: Original series 100, after differencing 99
Residuals: SS = 147.326 (backforecasts excluded), MS = 1.503, DF = 98

Modified Box-Pierce (Ljung-Box) Chi-Square statistic
Lag         12      24      36      48
Chi-Square  8.3     18.0    28.5    35.1
DF          11      23      35      47
P-Value     0.684   0.760   0.773   0.901

The coefficient is significant, with p-value less than 0.05, and the Ljung-Box Q statistic has no significant lags.

The final model is:

    Yt = Yt−1 − 0.3955·et−1 + et

ARIMA(0,1,1)1,1 Model

[Figure: Time Series Plot of USIND, index 1-270]

The time series plot is the data on the Monthly US Stock Index Value, from December 1970 to 1992 (from the Survey of Current Business). This series is clearly nonstationary in level, and its variance is also nonstationary. Thus, differencing is necessary to make the data stationary.

ARIMA(0,1,1)1,1 Model

[Figure: Time Series Plot of USIND, index 1-270]

The time series plot is the data on the Monthly US Stock Index Value, from December 1970 to 1992 (from the Survey of Current Business). This series is clearly nonstationary in level, and its variance is also nonstationary. Thus, differencing and transformation are necessary to make the data stationary.

ARIMA(0,1,0) model

[Figure: Autocorrelation Function and Partial Autocorrelation Function for USIND, with 5% significance limits]

The ACF and PACF graphs closely resemble those of a random walk model. The ACFs decline linearly from lag 1, with many lags significant, while the PACFs show a single significant peak at lag 1.

Example: US Stock Index

First Difference
[Figure: Time Series Plot of usind_diff1, index 1-270]

After applying the first difference to the data, the time series plot seems stationary in level, but the variance increases with the level. This series therefore requires a transformation to achieve variance stationarity.

Example: US Stock Index

First difference of ln(USIND)
The time series plot of the US Stock Index after transformation using natural logarithms shows that the first differences have variance stationarity: the period-to-period changes remain constant as ln(Yt) increases. Thus, the percentage variation in Yt is modeled well by taking natural logarithms.
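The reason log differences capture percentage variation can be shown in a few lines (numpy assumed; the index values below are hypothetical, not the USIND data): differences of natural logs closely approximate period-to-period percentage changes.

```python
import numpy as np

# Hypothetical index levels moving +5%, -5%, +5%.
y = np.array([100.0, 105.0, 99.75, 104.7375])

log_diff = np.diff(np.log(y))          # ln(Y_t) - ln(Y_{t-1})
pct_change = np.diff(y) / y[:-1]       # (Y_t - Y_{t-1}) / Y_{t-1}

print(np.round(log_diff, 4))
print(np.round(pct_change, 4))
```

Because ln(Yt) − ln(Yt−1) ≈ (Yt − Yt−1)/Yt−1 for small changes, a series whose swings grow proportionally with its level becomes variance-stationary in logs.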

Example: US Stock Index

The ACFs and PACFs of the first differences of ln(USIND)

As shown in the correlogram, all ACFs except lag 1 are insignificant (a single significant positive peak at lag 1), and the PACFs have an alternating exponential decline starting with a positive value. The single positive spike in the ACFs is indicative of a moving average model, MA(1). In addition, the alternating spikes of the PACFs confirm an MA(1) model. The anticipated MA(1) model will have a negative θ1.

Example: US Stock Index

Fitting an ARIMA(0,1,1)1,1 model

Final Estimates of Parameters
Type        Coef       SE Coef    T       P
MA 1        -0.3256    0.0578     -5.63   0.000
Constant    0.005621   0.002886   1.95    0.053
Mean        0.005621   0.002886

Modified Box-Pierce (Ljung-Box) Chi-Square statistic
Lag         12      24      36      48
Chi-Square  6.9     16.8    25.4    33.7
DF          10      22      34      46
P-Value     0.736   0.777   0.855   0.911

Inferences:
1. The first-order MA coefficient θ1 is statistically significantly different than zero, with p-value < 0.05.
2. The Q-statistic is indicative of white noise residuals.

Example: US Stock Index


Fitting an ARIMA(0,1,1)1,1 model

The patternless ACFs and PACFs appear to be those of a white noise series.

Descriptive Statistics of US Stock Index
(the last three columns indicate stationarity in level, stationarity in variance, and white noise)

Series                n     Mean      Std. Dev   Min        Max       Level   Variance   White Noise
Yt                    271   188.02    107.26     73.00      452.60    No      No         No
Yt − Yt−1             270   1.309     7.718      -41.80     40.00     Yes     No         No
ln(Yt)                271   5.0951    0.5151     4.2905     6.1150    No      Yes        No
ln(Yt) − ln(Yt−1)     270   0.00565   0.03752    -0.13428   0.11021   Yes     Yes        No
ARIMA(0,1,1)1,1       270   0.00006   0.0357     -0.1162    0.1092    Yes     Yes        Yes

The resulting model is:

    ln(Yt) = ln(Yt−1) + 0.005621 + 0.3256·et−1 + et

Results from Automatic ARIMA Forecasting (EViews)

Automatic ARIMA Forecasting
Selected dependent variable: DLOG(USIND)
Date: 10/06/16  Time: 03:26
Sample: 1 271
Included observations: 270
Forecast length: 0
Number of estimated ARMA models: 9
Number of non-converged estimations: 0
Selected ARMA model: (0,1)
SIC value: -3.75317283253

Dependent Variable: DLOG(USIND)
Method: ARMA Maximum Likelihood (BFGS)
Date: 10/06/16  Time: 03:26
Sample: 2 271
Included observations: 270
Convergence achieved after 3 iterations
Coefficient covariance computed using outer product of gradients

Variable    Coefficient   Std. Error   t-Statistic   Prob.
C           0.005621      0.003000     1.873819      0.0620
MA(1)       0.324384      0.052560     6.171738      0.0000
SIGMASQ     0.001271      8.63E-05     14.72464      0.0000

R-squared            0.093437    Mean dependent var      0.005650
Adjusted R-squared   0.086646    S.D. dependent var      0.037519
S.E. of regression   0.035857    Akaike info criterion   -3.807097
Sum squared resid    0.343288    Schwarz criterion       -3.767115
Log likelihood       516.9581    Hannan-Quinn criter.    -3.791042
F-statistic          13.75950    Durbin-Watson stat      2.002421
Prob(F-statistic)    0.000002

Inverted MA Roots    -.32

Results from Automatic ARIMA Forecasting (EViews)

Model Selection Criteria Table
Dependent Variable: DLOG(USIND)
Date: 10/06/16  Time: 03:26
Sample: 1 271
Included observations: 270

Model   LogL          AIC         BIC*        HQ
(0,1)   516.958097    -3.79304    -3.75317    -3.77703
(1,0)   515.388780    -3.78146    -3.74159    -3.76545
(0,2)   516.974096    -3.78578    -3.73261    -3.76443
(1,1)   516.973954    -3.78578    -3.73261    -3.76443
(2,0)   516.852077    -3.78488    -3.73171    -3.76353
(1,2)   517.636657    -3.78329    -3.71683    -3.75661
(2,1)   516.974160    -3.77840    -3.71194    -3.75172
(2,2)   517.664274    -3.77612    -3.69636    -3.74409
(0,0)   503.770881    -3.70310    -3.67652    -3.69243

[Figure: Schwarz Criteria bar graph of the nine candidate models]

Other Residual Diagnostic Tools/Measures

Example: Daily sales of dairy products - AR(1) model

    RSE = sqrt( Σ et² / (n − 1) ) = sqrt( 361.969 / 99 ) = 1.91213
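The arithmetic above can be reproduced directly (the SSE and n are taken from the slide's AR(1) residuals):

```python
import math

sse = 361.969   # sum of squared residuals from the dairy AR(1) fit
n = 100         # observations in the series

# RSE = sqrt( sum(e_t^2) / (n - 1) )
rse = math.sqrt(sse / (n - 1))
print(round(rse, 5))
```

This matches the 1.91213 reported above, and is close to the S.E. of regression (1.932805) in the EViews output that follows.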

Results from Automatic ARIMA Forecasting (EViews)

Example: Daily sales of dairy products - AR(1) model

Equation Output Summary

Automatic ARIMA Forecasting
Selected dependent variable: SALES
Date: 10/06/16  Time: 05:19
Sample: 1 100
Included observations: 100
Forecast length: 0
Number of estimated ARMA models: 25
Number of non-converged estimations: 0
Selected ARMA model: (1,0)
SIC value: 4.2716647438

Dependent Variable: SALES
Method: ARMA Maximum Likelihood (BFGS)
Date: 10/06/16  Time: 05:19
Sample: 1 100
Included observations: 100
Convergence achieved after 4 iterations
Coefficient covariance computed using outer product of gradients

Variable    Coefficient   Std. Error   t-Statistic   Prob.
C           199.1172      0.760346     261.8770      0.0000
AR(1)       0.746485      0.070736     10.55313      0.0000
SIGMASQ     3.623663      0.545074     6.648017      0.0000

R-squared            0.564192    Mean dependent var      199.0239
Adjusted R-squared   0.555206    S.D. dependent var      2.898069
S.E. of regression   1.932805    Akaike info criterion   4.193510
Sum squared resid    362.3663    Schwarz criterion       4.271665
Log likelihood       -206.6755   Hannan-Quinn criter.    4.225140
F-statistic          62.78757    Durbin-Watson stat      1.759122
Prob(F-statistic)    0.000000

Inverted AR Roots    .75

Results from Automatic ARIMA Forecasting (EViews)

Example: Daily sales of dairy products - AR(1) model

Model Selection Criteria Table
Dependent Variable: SALES
Date: 10/06/16  Time: 05:19
Sample: 1 100
Included observations: 100

Model   LogL           AIC        BIC*       HQ
(1,0)   -206.675482    4.193510   4.271665   4.225140
(1,1)   -204.830635    4.176613   4.280820   4.218787
(2,0)   -205.535751    4.190715   4.294922   4.232889
(2,1)   -203.477668    4.169553   4.299812   4.222271
(1,2)   -203.877725    4.177555   4.307813   4.230272
(3,0)   -204.633490    4.192670   4.322928   4.245388
(0,3)   -205.558198    4.211164   4.341422   4.263882
(1,3)   -203.418775    4.188375   4.344686   4.251637
(3,1)   -203.461340    4.189227   4.345537   4.252488
(2,2)   -203.462675    4.189253   4.345564   4.252515
(2,3)   -201.537758    4.170755   4.353117   4.244560
(4,0)   -203.848173    4.196963   4.353274   4.260225
(0,4)   -204.536006    4.210720   4.367030   4.273982
(3,3)   -199.949951    4.158999   4.367413   4.243348
(0,2)   -209.397577    4.267952   4.372158   4.310126
(2,4)   -200.423652    4.168473   4.376887   4.252822
(3,4)   -198.686835    4.153737   4.388202   4.248629
(1,4)   -203.411225    4.208225   4.390586   4.282030
(4,1)   -203.456960    4.209139   4.391501   4.282944
(3,2)   -203.470794    4.209416   4.391778   4.283221
(4,3)   -199.495316    4.169906   4.404372   4.264799
(0,1)   -214.136502    4.342730   4.420885   4.374361
(4,4)   -198.613806    4.172276   4.432793   4.277712
(4,2)   -203.441295    4.228826   4.437240   4.313175
(0,0)   -247.795808    4.995916   5.048020   5.017003

[Figure: Schwarz Criteria bar graph of the top 20 models]

Schwarz and Akaike Criteria

Schwarz Bayesian Information Criterion (BIC):

    BIC = n·Log(SSE) + k·Log(n)

Akaike Information Criterion (AIC):

    AIC = n·Log(SSE) + 2k

where
k = number of parameters that are fitted in the model
Log = natural logarithm
n = number of observations in the series
SSE = sum of the squared errors
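A sketch of these criteria exactly as defined above (note: these slide formulas are SSE-based, not the likelihood-based AIC/BIC that EViews reports, so the absolute values differ; only comparisons between models fit on the same data matter). The SSE values in the comparison are hypothetical.

```python
import math

def bic(sse, n, k):
    # BIC = n*Log(SSE) + k*Log(n), per the slide's definition
    return n * math.log(sse) + k * math.log(n)

def aic(sse, n, k):
    # AIC = n*Log(SSE) + 2k, per the slide's definition
    return n * math.log(sse) + 2 * k

# Hypothetical comparison: a 2-parameter model versus a 5-parameter model with
# nearly the same SSE. BIC penalizes the extra coefficients more heavily than
# AIC whenever log(n) > 2, i.e., n > 7 or so.
print(bic(362.4, 100, 2), bic(360.0, 100, 5))
print(aic(362.4, 100, 2), aic(360.0, 100, 5))
```

This is why BIC tends to select more parsimonious models than AIC, consistent with characteristic 4 in Table 2.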

Schwarz and Akaike Criteria

Example: Daily sales of dairy products - AR(1) model
(The model selection criteria table for this example appears on the earlier "Results from Automatic ARIMA Forecasting" slide.)
Additional ARIMA Notation

    ARIMA(p,d,q)C,T (P,D,Q)s (P,D,Q)S

where p, d, and q are as defined previously, and
P = seasonal level of autoregression
D = seasonal level of differences
Q = seasonal level of moving averages
s = first period of seasonality
S = second period of seasonality
C = 1 for differenced models with a nonzero constant, otherwise 0
T = power transformation of Yt: 0 = none, 1 = logs, # = power (e.g., T = 0.5)

Additional ARIMA Notation

Consider the following examples of ARIMA notation:

Model                              Description
ARIMA(1,0,0)                       A simple autoregressive series, AR(1)
ARIMA(0,0,1)                       A simple moving average series, MA(1)
ARIMA(0,1,0)                       A simple random walk series, I(1)
ARIMA(0,1,0)1                      A simple trending series, I(1) with constant
ARIMA(0,1,0)1(0,1,0)12             A seasonal monthly model with trend
ARIMA(1,0,0)4                      A quarterly model with seasonal autoregression
ARIMA(1,0,0)1(0,1,0)12             A monthly model with seasonal trend and nonseasonal autoregression
ARIMA(1,0,0)1(0,1,0)7(0,1,1)364    A daily model with seasonality and nonseasonal autoregression

THANK
YOU!!!
