
ECONOMETRICS I

CHAPTER 3: TWO-VARIABLE REGRESSION MODEL:
THE PROBLEM OF ESTIMATION

Textbook: Damodar N. Gujarati (2004) Basic Econometrics,
4th edition, The McGraw-Hill Companies
3.1 THE METHOD OF
ORDINARY LEAST SQUARES
• PRF: Yi = β1 + β2Xi + ui
• SRF: Yi = β̂1 + β̂2Xi + ûi

• How is the SRF determined?

• We do not minimize the sum of the residuals!

• Why not? Positive and negative residuals cancel, so Σûi can be close to zero even when the fit is poor.
Least squares criterion

• We adopt the least-squares criterion: choose β̂1 and β̂2 so as to minimize the sum of the squared residuals,
  Σûi² = Σ(Yi − β̂1 − β̂2Xi)².

• This sum is a function of the estimated parameters β̂1 and β̂2.

• Normal equations:
  ΣYi = nβ̂1 + β̂2ΣXi
  ΣYiXi = β̂1ΣXi + β̂2ΣXi²

• Solving the normal equations simultaneously, we obtain the following:
  β̂2 = (nΣXiYi − ΣXiΣYi) / (nΣXi² − (ΣXi)²)
  β̂1 = Ȳ − β̂2X̄

• β̂2 can alternatively be expressed in deviation form:
  β̂2 = Σxiyi / Σxi², where xi = Xi − X̄ and yi = Yi − Ȳ.
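These formulas can be sketched numerically. In the snippet below the X and Y values are made up purely for illustration (they are not the textbook's data):

```python
# OLS slope and intercept via the deviation-form formulas above.
# Illustrative data only.
X = [1, 2, 3, 4, 5]
Y = [2, 4, 5, 4, 5]
n = len(X)

x_bar = sum(X) / n   # sample mean of X
y_bar = sum(Y) / n   # sample mean of Y

# Deviations from the sample means: x_i = X_i - X-bar, y_i = Y_i - Y-bar
x = [Xi - x_bar for Xi in X]
y = [Yi - y_bar for Yi in Y]

beta2_hat = sum(xi * yi for xi, yi in zip(x, y)) / sum(xi ** 2 for xi in x)
beta1_hat = y_bar - beta2_hat * x_bar

print(beta2_hat, beta1_hat)  # approximately 0.6 and 2.2
```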
Three Statistical Properties of OLS
Estimators
I. The OLS estimators are expressed solely in terms
of the observable quantities (i.e. X and Y).
Therefore they can easily be computed.
II. They are point estimators (not interval
estimators). Given the sample, each estimator
provides only a single (point) value of the
relevant population parameter.
III. Once the OLS estimates are obtained from the
sample data, the sample regression line can be
easily obtained.
The properties of the regression line
1. It passes through the sample means of Y and X: Ȳ = β̂1 + β̂2X̄.
2. The mean of the estimated Ŷi equals the mean of the actual Yi.
3. The mean value of the residuals ûi is zero.
4. The residuals are uncorrelated with the predicted Yi: Σûi Ŷi = 0.
5. The residuals are uncorrelated with Xi: Σûi Xi = 0.
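These properties of the fitted line can all be verified numerically. A self-contained sketch using illustrative data (not the textbook's):

```python
# Fit OLS on illustrative data, then check the properties of the line.
X = [1, 2, 3, 4, 5]
Y = [2, 4, 5, 4, 5]
n = len(X)
x_bar, y_bar = sum(X) / n, sum(Y) / n

b2 = sum((Xi - x_bar) * (Yi - y_bar) for Xi, Yi in zip(X, Y)) \
     / sum((Xi - x_bar) ** 2 for Xi in X)
b1 = y_bar - b2 * x_bar

Y_hat = [b1 + b2 * Xi for Xi in X]                  # fitted values
u_hat = [Yi - Yh for Yi, Yh in zip(Y, Y_hat)]       # residuals

assert abs(b1 + b2 * x_bar - y_bar) < 1e-9                     # 1. passes through the means
assert abs(sum(Y_hat) / n - y_bar) < 1e-9                      # 2. mean of fitted Y = mean of Y
assert abs(sum(u_hat) / n) < 1e-9                              # 3. mean residual is zero
assert abs(sum(u * Yh for u, Yh in zip(u_hat, Y_hat))) < 1e-9  # 4. residuals orthogonal to fitted Y
assert abs(sum(u * Xi for u, Xi in zip(u_hat, X))) < 1e-9      # 5. residuals orthogonal to X
print("all five properties hold")
```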
3.2 The Classical Linear Regression Model:
The Assumptions Underlying the Method of Least Squares
1. The regression model is linear in the parameters.
2. X values are fixed in repeated sampling (X is nonstochastic).
3. Zero mean value of the disturbance: E(ui | Xi) = 0.
4. Homoscedasticity: var(ui | Xi) = σ², a constant.
5. No autocorrelation between the disturbances: cov(ui, uj) = 0 for i ≠ j.
6. Zero covariance between ui and Xi.
7. The number of observations n must be greater than the number of parameters to be estimated.
8. Variability in the X values: var(X) must be a finite positive number.
9. The regression model is correctly specified.
10. There is no perfect multicollinearity among the explanatory variables.

• Example of perfect multicollinearity: X1 = 2X2+X3


Y X1 X2 X3
6 5 2 1
11 10 4 2
17 11 5 1
22 16 6 4
25 19 8 3
33 22 10 2
15 11 3 5
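The exact dependence in this table can be checked programmatically. Under perfect multicollinearity the cross-product matrix X'X is singular, so the normal equations have no unique solution. A sketch (table data copied from above):

```python
# Rows of the table above: (Y, X1, X2, X3).
rows = [(6, 5, 2, 1), (11, 10, 4, 2), (17, 11, 5, 1), (22, 16, 6, 4),
        (25, 19, 8, 3), (33, 22, 10, 2), (15, 11, 3, 5)]

# The exact linear relation X1 = 2*X2 + X3 holds in every row.
for _, x1, x2, x3 in rows:
    assert x1 == 2 * x2 + x3

# Consequence: the 3x3 cross-product (Gram) matrix X'X is singular.
def dot(a, b):
    return sum(ai * bi for ai, bi in zip(a, b))

cols = [[r[j] for r in rows] for j in (1, 2, 3)]   # columns X1, X2, X3
G = [[dot(a, b) for b in cols] for a in cols]       # X'X

det = (G[0][0] * (G[1][1] * G[2][2] - G[1][2] * G[2][1])
       - G[0][1] * (G[1][0] * G[2][2] - G[1][2] * G[2][0])
       + G[0][2] * (G[1][0] * G[2][1] - G[1][1] * G[2][0]))
print(det)  # 0 -> the OLS coefficients on X1, X2, X3 are not identified
```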
PRECISION OR STANDARD ERRORS OF
LEAST SQUARES ESTIMATES
• var: variance
• se: standard error
• var(β̂2) = σ²/Σxi², se(β̂2) = σ/√Σxi²
• var(β̂1) = σ²ΣXi²/(nΣxi²), se(β̂1) = √var(β̂1)
• σ²: the constant homoscedastic
variance of ui
• σ̂² = Σûi²/(n − 2): the OLS estimator of σ²
• σ̂: the standard error of the estimate
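With σ̂² = Σûi²/(n − 2), the variances and standard errors follow directly from these formulas. An illustrative sketch (the data are made up, not the textbook's):

```python
import math

# Illustrative data and OLS fit, using the chapter's formulas.
X = [1, 2, 3, 4, 5]
Y = [2, 4, 5, 4, 5]
n = len(X)
x_bar, y_bar = sum(X) / n, sum(Y) / n
Sxx = sum((Xi - x_bar) ** 2 for Xi in X)           # sum of x_i^2 in deviation form
b2 = sum((Xi - x_bar) * (Yi - y_bar) for Xi, Yi in zip(X, Y)) / Sxx
b1 = y_bar - b2 * x_bar

u_hat = [Yi - (b1 + b2 * Xi) for Xi, Yi in zip(X, Y)]
sigma2_hat = sum(u ** 2 for u in u_hat) / (n - 2)  # unbiased estimator of sigma^2

var_b2 = sigma2_hat / Sxx
var_b1 = sigma2_hat * sum(Xi ** 2 for Xi in X) / (n * Sxx)
se_b2, se_b1 = math.sqrt(var_b2), math.sqrt(var_b1)
print(sigma2_hat, se_b2, se_b1)  # approximately 0.8, 0.283, 0.938
```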
Gauss – Markov Theorem

• An estimator, say the OLS estimator β̂2, is said to be a best
linear unbiased estimator (BLUE) of β2 if the following hold:
1. It is linear, i.e., a linear function of a random variable, such as the dependent variable Y.
2. It is unbiased: E(β̂2) = β2.
3. It has minimum variance in the class of all linear unbiased estimators.
The coefficient of determination r2

• TSS: total sum of squares, Σ(Yi − Ȳ)² = Σyi²
• ESS: explained sum of squares, Σ(Ŷi − Ȳ)²
• RSS: residual sum of squares, Σûi²
• TSS = ESS + RSS, and r² = ESS/TSS = 1 − RSS/TSS

The quantity r2 thus defined is known as the (sample) coefficient of determination and is the most
commonly used measure of the goodness of fit of a regression line. Verbally, r2 measures the
proportion or percentage of the total variation in Y explained by the regression model.
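The decomposition TSS = ESS + RSS, and r² computed from it, can be sketched numerically (illustrative data only):

```python
# Fit OLS, then decompose the total variation in Y.
X = [1, 2, 3, 4, 5]
Y = [2, 4, 5, 4, 5]
n = len(X)
x_bar, y_bar = sum(X) / n, sum(Y) / n
b2 = sum((Xi - x_bar) * (Yi - y_bar) for Xi, Yi in zip(X, Y)) \
     / sum((Xi - x_bar) ** 2 for Xi in X)
b1 = y_bar - b2 * x_bar
Y_hat = [b1 + b2 * Xi for Xi in X]

TSS = sum((Yi - y_bar) ** 2 for Yi in Y)               # total sum of squares
ESS = sum((Yh - y_bar) ** 2 for Yh in Y_hat)           # explained sum of squares
RSS = sum((Yi - Yh) ** 2 for Yi, Yh in zip(Y, Y_hat))  # residual sum of squares

assert abs(TSS - (ESS + RSS)) < 1e-9   # TSS = ESS + RSS
r2 = ESS / TSS                          # = 1 - RSS/TSS
print(TSS, ESS, RSS, r2)  # approximately 6.0, 3.6, 2.4, 0.6
```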
The coefficient of correlation r

• r is the sample correlation coefficient: r = Σxiyi / √(Σxi²Σyi²), or equivalently r = ±√r², taking the sign of β̂2.

Some of the properties of r:
• It can be positive or negative, depending on the sign of the sample covariance of X and Y.
• It lies between the limits −1 and +1.
• It is symmetrical in nature: rXY = rYX.
• It is independent of the origin and the scale of measurement.
• It is a measure of linear association only and does not imply causation.
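A quick numerical check of r and its relation to r² and to the slope's sign (illustrative data only):

```python
import math

X = [1, 2, 3, 4, 5]
Y = [2, 4, 5, 4, 5]
n = len(X)
x_bar, y_bar = sum(X) / n, sum(Y) / n
x = [Xi - x_bar for Xi in X]
y = [Yi - y_bar for Yi in Y]

# Sample correlation coefficient in deviation form.
r = sum(xi * yi for xi, yi in zip(x, y)) / math.sqrt(
    sum(xi ** 2 for xi in x) * sum(yi ** 2 for yi in y))

# OLS slope, to compare signs.
b2 = sum(xi * yi for xi, yi in zip(x, y)) / sum(xi ** 2 for xi in x)

assert -1.0 <= r <= 1.0        # r lies between -1 and +1
assert (r > 0) == (b2 > 0)     # r takes the sign of the slope
print(r, r ** 2)  # approximately 0.7746 and 0.6 (r^2 matches the fit's r-squared)
```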
Homework
• Study the numerical example on pages 87-90.
There will be questions on the midterm exam
similar to the ones in this example.

• Data on page 88:


