
Wooldridge, Introductory Econometrics (3)


Motivation: Advantage
It can explain more of the variation in the dependent variable.
It can incorporate more general functional forms.
If the other factors affecting y are uncorrelated with x, then changing x does not change u, and the effect of x on y can be identified.
Multiple regression analysis is more amenable to ceteris paribus analysis because it allows us to explicitly control for many other factors that simultaneously affect the dependent variable.
Motivation: An Example
Consider a model that says family consumption is a quadratic function of family income:
cons = b0 + b1·inc + b2·inc² + u
Now the marginal propensity to consume is approximated by
MPC = b1 + 2·b2·inc
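As a quick numerical illustration (the coefficient and income values below are hypothetical, not estimates from the text), this approximation can be evaluated directly:

```python
# Hypothetical values only: b1, b2, and inc are not estimates from the text.
b1, b2 = 0.8, -0.002
inc = 50  # hypothetical family income

mpc = b1 + 2 * b2 * inc  # MPC = b1 + 2*b2*inc
print(f"approximate MPC at inc = {inc}: {mpc:.3f}")
```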
pcolGPA: predicted values of college grade point average
hsGPA: high school GPA
ACT: achievement test score
pcolGPA = 1.29 + 0.453·hsGPA + 0.0094·ACT
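For reference, here is a minimal sketch of how this regression could be reproduced in Python. It assumes the Wooldridge data file GPA1.dta is available locally and uses the conventional variable names colGPA, hsGPA, and ACT; the file path and column names are assumptions, not stated in this excerpt.

```python
# Minimal sketch of Example 3.4. Assumes GPA1.dta is available locally and
# contains the columns colGPA, hsGPA, ACT (standard Wooldridge names).
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_stata("GPA1.dta")                       # assumed local file path
fit = smf.ols("colGPA ~ hsGPA + ACT", data=df).fit()
print(fit.params)  # expected to be roughly 1.29, 0.453, 0.0094 as on the slide
```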
Δŷ = b̂1·Δx1; that is, each b̂ has a ceteris paribus interpretation.
Example 3.4: Determinants of College GPA (GPA1.dta)
Two-independent-variable regression
The multiple regression model is the most widely used vehicle for empirical analysis.
Motivation: An Example
Consider a simple version of the wage equation for estimating the effect of education on hourly wage:
wage = b0 + b1·educ + b2·exper + u
exper: years of labor market experience
In this example, experience is explicitly taken out of the error term.

Interpreting Multiple Regression
ŷ = b̂0 + b̂1x1 + b̂2x2 + … + b̂kxk, so Δŷ = b̂1Δx1 + b̂2Δx2 + … + b̂kΔxk;
holding x2, …, xk fixed therefore implies that Δŷ = b̂1Δx1.
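To make the ceteris paribus reading concrete, the sketch below uses made-up coefficient values for the wage equation (no estimates are given in this excerpt) and shows that raising educ by one year while holding exper fixed changes the prediction by exactly b̂1:

```python
# Hypothetical fitted wage equation; the coefficients below are made up
# for illustration and are not estimates from the text.
b0, b1, b2 = -3.0, 0.6, 0.07  # intercept, educ slope, exper slope

def wage_hat(educ, exper):
    """Predicted hourly wage from the hypothetical fitted equation."""
    return b0 + b1 * educ + b2 * exper

# Increase educ by one year while holding exper fixed:
delta = wage_hat(educ=13, exper=5) - wage_hat(educ=12, exper=5)
print(delta)  # 0.6 = b1, the ceteris paribus effect of educ (up to float rounding)
```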
(Below, r̂i1 denotes the residuals from the estimated regression of x1 on x2.)
A “Partialling Out” Interpretation
Regress our first independent variable x1 on our second independent variable x2, and then obtain the residuals r̂1.
Example: Determinants of College GPA
One-independent-variable regression
pcolGPA = 2.4 + 0.0271·ACT
The coefficient on ACT is almost three times as large as in the two-variable regression (0.0271 versus 0.0094).
If these two regressions were both valid, they could be viewed as the results of two different experiments.
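The simulation below (not part of the original example) illustrates the point: when x1 and x2 are correlated in the sample, the simple-regression slope on x1 differs from the multiple-regression slope, because it also picks up part of the omitted regressor's effect.

```python
# Simulated illustration (not from the text): with correlated regressors,
# the simple-regression slope on x1 differs from the multiple-regression slope.
import numpy as np

rng = np.random.default_rng(0)
n = 1000
x2 = rng.normal(size=n)
x1 = 0.8 * x2 + rng.normal(size=n)          # x1 correlated with x2
y = 1.0 + 2.0 * x1 + 3.0 * x2 + rng.normal(size=n)

# Multiple regression of y on (1, x1, x2)
X = np.column_stack([np.ones(n), x1, x2])
b_multiple = np.linalg.lstsq(X, y, rcond=None)[0]

# Simple regression of y on (1, x1) only
X1 = np.column_stack([np.ones(n), x1])
b_simple = np.linalg.lstsq(X1, y, rcond=None)[0]

print("multiple-regression slope on x1:", b_multiple[1])  # near 2.0
print("simple-regression slope on x1:  ", b_simple[1])    # noticeably larger
```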
A “Partialling Out” Interpretation
Consider the case where k = 2, i.e. ŷ = b̂0 + b̂1x1 + b̂2x2. Then
b̂1 = Σ r̂i1·yi / Σ r̂i1², where the r̂i1 are the residuals from regressing x1 on x2.
The Model with k Independent Variables
The general multiple linear regression model can be written as
yi = b0 + b1x1i + b2x2i + … + bkxki + ui
and then obtain the residuals r̂1. Then run a simple regression of y on r̂1 to obtain b̂1.
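As a check on this procedure, the following sketch simulates a small dataset (arbitrary values, not from the text) and verifies that the two-step "partialling out" regression reproduces the b̂1 from the full multiple regression:

```python
# Simulated check (arbitrary data, not from the text) that the two-step
# "partialling out" procedure reproduces the multiple-regression b1-hat.
import numpy as np

rng = np.random.default_rng(1)
n = 500
x2 = rng.normal(size=n)
x1 = 0.5 * x2 + rng.normal(size=n)
y = 1.0 + 2.0 * x1 - 1.0 * x2 + rng.normal(size=n)

# Full multiple regression of y on (1, x1, x2)
X = np.column_stack([np.ones(n), x1, x2])
b_full = np.linalg.lstsq(X, y, rcond=None)[0]

# Step 1: regress x1 on x2 (with an intercept) and keep the residuals r1
X2 = np.column_stack([np.ones(n), x2])
r1 = x1 - X2 @ np.linalg.lstsq(X2, x1, rcond=None)[0]

# Step 2: the simple regression of y on r1 gives the same slope as b_full[1]
b1_partial = (r1 @ y) / (r1 @ r1)

print(b_full[1], b1_partial)  # the two numbers agree up to rounding
```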
“Partialling Out” continued
Previous equation implies that regressing y on x1 and x2 gives the same effect of x1 as regressing y on the residuals from a regression of x1 on x2.
This means only the part of x1 that is uncorrelated with x2 is being related to y, so we are estimating the effect of x1 on y after x2 has been "partialled out".

Parallels with Simple Regression
b0 is still the intercept; b1 to bk are all called slope parameters.
u is still the error term (or disturbance).
Still need to make a zero conditional mean assumption, so now assume that E(u|x1, x2, …, xk) = 0.
Still minimizing the sum of squared residuals, so we have k + 1 first order conditions.
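For concreteness, those k + 1 first order conditions are the normal equations X'X·b = X'y; the sketch below (simulated data, not from the text) solves them directly:

```python
# Minimal sketch (simulated data): the k+1 first order conditions of OLS
# are the normal equations X'X b = X'y; solving them gives the estimates.
import numpy as np

rng = np.random.default_rng(2)
n, k = 200, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, k))])  # intercept + k regressors
beta_true = np.array([1.0, 0.5, -2.0, 0.3])
y = X @ beta_true + rng.normal(size=n)

b_hat = np.linalg.solve(X.T @ X, X.T @ y)  # solve the k+1 normal equations
print(b_hat)  # close to beta_true
```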
The power of multiple regression analysis is that it allows us to do in non-experimental environments what natural scientists are able to do in a controlled laboratory setting: keep other factors fixed.
Whether the estimated ceteris paribus effects are reliable depends on whether the zero conditional mean assumption is realistic.
Motivation: Advantage
The simple regression of y on x1 and the multiple regression of y on x1 and x2 give the same estimated coefficient on x1 when x1 and x2 are uncorrelated in the sample.
“Partialling Out” continued
In the general model with k explanatory variables, b̂1 can still be written as b̂1 = Σ r̂i1·yi / Σ r̂i1², but the residuals r̂i1 now come from the regression of x1 on x2, …, xk.
3. Multiple Regression Analysis: Estimation
yi = b0 + b1x1i + b2x2i + … + bkxki + ui