Multiple Linear Regression Model: Hypothesis Testing
(β̂j − βj) / se(β̂j) has a t rather than a normal distribution because we have to estimate σ² by σ̂². Note the degrees of freedom: n − k − 1
The t Test (cont)
Knowing the sampling distribution for the standardized estimator allows us to carry out hypothesis tests. Start with a null hypothesis
Normal Sampling Distributions (cont)
β̂j is distributed normally because it is a linear combination of the errors
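A minimal Monte Carlo sketch (simulated data, hypothetical parameter values) of this point: across repeated samples with normal errors, the OLS slope estimate itself varies normally around the true βj.

```python
import numpy as np

rng = np.random.default_rng(0)
n, reps = 200, 5000
beta0, beta1, sigma = 1.0, 0.5, 2.0   # hypothetical population values

x = rng.uniform(0, 10, size=n)        # regressor values held fixed across replications
slopes = np.empty(reps)
for r in range(reps):
    u = rng.normal(0, sigma, size=n)  # CLM error: Normal(0, sigma^2)
    y = beta0 + beta1 * x + u
    # OLS slope = sample covariance(x, y) / sample variance(x)
    slopes[r] = np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)

print(slopes.mean())       # close to beta1 = 0.5 (unbiasedness)
print(slopes.std(ddof=1))  # close to sd(beta1_hat) = sigma / sqrt(sum((x - x.mean())**2))
```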
The t Test
Under the CLM assumptions,
(β̂j − βj) / se(β̂j) ~ t(n − k − 1)
Note this is a t distribution (vs normal)
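As a quick illustration (hypothetical n and k), the t critical values with n − k − 1 degrees of freedom exceed the corresponding standard normal values in small samples:

```python
from scipy.stats import t, norm

n, k = 30, 3              # hypothetical sample size and number of regressors
df = n - k - 1            # degrees of freedom for the t distribution

# 97.5th percentiles: the t critical value is larger than the normal one
print(t.ppf(0.975, df))   # ≈ 2.06 with df = 26
print(norm.ppf(0.975))    # ≈ 1.96
```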
H1: βj > 0 and H1: βj < 0 are one-sided; H1: βj ≠ 0 is a two-sided alternative
If we want to have only a 5% probability of rejecting H0 if it is really true, then we say our significance level is 5%
We can summarize the population assumptions of the CLM as follows: y | x ~ Normal(β0 + β1x1 + … + βkxk, σ²)
While for now we just assume normality, it is clear that this is sometimes not the case; large samples will let us drop the normality assumption
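A small sketch (hypothetical β's and σ) of what this assumption says at one point in x-space: y varies around the conditional mean β0 + β1x1 + β2x2 with the same variance σ² at every x.

```python
import numpy as np

rng = np.random.default_rng(1)
beta0, beta1, beta2, sigma = 2.0, 1.5, -0.8, 3.0   # hypothetical population values
x1, x2 = 4.0, 10.0                                  # one fixed point in x-space

# Draw many y's at this x: y | x ~ Normal(beta0 + beta1*x1 + beta2*x2, sigma^2)
y = rng.normal(beta0 + beta1 * x1 + beta2 * x2, sigma, size=100_000)

print(y.mean())        # ≈ 2 + 1.5*4 - 0.8*10 = 0.0  (the conditional mean)
print(y.var(ddof=1))   # ≈ sigma^2 = 9.0              (same at every x: homoskedasticity)
```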
The homoskedastic normal distribution with a single explanatory variable
[Figure: normal densities f(y|x) with equal variance centered on the line E(y|x) = β0 + β1x, drawn at x1 and x2]
Normal Sampling Distributions
The error u has zero mean and variance σ²: u ~ Normal(0, σ²)
CLM Assumptions (cont)
Under CLM, OLS is not only BLUE, but is also the minimum variance unbiased estimator
Under the CLM assumptions, conditional on the sample values of the independent variables,
β̂j ~ Normal(βj, Var(β̂j)), so that
(β̂j − βj) / sd(β̂j) ~ Normal(0, 1)
One-Sided Alternatives (cont)
Having picked a significance level, α, we look up the (1 − α)th percentile in a t distribution with n − k − 1 df and call this c, the critical value. We can reject the null hypothesis if the t statistic is greater than the critical value. If the t statistic is less than the critical value, then we fail to reject the null
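A short sketch of this rejection rule for the one-sided alternative H1: βj > 0, using a hypothetical significance level, sample size, and t statistic:

```python
from scipy.stats import t

alpha = 0.05                 # chosen significance level
n, k = 120, 4                # hypothetical sample size and number of regressors
df = n - k - 1

c = t.ppf(1 - alpha, df)     # critical value: the (1 - alpha)th percentile of t with df
t_stat = 1.85                # hypothetical t statistic for beta_j_hat

# One-sided test of H0: beta_j = 0 against H1: beta_j > 0
print(c)                     # ≈ 1.66 for df = 115
print("reject H0" if t_stat > c else "fail to reject H0")
```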
"the"t statistic for bˆj : tbˆ j bˆ j se bˆ j
We will then use our t statistic along with a rejection rule to determine whether to accept thenull hypothesis, H0
For example, H0: βj = 0
If we accept the null, then we accept that xj has no effect on y, controlling for the other x's
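A runnable sketch on simulated data (hypothetical coefficients; x2 has a true coefficient of zero) showing that the reported t statistics are just the coefficient estimates divided by their standard errors; statsmodels is used here purely for illustration.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 200
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 1.0 + 0.5 * x1 + 0.0 * x2 + rng.normal(0, 1, size=n)   # x2 truly has no effect

X = sm.add_constant(np.column_stack([x1, x2]))
res = sm.OLS(y, X).fit()

# The t statistic for each coefficient is the estimate divided by its standard error
print(res.params / res.bse)   # matches res.tvalues
print(res.tvalues)
```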
The t Test (cont)
To perform our test we first need to form the t statistic for β̂j
t Test: One-Sided Alternatives
Besides our null, H0, we need an alternative hypothesis, H1, and a significance level. H1 may be one-sided or two-sided
Multiple Regression Analysis
y = β0 + β1x1 + β2x2 + … + βkxk + u
2. Inference
Assumptions of the Classical Linear Model (CLM)
So far, we know that given the Gauss-Markov assumptions, OLS is BLUE. In order to do classical hypothesis testing, we need to add another assumption (beyond the Gauss-Markov assumptions). Assume that u is independent of x1, x2, …, xk and that u is normally distributed with zero mean and variance σ²: u ~ Normal(0, σ²)
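One informal way to probe this added normality assumption is to look at the OLS residuals. A minimal sketch on simulated data (hypothetical values), using a Shapiro–Wilk test as the check:

```python
import numpy as np
import statsmodels.api as sm
from scipy import stats

rng = np.random.default_rng(3)
n = 500
x = rng.uniform(0, 5, size=n)
y = 2.0 + 0.7 * x + rng.normal(0, 1.5, size=n)   # errors drawn as Normal in this simulation

res = sm.OLS(y, sm.add_constant(x)).fit()

# Informal check of the normality assumption via the OLS residuals
w, pvalue = stats.shapiro(res.resid)
print(pvalue)   # a large p-value is consistent with normal errors, as used here
```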