

Description

1. (2 points) In Lecture 12, we showed that, using the geometric property of simple regression, we could derive the least-squares estimator for $\beta_1$ by solving $x^{*\prime} e = 0$, where $x^*$ is in mean-deviation form and $e$ denotes the residual vector. Take this approach and derive the least-squares estimator for $\beta$ in the multiple regression $y = X\beta + \varepsilon$.
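For orientation, a minimal sketch of where this argument leads, assuming $X$ has full column rank so that $X'X$ is invertible (the multiple-regression analogue of the lecture's $x^{*\prime} e = 0$ condition replaces $x^*$ with the full design matrix $X$):

    % Sketch: the residuals are orthogonal to every column of X,
    % assuming X has full column rank so that X'X is invertible.
    \[
      X'e = X'(y - X\hat{\beta}) = 0
      \;\Longrightarrow\; X'X\hat{\beta} = X'y
      \;\Longrightarrow\; \hat{\beta} = (X'X)^{-1} X'y
    \]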


2. (2 points) Consider a simple linear model in mean-deviation form. We have $x^* = x - \bar{x}$ and $y^* = y - \bar{y}$. Then $\hat{y}^* = \hat{\beta}_1 x^*$. Prove
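As a quick numerical illustration of the identities stated above, a sketch with simulated data (all variable names are illustrative, not from the assignment):

    import numpy as np

    # Simulated data for illustration only.
    rng = np.random.default_rng(1)
    x = rng.normal(size=40)
    y = 2.0 + 1.5 * x + rng.normal(size=40)

    # Least-squares slope and intercept for y = b0 + b1*x.
    x_star, y_star = x - x.mean(), y - y.mean()
    b1 = np.sum(x_star * y_star) / np.sum(x_star ** 2)
    b0 = y.mean() - b1 * x.mean()

    # Fitted values in mean-deviation form satisfy yhat* = b1 * x*.
    yhat = b0 + b1 * x
    print(np.allclose(yhat - y.mean(), b1 * x_star))  # True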


3. (2 points) Exercise 10.7 in the textbook. Use the Prestige data.
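The exercise statement itself is in the textbook; purely as a hypothetical starting point, the Prestige data can be loaded in Python through statsmodels' Rdatasets interface, assuming an internet connection (the data ship with R's carData package):

    import statsmodels.api as sm

    # The Prestige data ships with R's carData package; get_rdataset
    # fetches it from the Rdatasets mirror (requires internet access).
    prestige = sm.datasets.get_rdataset("Prestige", "carData").data
    print(prestige.head())
    print(prestige.columns.tolist())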


4. (3 points) Consider the model $y_i = \beta_0 + \beta_1 x_{i1} + \beta_2 x_{i2} + \varepsilon_i$. Recall that the matrix $V_{11}$ is defined as the square submatrix consisting of the entries in the $q$ rows and $q$ columns of $(X'X)^{-1}$ that pertain to the coefficients in $\hat{\boldsymbol{\beta}}_1 = [\hat{\beta}_1, \cdots, \hat{\beta}_q]'$ (see Equation 9.16 on page 218 of the textbook). Show that $V_{11}^{-1}$ for the slope coefficients $\beta_1$ and $\beta_2$ contains the mean-deviation sums of squares and products for the explanatory variables; that is,

\[
  V_{11}^{-1} =
  \begin{bmatrix}
    \sum_i (x_{i1} - \bar{x}_1)^2 & \sum_i (x_{i1} - \bar{x}_1)(x_{i2} - \bar{x}_2) \\
    \sum_i (x_{i1} - \bar{x}_1)(x_{i2} - \bar{x}_2) & \sum_i (x_{i2} - \bar{x}_2)^2
  \end{bmatrix}.
\]
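A numerical sanity check of this identity, sketched with a simulated design (variable names are illustrative): inverting the slope block of $(X'X)^{-1}$ should recover the centered cross-product matrix.

    import numpy as np

    # Simulated design with an intercept and two regressors.
    rng = np.random.default_rng(0)
    n = 50
    x1, x2 = rng.normal(size=n), rng.normal(size=n)
    X = np.column_stack([np.ones(n), x1, x2])

    V = np.linalg.inv(X.T @ X)   # (X'X)^{-1}
    V11 = V[1:, 1:]              # submatrix for the slope coefficients
    lhs = np.linalg.inv(V11)     # V11^{-1}

    # Mean-deviation sums of squares and products.
    Xc = np.column_stack([x1 - x1.mean(), x2 - x2.mean()])
    rhs = Xc.T @ Xc
    print(np.allclose(lhs, rhs))  # True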

Instruction file: HW3.pdf (99.1 KB)
