STA258 Lecture 20
- Regression Analysis.
- Example:
- Suppose you have a variable $x$, which is a predictor.
- $y$ is your response variable.
- Suppose at $x_1$ we have a response $y_1$, at $x_2$ we have a response $y_2$, etc.
- Regression wants us to find a line of best fit.
- Let us assume a set of $n$ observed points $(x_1, y_1), \dots, (x_n, y_n)$ is given.
- Assume the predictors and response variables are real-valued.
- What's the line of best fit? We write it as $\hat{y} = \hat{\beta}_0 + \hat{\beta}_1 x$.
- If we have $n$ observations, what are the optimal values of $\beta_0$ and $\beta_1$?
- Estimates are: $\hat{\beta}_1 = \frac{\sum_{i=1}^n (x_i - \bar{x})(y_i - \bar{y})}{\sum_{i=1}^n (x_i - \bar{x})^2}$ and $\hat{\beta}_0 = \bar{y} - \hat{\beta}_1 \bar{x}$.
- Ordinary least squares (OLS) is the most common method to find the line of best fit.
- Minimize the sum of squared residuals:
- $S(\beta_0, \beta_1) = \sum_{i=1}^n (y_i - \beta_0 - \beta_1 x_i)^2$
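As a sketch (the function and variable names are my own, and the data is made up), the sum of squared residuals can be computed directly for any candidate line:

```python
# Sketch: compute the sum of squared residuals S(b0, b1) for a candidate line.
# Data and names here are illustrative, not from the lecture.

def sum_squared_residuals(x, y, b0, b1):
    """S(b0, b1) = sum over i of (y_i - b0 - b1*x_i)^2."""
    return sum((yi - b0 - b1 * xi) ** 2 for xi, yi in zip(x, y))

x = [1.0, 2.0, 3.0, 4.0]
y = [2.1, 3.9, 6.2, 7.8]

# A line closer to the data gives a smaller S.
print(sum_squared_residuals(x, y, 0.0, 2.0))  # near-fit line: small S
print(sum_squared_residuals(x, y, 0.0, 0.0))  # flat line at 0: large S
```

OLS chooses the $(\beta_0, \beta_1)$ that makes this quantity as small as possible.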
- We need to differentiate $S$ with respect to $\beta_0$ and $\beta_1$ and set each derivative to zero:
- $\frac{\partial S}{\partial \beta_0} = -2\sum_{i=1}^n (y_i - \beta_0 - \beta_1 x_i) = 0$ and $\frac{\partial S}{\partial \beta_1} = -2\sum_{i=1}^n x_i (y_i - \beta_0 - \beta_1 x_i) = 0$
- So first we need to find $\hat{\beta}_1$. Then we can estimate $\hat{\beta}_0$.
- We know that the first equation gives $\hat{\beta}_0 = \bar{y} - \hat{\beta}_1 \bar{x}$.
- We can get that by substituting this into the second equation, which leaves only $\beta_1$ unknown.
- So $\hat{\beta}_1 = \frac{\sum_{i=1}^n (x_i - \bar{x})(y_i - \bar{y})}{\sum_{i=1}^n (x_i - \bar{x})^2}$.
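The closed-form estimates can be sketched in plain Python (the data below is invented for illustration):

```python
# Sketch: closed-form OLS estimates for simple linear regression.
# The data is made up for illustration.
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.0, 4.1, 5.9, 8.2, 9.8]

n = len(x)
x_bar = sum(x) / n
y_bar = sum(y) / n

# beta1_hat = sum((x_i - x_bar)(y_i - y_bar)) / sum((x_i - x_bar)^2)
s_xy = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))
s_xx = sum((xi - x_bar) ** 2 for xi in x)
beta1_hat = s_xy / s_xx

# beta0_hat = y_bar - beta1_hat * x_bar
beta0_hat = y_bar - beta1_hat * x_bar

print(beta0_hat, beta1_hat)
```

For this data $\bar{x} = 3$, $\bar{y} = 6$, so the slope is $19.7 / 10 = 1.97$ and the intercept is $6 - 1.97 \cdot 3 = 0.09$.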
- We can also do this with matrices, using the Hessian matrix to check the second-order condition.
- Use some elementary results from MAT223 and matrix calculus.
- Example:
- We have $y_i = \beta_0 + \beta_1 x_{i1} + \dots + \beta_p x_{ip} + \varepsilon_i$ for $i = 1, \dots, n$.
- We have $p$ predictors.
- $X$ is an $n \times (p+1)$ design matrix. This has all observations, with a leading column of 1s for the intercept.
- This means
- $y = X\beta + \varepsilon$
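As a sketch of what the design matrix looks like (the predictor values are invented), each row is one observation and the first column is all 1s for the intercept:

```python
import numpy as np

# Sketch: build a design matrix X for n = 4 observations and p = 2 predictors.
# The predictor values are invented for illustration.
x1 = np.array([1.0, 2.0, 3.0, 4.0])
x2 = np.array([0.5, 1.5, 2.5, 3.5])

# Leading column of 1s corresponds to the intercept beta_0.
X = np.column_stack([np.ones_like(x1), x1, x2])
print(X.shape)  # (4, 3): n rows, p + 1 columns
```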
- We want to minimize $S(\beta) = (y - X\beta)^\top (y - X\beta)$ with respect to $\beta$.
- $S(\beta) = y^\top y - \beta^\top X^\top y - y^\top X\beta + \beta^\top X^\top X \beta$
- $= y^\top y - 2\beta^\top X^\top y + \beta^\top X^\top X \beta$
- $\beta^\top X^\top y$ is a scalar. So it's equal to its transpose $y^\top X\beta$.
- Now we need to differentiate scalars with respect to vectors.
- We'll use two identities from matrix calculus.
- $\frac{\partial}{\partial \beta}\, a^\top \beta = a$, and if $A$ is symmetric then $\frac{\partial}{\partial \beta}\, \beta^\top A \beta = 2A\beta$.
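Both identities can be checked numerically with finite differences (a sketch; the vector and matrix values are arbitrary, and `num_grad` is a helper of my own):

```python
import numpy as np

rng = np.random.default_rng(0)
p = 3
a = rng.normal(size=p)
M = rng.normal(size=(p, p))
A = M + M.T            # make A symmetric
beta = rng.normal(size=p)

def num_grad(f, b, h=1e-6):
    """Central-difference gradient of a scalar function f at b."""
    g = np.zeros_like(b)
    for j in range(len(b)):
        e = np.zeros_like(b)
        e[j] = h
        g[j] = (f(b + e) - f(b - e)) / (2 * h)
    return g

# d/d(beta) of a^T beta should be a.
g1 = num_grad(lambda b: a @ b, beta)
# d/d(beta) of beta^T A beta should be 2 A beta (A symmetric).
g2 = num_grad(lambda b: b @ A @ b, beta)

print(np.allclose(g1, a, atol=1e-5))
print(np.allclose(g2, 2 * A @ beta, atol=1e-4))
```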
- So $\frac{\partial S}{\partial \beta} = -2X^\top y + 2X^\top X\beta = 0$.
- To isolate $\hat{\beta}$: $X^\top X \hat{\beta} = X^\top y$, so $\hat{\beta} = (X^\top X)^{-1} X^\top y$.
- Same estimates as before, but now we can do this with more predictors.
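Putting it together, $\hat{\beta} = (X^\top X)^{-1} X^\top y$ can be sketched with NumPy (the data is simulated for illustration; solving the normal equations with `np.linalg.solve` avoids forming the inverse explicitly):

```python
import numpy as np

# Sketch: OLS via the normal equations. Data is simulated for illustration.
rng = np.random.default_rng(1)
n, p = 50, 2
X = np.column_stack([np.ones(n), rng.normal(size=(n, p))])  # intercept + p predictors
beta_true = np.array([1.0, 2.0, -0.5])
y = X @ beta_true + 0.1 * rng.normal(size=n)

# Solve X^T X beta = X^T y instead of inverting X^T X.
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
print(beta_hat)  # close to beta_true
```

The same fit is what `np.linalg.lstsq(X, y, rcond=None)` returns, and with a single predictor it reduces to the $\hat{\beta}_0$, $\hat{\beta}_1$ formulas above.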