TRANSCRIPT
-
Lecture 1 Psychology 791
Review of Regression and the General Linear Model
-and-
Building a Regression Model I: Model Selection and Validation
January 25, 2007
-
Overview: Today's Lecture

General Linear Model
Introductory Example
Multiple Regression
Model Building/Selection Process
Data Collection
Data Preparation
Preliminary Model Investigation
Wrapping Up
Today's Lecture

A review of multiple regression and the general linear model.

A bit of Chapter 9: the practical side of model fitting.
-
General Linear Model
-
Regression Analysis with Matrices

Recall the simple regression equation (for the ith observation, prediction of Yi by k predictor variables Xik; here k = 1):

Yi = b0 + b1Xi1 + ei

The equation above can be expressed more compactly by a set of matrices:

Y = Xb + e

Y is of size (N × 1).
X is of size (N × (1 + k)).
b is of size ((1 + k) × 1).
e is of size (N × 1).
-
Regression with Matrices

⎡ Y1 ⎤   ⎡ 1  X11 ⎤            ⎡ e1 ⎤
⎢ Y2 ⎥   ⎢ 1  X21 ⎥   ⎡ b0 ⎤   ⎢ e2 ⎥
⎢ Y3 ⎥ = ⎢ 1  X31 ⎥   ⎣ b1 ⎦ + ⎢ e3 ⎥
⎢ ⋮  ⎥   ⎢ ⋮   ⋮  ⎥            ⎢ ⋮  ⎥
⎣ YN ⎦   ⎣ 1  XN1 ⎦            ⎣ eN ⎦

   Y     =       X            b        +     e
(N × 1)    (N × (1 + k))  ((1 + k) × 1)   (N × 1)
-
Regression with Matrices

Working the matrix multiplication and addition for a single case gives:

Y1 = b0 + b1X11 + e1
-
Regression with Matrices

Note that almost everything is really straightforward in terms of matrix algebra.

The matrix of predictors, X, has a first column containing all ones.

This column represents the intercept parameter, b0.

This is also an introduction to setting columns of the X matrix to represent design and/or group effects (as in ANOVA).
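To make that idea concrete, here is a small numpy sketch (not from the lecture; the data are invented) of a design matrix whose columns encode group membership alongside the column of ones for the intercept:

```python
import numpy as np

# Hypothetical sketch: a design matrix for 6 cases in 3 groups.
# Column 1 is all ones (the intercept); columns 2-3 are 0/1 group
# indicators, the ANOVA-style coding of the X matrix described above.
group = np.array([1, 1, 2, 2, 3, 3])
X = np.column_stack([
    np.ones(6),                   # intercept column
    (group == 2).astype(float),   # indicator for group 2
    (group == 3).astype(float),   # indicator for group 3
])
print(X)
```

With this coding, group 1 is the reference category: its predicted value is carried entirely by the intercept column.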
-
Distribution of Errors

Recall from previous classes that we often place distributional assumptions on our error terms, allowing for the development of tractable hypothesis tests.

With matrices, the distributional assumptions are no different, except that things are approached in a multivariate fashion:

e ~ N_N(0, σe² I_N)

Having a multivariate normal distribution with uncorrelated variables (from I_N) is identical to saying:

ei ~ N(0, σe²)

for all i observations.
-
Regression Estimation with Matrices

Regression estimates are typically found via least squares (called L2 estimates).

In least squares regression, the estimates are found by minimizing:

Σ_{i=1}^{N} ei² = Σ_{i=1}^{N} (Yi − Ŷi)² = Σ_{i=1}^{N} (Yi − (b0 + b1Xi1 + ... + bkXik))²

As you could guess, we could accomplish all of this via matrices. Equivalently:

Σ_{i=1}^{N} ei² = Σ_{i=1}^{N} (Yi − xi b)² = (Y − Xb)′(Y − Xb) = e′e
-
Regression Estimation with Matrices

Thankfully, there are people who have figured out the equation for b that minimizes e′e:

b = (X′X)⁻¹X′Y

with sizes: b is ((k + 1) × 1); X′ is ((k + 1) × N); X is (N × (k + 1)); Y is (N × 1).

This equation is what I have been talking about for quite some time: the General Linear Model.

For many types of data, in many differing analyses, this equation will provide estimates:

Multiple Regression
ANOVA
Analysis of Covariance (ANCOVA)
Multiple Regression with curvilinear relationships in X
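To see the formula in action, here is a minimal numpy sketch on simulated data (the numbers are invented, with true intercept 2 and slope 3; this is not the lecture's data):

```python
import numpy as np

# Sketch: compute b = (X'X)^{-1} X'Y for simple regression.
rng = np.random.default_rng(1)
N = 200
x = rng.uniform(0, 10, N)
y = 2.0 + 3.0 * x + rng.normal(0.0, 1.0, N)

X = np.column_stack([np.ones(N), x])    # (N x 2): ones column, then x
b = np.linalg.solve(X.T @ X, X.T @ y)   # (X'X)^{-1} X'Y, without forming the inverse
print(b)                                # close to [2, 3]
```

Solving the normal equations with `np.linalg.solve` rather than explicitly inverting X′X is the numerically safer route; the answer is the same when X′X is non-singular.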
-
Introductory Example
-
Call for Data

The examples in this class are taken from various textbooks used for linear models.

Books are written very generally, with broad audiences in mind.

Psych examples are usually hard to find.

So here is my offer to you: if you have data from a lab/thesis/whatever, I would be happy to use it as an example during class. It may make things a bit more relevant.

Please help if you can...
-
Today's Example Data Set

From Weisberg (1985, p. 240):

"Property taxes on a house are supposedly dependent on the current market value of the house. Since houses actually sell only rarely, the sale price of each house must be estimated every year when property taxes are set. Regression methods are sometimes used to make up a prediction function."
-
Erie, Pennsylvania

We have data for 27 houses sold in the mid 1970s in Erie, Pennsylvania:

X1: Current taxes (local, school, and county), in hundreds of dollars.
X2: Number of bathrooms.
X3: Lot size, in thousands of square feet.
X4: Living space, in thousands of square feet.
X5: Number of garage spaces.
X6: Number of rooms.
X7: Number of bedrooms.
X8: Age of house (years).
X9: Number of fireplaces.
Y: Actual sale price, in thousands of dollars.
-
Erie, Pennsylvania

[Figure slides: map of Erie, Pennsylvania, on Lake Erie; photos labeled Jack and Paula.]
-
Multiple Regression
-
Regression Analysis

Recall the simple regression equation (for the ith observation, prediction of Yi from a single predictor Xi1):

Yi = β0 + β1Xi1 + εi

The equation above can be expressed more compactly by a set of matrices:

Y = Xβ + ε

Y is of size (n × 1).
X is of size (n × 2).
β is of size (2 × 1).
ε is of size (n × 1).
-
Multiple Regression: Adding a Parameter

Now the multiple regression equation (for the ith observation, prediction of Yi by p − 1 variables Xik, where k = 1, ..., p − 1):

Yi = β0 + β1Xi1 + β2Xi2 + ... + βp−1Xi,p−1 + εi

The equation above can be expressed more compactly by a set of matrices:

Y = Xβ + ε

Y is of size (n × 1).
X is of size (n × p).
β is of size (p × 1).
ε is of size (n × 1).
-
Multiple Regression with Matrices

⎡ Y1 ⎤   ⎡ 1  X11  X12  ...  X1,p−1 ⎤ ⎡ β0   ⎤   ⎡ ε1 ⎤
⎢ Y2 ⎥   ⎢ 1  X21  X22  ...  X2,p−1 ⎥ ⎢ β1   ⎥   ⎢ ε2 ⎥
⎢ Y3 ⎥ = ⎢ 1  X31  X32  ...  X3,p−1 ⎥ ⎢ β2   ⎥ + ⎢ ε3 ⎥
⎢ ⋮  ⎥   ⎢ ⋮    ⋮    ⋮   ⋱     ⋮   ⎥ ⎢ ⋮    ⎥   ⎢ ⋮  ⎥
⎣ Yn ⎦   ⎣ 1  Xn1  Xn2  ...  Xn,p−1 ⎦ ⎣ βp−1 ⎦   ⎣ εn ⎦

   Y     =      X         β      +     ε
(n × 1)     (n × p)    (p × 1)     (n × 1)
-
Regression with Matrices

Working the matrix multiplication and addition for a single case gives:

Y1 = β0 + β1X11 + β2X12 + ... + βp−1X1,p−1 + ε1
-
Multiple Regression: Adding a Parameter

Let's focus on the multiple regression model with two parameters, or two independent variables:

Yi = β0 + β1Xi1 + β2Xi2 + εi

What are some differences now that we move to multiple regression?

Now that we have a 3-dimensional space (the number of dimensions is equal to the number of variables), we are no longer plotting a line. (Note: at higher dimensions we are dealing with a hyperplane, a plane of more than two dimensions.)

Our model above now predicts our responses on a geometric plane, as opposed to simple regression, which used a prediction line.

Does that mean our interpretation of our regression parameters (β0, β1, β2) is different?
-
Interpreting β

Interpretation of the regression weights is almost the same.

βk still represents the change in Y for each unit increase in the Xk variable, assuming all other variables are held constant.

However, it is not the slope anymore, since we are dealing with a plane, not a line.

Also, this change in Y only occurs when the other X variables are held constant.

So, our interpretation of β1 is as follows: the amount of increase in Y that occurs for each unit increase in X1, when all other variables X2, ..., Xk are held constant.

So each individual relationship between Y and each X is thought of linearly; however, the joint relationship is not.
-
Estimation of b

How do we get model estimates?

Ok, everyone say it with me now:

b = (X′X)⁻¹X′Y

Yep, same formula. Same result.

Happiness is a non-singular matrix.
-
Estimation of Additional Matrices

All of the estimation procedures and formulas are the same ones we went over last time, so I won't repeat myself. For example:

Fitted values, Ŷ
Residuals, e
SSTO
SSE
SSR
-
Estimation Using SAS: Input

Using our house data, we would like to predict the actual sale price of a house using two variables:

1. Living space (in square feet).
2. Age of house (in years).

The SAS syntax is very straightforward for estimating a multiple regression:

proc glm data=house;
  model saleprice = livingsize age;
run;
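For readers without SAS, the same two-predictor fit can be sketched in numpy. The data below are simulated stand-ins: the names livingsize and age mirror the SAS model statement, but the numbers are invented, not the 27 Erie houses.

```python
import numpy as np

# Sketch: saleprice ~ livingsize + age, with invented data standing in
# for the house data set.
rng = np.random.default_rng(2)
n = 27
livingsize = rng.uniform(0.8, 2.5, n)   # living space (thousands of sq ft)
age = rng.uniform(5.0, 60.0, n)         # age of house (years)
saleprice = 9.0 + 23.0 * livingsize - 0.1 * age + rng.normal(0.0, 2.0, n)

X = np.column_stack([np.ones(n), livingsize, age])
b, *_ = np.linalg.lstsq(X, saleprice, rcond=None)   # least-squares fit
print(b)    # [intercept, livingsize slope, age slope]
```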
-
Estimation Using SAS: Output

The SAS System    12:56 Monday, November 13, 2006

The GLM Procedure
Dependent Variable: saleprice

Source        DF    Sum of Squares    Mean Square    F Value
Model          2       4707.083338    2353.541669      91.81

Source        DF         Type I SS    Mean Square    F Value
livingsize     1       4591.667413    4591.667413     179.12

Source        DF       Type III SS    Mean Square    F Value
livingsize     1       4194.650765    4194.650765     163.63

Parameter          Estimate     Standard Error    t Value    Pr > |t|
Intercept        9.09682489         4.22266618       2.15      0.0415
livingsize      23.12153669         1.80752483      12.79
-
F Test for Regression

We are now going to test for a significant regression relationship.

Our null hypothesis is:

H0: β1 = β2 = ... = βp−1 = 0

Our alternative hypothesis is:

HA: not all βk equal 0

Notice the distinction: we are testing them all at the same time. We can tell at least one is different from 0, but we don't know which one.

You then have to perform a t-test on each individual βk to determine which ones differ significantly from 0.
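The overall F statistic behind this test can be computed by hand as F = [SSR/(p − 1)] / [SSE/(n − p)]; here is a numpy sketch with simulated data (not from the lecture):

```python
import numpy as np

# Sketch (simulated data): overall F test of H0: beta_1 = beta_2 = 0
# in a model with p = 3 parameters (intercept plus two slopes).
rng = np.random.default_rng(3)
n, p = 40, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
y = X @ np.array([1.0, 2.0, -1.5]) + rng.normal(size=n)

b = np.linalg.solve(X.T @ X, X.T @ y)
yhat = X @ b
sse = np.sum((y - yhat) ** 2)          # error sum of squares
ssr = np.sum((yhat - y.mean()) ** 2)   # regression sum of squares
F = (ssr / (p - 1)) / (sse / (n - p))
print(F)    # a large F: at least one beta_k differs from 0
```

Under H0 this statistic follows an F distribution with (p − 1, n − p) degrees of freedom, which is what the "F Value" column in the SAS output reports.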
-
Coefficient of Multiple Determination

In simple regression, we called R² the coefficient of determination; now we just rename it the coefficient of multiple determination.

Good news: it still means the same thing. It is the amount of reduction of the total variance of Y accounted for by (now) all of our X variables in the model.
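As a numpy sketch with simulated data (again, not the house data), R² falls straight out of the sums of squares as SSR/SSTO, or equivalently 1 − SSE/SSTO:

```python
import numpy as np

# Sketch: R^2 = 1 - SSE/SSTO for a two-predictor fit on simulated data.
rng = np.random.default_rng(4)
n = 30
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
y = X @ np.array([0.5, 1.0, 2.0]) + rng.normal(size=n)

b = np.linalg.solve(X.T @ X, X.T @ y)
resid = y - X @ b
ssto = np.sum((y - y.mean()) ** 2)   # total sum of squares
sse = np.sum(resid ** 2)             # error sum of squares
r2 = 1.0 - sse / ssto
print(r2)    # between 0 and 1; high here, since Y depends strongly on X
```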
-
Model Building/Selection Process
-
The Path to Finding a Model

Up until now, we have discussed various regression models that can be applied to data.

One thing that we have failed to discuss is which model is the "best" model.

How do you choose which model is the most appropriate for the data?

We will discuss the idea of model selection and validation.
-
Process
Model Building can be thought of as a 3 (or 4) step process:
1. Data collection and preparation.
2. Reduction of explanatory or predictor variables.
3. Model refinement and selection.
4. Model validation.
-
Building a Regression Model
-
Data Collection
-
Step 1: Data Collection

The data collection process really stems from the design of the research study.

Four types of research designs are outlined in the book:

Controlled Experiments.
Controlled Experiments with Covariates.
Confirmatory Observational Studies.
Exploratory Observational Studies.
-
Controlled Experiments

In a controlled experiment, the experimenter controls the levels of the explanatory variables and assigns treatments.

With control over the experimental variables, data collection simply stems from collecting observations from the treatment conditions.

Example: Subjects are randomly assigned to four treatment groups to determine which condition is best for weight loss.
-
Controlled Experiments with Covariates

This type of design incorporates the idea above, in that the experimenter controls the levels of the explanatory variable.

There are, however, other uncontrolled variables, or covariates.

Example: Subjects are randomly assigned to four treatment groups to determine which condition is best for weight loss. However, the researchers also believe that age and gender might be factors in weight loss.
-
Confirmatory Observational Studies

Studies that are observational in nature and designed to test specific hypotheses.

The explanatory variables cannot be controlled and are simply observed.

Example: Previous research has shown that high stress levels lead to weight gain. The researchers are unable to control the stress levels of the subjects, so they simply measure stress and see how it relates to weight gain.
-
Exploratory Observational Studies

These designs arise when uncontrolled variables are measured in an observational setting with no specific hypotheses.

Basically, a bunch of variables are collected, and the researchers want to see which ones have the most effect on the response variable.

Example: Researchers are interested in the stability of weight over time. They are unsure of which variables are most important, so they gather a bunch of information they think might be predictive: for example, gender, amount of exercise, diet, etc.
-
Data Preparation

Each of these four research designs outlines the way in which the data are collected.

Once the data are collected and put into a data file, they should be checked for errors: extreme outliers that may be input errors, measurement errors, etc.
-
Model Investigation

Once you are comfortable that the data are correct, you can begin the analyses.

During this time, you look for any clues that the data might give you as to the nature of the relationship.

Check the scatter plots to determine the strength and nature of the relationship.

Check residuals to determine the functional form of the relationship (linear, non-linear, etc.).

Check relationships between independent variables to determine if some interaction may exist.

This step should involve a combination of prior research knowledge and brute force.
-
Final Thought

Today we reviewed regression and the general linear model.

Linear models are the basis for many statistical tests, so we are going to get really familiar with their properties.

Data analysis is a process, not a simple solution.

As you work through the model building process, be aware of the data and what it is trying to tell you.
-
Next Time

More from Chapter 9.

Model selection criteria.

Methods for searching for variables that have an effect on the response.

Good clean fun.