3.2 Least Squares Regression Line
Regression Line
• Describes how a response variable changes as an explanatory variable changes
• Formula sheet: ŷ = b₀ + b₁x, where b₁ = r(s_y / s_x) and b₀ = ȳ − b₁x̄
• Calculator version: ŷ = a + bx
Slope
• Formula Sheet
• Interpretation: the predicted change in the response variable for each one-unit increase in the explanatory variable
Y-Intercept
• Formula Sheet
• Interpretation: the predicted value of the response variable when the explanatory variable is zero
• Mathematically - needed!
• Realistically - might not make sense!
• Sometimes it doesn't make sense for the explanatory variable to be zero
Interpret the slope and the y-intercept from the given least squares regression line in the context of the problem. Determine whether the y-intercept is realistic for this problem, and explain.
(I will write the equation on the board)
Extrapolation
• Using a regression line to predict the response for values of the explanatory variable outside the range of the data gathered
• Unreliable predictions!
Multiple Choice Problems
Let's do p. 160!
3.2 Least-Squares Regression (Residuals)
Where else have we seen "residuals"?
• s_x: data point − mean (observed − predicted)
• z-scores: observed − expected
• Note: these differences are just the numerators of those calculations
Below is the LSRL for sprint time (seconds) and long jump distance (inches). Find and interpret the residual for John, who had a time of 8.09 seconds and a jump of 151 inches.
predicted long jump distance = 304.56 − 27.63(sprint time) = 304.56 − 27.63(8.09) = 81.03 inches
residual = observed − predicted = 151 − 81.03 = 69.97 inches
John jumped much farther than what was predicted by our least squares regression line. He jumped almost 70 inches farther, based on his sprint time.
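As a quick check of the arithmetic above in Python, using the equation and John's values from the slide:

```python
# Verify John's residual: residual = observed - predicted.
sprint_time = 8.09    # seconds
observed = 151        # inches

predicted = 304.56 - 27.63 * sprint_time
residual = observed - predicted

print(round(predicted, 2))  # 81.03
print(round(residual, 2))   # 69.97
```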
So why the least-squares regression line?
http://bcs.whfreeman.com/tps4e/#628644__666392__
Graph (0,0), (0,2), (2,2), and (2,4) and find the least squares regression line. Then find the residuals.
• Windows: find the sum of the squares of the residuals
• Door: find the sum of the absolute values of the residuals
Now, what if I said the regression line was ŷ = 0.2 + 1.6x? Or ŷ = x?
• Windows: find the sum of the squares of the residuals
• Door: find the sum of the absolute values of the residuals
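A sketch of the Windows/Door tallies in Python (for these four points the LSRL works out to ŷ = 1 + x). Notably, all three lines tie on the sum of absolute residuals, but the LSRL alone minimizes the sum of squared residuals:

```python
# Compare candidate lines y-hat = a + b*x on the four points from the activity.
points = [(0, 0), (0, 2), (2, 2), (2, 4)]

def sum_sq_resid(a, b):
    """Windows: sum of squared residuals for the line y-hat = a + b*x."""
    return sum((y - (a + b * x)) ** 2 for x, y in points)

def sum_abs_resid(a, b):
    """Door: sum of absolute residuals for the line y-hat = a + b*x."""
    return sum(abs(y - (a + b * x)) for x, y in points)

for a, b in [(1, 1), (0.2, 1.6), (0, 1)]:   # (1, 1) is the LSRL
    print(f"y-hat = {a} + {b}x: "
          f"SSE = {sum_sq_resid(a, b):.2f}, "
          f"sum |resid| = {sum_abs_resid(a, b):.2f}")
```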
Stop notes for today
Homework is p. 193 #43, 45, 47, 53
Activity - "Matching Descriptions to Scatterplots"
Homework hint: you will need to be familiar with the formulas on your sheet to write the LSRL
Residual Plots
• a scatterplot of the residuals against the explanatory variable
• used to help assess how well a linear model fits your data
Residual Plots
• With Normal probability plots, we want the graph to be linear to support the Normality of our data.
• With residual plots, we want the residuals to be very scattered, so our data can be modeled with a linear regression.
Remember: correlation does NOT assess linearity, just strength and direction!
What's a Good Residual Plot?
• No obvious pattern - the LSRL would be in the middle of the data, some data above and some below
• Relatively small residuals - the data points are close to the LSRL
Do the following residual plots support or refute a linear model?
http://content.ebscohost.com/pdf23_24/pdf/2009/D8Y/01Sep09/43669525.pdf?T=P&P=AN&K=43669525&S=R&D=aph&EbscoContent=dGJyMNHX8kSeqK84yOvqOLCmr0qep7RSs6%2B4S7aWxWXS&ContentCustomer=dGJyMPGssk2xqLJNuePfgeyx44Hy
How to Graph?
• Take each data point and determine the residual
• Plot the residuals versus the explanatory variable, i.e. (explanatory data, residual)
• Use the same numbers on the explanatory axis as your scatterplot
(Sketch: residual on the vertical axis, ticks from −2 to 2 in steps of 0.5; explanatory variable on the horizontal axis)
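The two steps above can be sketched in Python, reusing the sprint-time/long-jump LSRL from earlier with some hypothetical data points:

```python
# Build the (explanatory value, residual) pairs for a residual plot.
# The LSRL is the sprint-time/long-jump line from the earlier example;
# the data points themselves are hypothetical.
times = [7.20, 7.80, 8.09, 8.50]   # sprint times (seconds)
jumps = [110, 95, 151, 70]         # observed long jumps (inches)

def predicted(t):
    return 304.56 - 27.63 * t      # LSRL from the earlier example

# Step 1: residual = observed - predicted for each data point
residuals = [y - predicted(x) for x, y in zip(times, jumps)]

# Step 2: these pairs are what you plot - residual on the vertical
# axis, explanatory variable on the horizontal axis.
plot_points = list(zip(times, residuals))
for t, e in plot_points:
    print(t, round(e, 2))
```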
Calculator Construction
If you have a lot of data, follow the instructions on page 178 to construct your residual plot (you will also need to have done the Technology Corner on p. 170)
What is Standard Deviation?
• roughly the typical distance a data point is from the mean (the square root of the average squared distance)
• Is there an s_x? Is there an s_y?
• So why not an s for the residuals (a standard deviation of the residuals)?
Standard Deviation of Residuals
• gives the approximate size of an "average" or "typical" prediction error from our LSRL
• formula on page 177: s = √(Σ residuals² / (n − 2))
• Why divide by n − 2? (Fitting the LSRL estimates two quantities, the slope and the intercept.)
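A sketch of that formula in Python (the residuals below are hypothetical):

```python
# s = sqrt(sum of squared residuals / (n - 2)): the "typical"
# prediction error from the LSRL. The residuals here are hypothetical.
from math import sqrt

residuals = [0.06, -0.13, 0.18, -0.21, 0.10]   # from an LSRL fit, n = 5
n = len(residuals)

s = sqrt(sum(e ** 2 for e in residuals) / (n - 2))
print(round(s, 3))
```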