Moving further
Moving further: Categorical count data

- Word counts
- Speech error counts
- Metaphor counts
- Active construction counts
Hissing Koreans
Winter & Grawunder (2012)
No. of Cases
Bentz & Winter (2013)
Poisson Model
Siméon Poisson

1898: Ladislaus Bortkiewicz studies deaths by horse kick across Prussian army corps:

- Army corps with few horses: few deaths, low variability
- Army corps with lots of horses: many deaths, high variability
The Poisson Distribution
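The horse-kick pattern reflects the Poisson distribution's defining property: its variance equals its mean, so a low rate automatically comes with low variability and a high rate with high variability. A minimal sketch in base R; the rate 0.61 is purely illustrative, not taken from the slides:

```r
# For a Poisson distribution, mean and variance both equal lambda.
# Compute both directly from the probability mass function (dpois),
# summing over a support wide enough that the remaining tail mass
# is negligible.
lambda <- 0.61            # illustrative rate (assumption)
k <- 0:100                # support for the sums
p <- dpois(k, lambda)
m <- sum(k * p)           # mean
v <- sum((k - m)^2 * p)   # variance
c(mean = m, variance = v) # both come out (numerically) equal to lambda
```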
Poisson Regression = generalized linear model with Poisson error structure and log link function
The Poisson Model: log(predicted mean of Y) = b0 + b1*X1 + b2*X2 (equivalently, predicted mean = exp(b0 + b1*X1 + b2*X2))
In R:
glmer(my_counts ~ my_predictors + (1|subject), data = mydataset, family = "poisson")
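Setting the random effect aside, the same kind of model can be sketched with base R's glm() on simulated counts; the variable names and true coefficients below are made up for illustration:

```r
# Poisson regression on simulated count data (base R, no random effects).
set.seed(1)
n <- 200
x <- runif(n)                          # hypothetical continuous predictor
y <- rpois(n, exp(0.5 + 1.2 * x))      # true model lives on the log scale
mod <- glm(y ~ x, family = "poisson")
coef(mod)  # estimates should land near the true values 0.5 and 1.2
```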
Poisson model output: the coefficients are log values; exponentiate them to get the predicted mean rate.
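Concretely, with hypothetical log-scale estimates b0 and b1 (made up for illustration), exponentiating recovers the predicted rate and the multiplicative effect of a predictor:

```r
# Poisson coefficients are on the log scale; exponentiating turns the
# intercept into a predicted mean rate and a slope into a rate ratio.
b0 <- 0.5                # hypothetical intercept (log scale)
b1 <- 1.2                # hypothetical slope (log scale)
exp(b0)                  # predicted mean rate when the predictor is 0 (~1.65)
exp(b1)                  # each 1-unit increase multiplies the rate by ~3.32
exp(b0 + b1 * 2)         # predicted mean rate at predictor value 2 (~18.2)
```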
Poisson Model
Moving further: Binary categorical data

- Focus vs. no-focus
- Yes vs. no
- Dative vs. genitive
- Correct vs. incorrect
Bentz & Winter (2013)
Case yes vs. no ~ Percent L2 speakers
Logistic Regression = generalized linear model with binomial error structure and logit link function
The Logistic Model: p(Y) = logit⁻¹(b0 + b1*X1 + b2*X2)
In R:
glmer(binary_variable ~ my_predictors + (1|subject), data = mydataset, family = "binomial")
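As with the Poisson case, the fixed-effects part can be sketched with base R's glm(); the data below are simulated and the names made up. plogis() is base R's built-in inverse logit:

```r
# Logistic regression on simulated binary data (base R, no random effects).
set.seed(2)
n <- 500
x <- runif(n)                          # hypothetical continuous predictor
p <- plogis(-1 + 3 * x)                # true probabilities via the inverse logit
y <- rbinom(n, size = 1, prob = p)     # binary outcomes
mod <- glm(y ~ x, family = "binomial")
coef(mod)  # estimates should land near the true values -1 and 3
```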
Probabilities and Odds

- Probability of an event: p
- Odds of an event: p / (1 - p)
Intuition about Odds

N = 12 marbles: what are the odds that I pick a blue marble?

Answer: 2/10 (2 blue marbles vs. 10 marbles of another color)
Log odds: logit(p) = log(p / (1 - p)) (= the logit function)
Representative values

| Probability | Odds  | Log odds (= "logits") |
|-------------|-------|-----------------------|
| 0.1         | 0.111 | -2.197                |
| 0.2         | 0.25  | -1.386                |
| 0.3         | 0.428 | -0.847                |
| 0.4         | 0.667 | -0.405                |
| 0.5         | 1     | 0                     |
| 0.6         | 1.5   | 0.405                 |
| 0.7         | 2.33  | 0.847                 |
| 0.8         | 4     | 1.386                 |
| 0.9         | 9     | 2.197                 |
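The table's three columns are linked by two deterministic transformations, odds = p / (1 - p) and log odds = log(odds), so the whole table can be regenerated in a few lines of R:

```r
# Regenerate the probability / odds / log-odds table.
p      <- seq(0.1, 0.9, by = 0.1)
odds   <- p / (1 - p)       # odds from probabilities
logits <- log(odds)         # log odds ("logits") from odds
round(cbind(p, odds, logits), 3)
# e.g. p = 0.2 gives odds 0.25 and log odds -1.386, matching the table,
# and p = 0.5 gives odds 1 and log odds 0.
```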
Snijders & Bosker (1999: 212)
Bentz & Winter (2013)
            Estimate Std. Error z value Pr(>|z|)
(Intercept)   1.4576     0.6831   2.134  0.03286
Percent.L2   -6.5728     2.0335  -3.232  0.00123
Case yes vs. no ~ Percent L2 speakers
Log odds when Percent.L2 = 0
Bentz & Winter (2013)
            Estimate Std. Error z value Pr(>|z|)
(Intercept)   1.4576     0.6831   2.134  0.03286
Percent.L2   -6.5728     2.0335  -3.232  0.00123
Case yes vs. no ~ Percent L2 speakers
The slope: how much the log odds decrease for each one-unit increase in Percent.L2
Bentz & Winter (2013)
            Estimate Std. Error z value Pr(>|z|)
(Intercept)   1.4576     0.6831   2.134  0.03286
Percent.L2   -6.5728     2.0335  -3.232  0.00123

Case yes vs. no ~ Percent L2 speakers

Logits or "log odds" → exponentiate → Odds: exp(-6.5728) = 0.001397878
Logits or "log odds" → transform by inverse logit → Probabilities
Odds > 1: numerator more likely (= event happens more often than not)
Odds < 1: denominator more likely (= event is more likely not to happen)
            Estimate Std. Error z value Pr(>|z|)
(Intercept)   1.4576     0.6831   2.134  0.03286
Percent.L2   -6.5728     2.0335  -3.232  0.00123

Case yes vs. no ~ Percent L2 speakers

Logits or "log odds": logit.inv(1.4576) = 0.81
Bentz & Winter (2013)
About 80% (makes sense)
            Estimate Std. Error z value Pr(>|z|)
(Intercept)   1.4576     0.6831   2.134  0.03286
Percent.L2   -6.5728     2.0335  -3.232  0.00123

Case yes vs. no ~ Percent L2 speakers

logit.inv(1.4576) = 0.81
logit.inv(1.4576 + (-6.5728) * 0.3) = 0.37
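Both predicted probabilities can be reproduced directly from the coefficient table, using the inverse logit function (defined later in the deck as logit.inv):

```r
# Reproduce the two predictions from the model output above.
logit.inv <- function(x) { exp(x) / (1 + exp(x)) }
b0 <- 1.4576               # intercept: log odds of case marking at Percent.L2 = 0
b1 <- -6.5728              # slope for Percent.L2
logit.inv(b0)              # ~0.81: predicted probability with no L2 speakers
logit.inv(b0 + b1 * 0.3)   # ~0.37: predicted probability at Percent.L2 = 0.3
```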
Bentz & Winter (2013)
logit(p) = log(p / (1 - p)) (= logit function)
logit⁻¹(x) = exp(x) / (1 + exp(x)) (= inverse logit function)
The inverse logit function, logit⁻¹, is the famous "logistic function": logit⁻¹(x) = exp(x) / (1 + exp(x))
Inverse logit function (transforms back to probabilities):

logit.inv = function(x) { exp(x) / (1 + exp(x)) }

(this defines the function in R)
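A quick sanity check that the two functions really are inverses of each other:

```r
# logit and inverse logit undo each other.
logit.inv <- function(x) { exp(x) / (1 + exp(x)) }
logit     <- function(p) { log(p / (1 - p)) }
p <- c(0.1, 0.37, 0.5, 0.81, 0.9)
all.equal(logit.inv(logit(p)), p)   # TRUE: round trip recovers the probabilities
logit.inv(0)                        # 0.5: log odds of 0 means a 50/50 split
```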
General Linear Model
→ Generalized Linear Model
→ Generalized Linear Mixed Model
Generalized Linear Model

= "Generalizing" the General Linear Model to cases where the response variable isn't continuous (in particular, categorical responses)

= Consists of two things: (1) an error distribution, (2) a link function

Error distribution: binomial distribution for logistic regression; Poisson distribution for Poisson regression
Link function: logit link function for logistic regression; log link function for Poisson regression

lm(response ~ predictor)
glm(response ~ predictor, family = "binomial")
glm(response ~ predictor, family = "poisson")
Categorical Data

- Dichotomous/binary data → Logistic Regression
- Count data → Poisson Regression
General structure

Linear Model: continuous ~ any type of variable
Logistic Regression: dichotomous ~ any type of variable
Poisson Regression: count ~ any type of variable
For the generalized linear mixed model…

… you switch to glmer() and only have to specify the family.

lmer(…)
glmer(…, family = "poisson")
glmer(…, family = "binomial")
That’s it(for now)