Final Project: meet with me over the next couple of weeks to discuss possibilities.
READ FOR NEXT WEEK: Zhang, W., & Luck, S. J. (2008). Discrete fixed-resolution representations in visual working memory. Nature, 453, 233–235. Read along with the Lewandowsky & Farrell chapters.
Using Models to Test Hypotheses
Prototype Model vs. Exemplar Model: compare nonnested models
Prototype Model vs. Mixture Model: compare nested models
Exemplar Model vs. Mixture Model: compare nested models
Saturated Model
Null Model
Some issues regarding fit measures
• speed of computation
• consistency
– does it allow you to recover the true parameters?
– all the measures we’ve discussed are consistent
• efficiency
– minimum variance of parameter estimates?
– SSE and %Var are inefficient
– lnL, χ², and weighted SSE are efficient
• permit statistical tests
Exemplar Model vs. Mixture Model: compare nested models
Saturated Model: this has to be as good as any model can do … one parameter for each free data point
Null Model: this is not as bad as any model could do, but it’s a good floor
clearly, the Exemplar Model needs to fit better than the Null Model
logically, the Mixture Model MUST fit better than the Exemplar Model … does the Exemplar Model fit significantly worse?
Does a model have to account for all the variability in the observed data?
Does a model have to account for all the variability in the observed data? Does it need to fit as well as the saturated model? That is perhaps the ultimate goal (with no free parameters, to boot). But accounting for some of the variability is what it means to have a theory: you explain some of the variability, and better models explain MORE of the variability. And some of the variability could simply be noise (of various sorts).

see Dell, G. S., Schwartz, M. F., Martin, N., Saffran, E. M., & Gagnon, D. A. (2000). The role of computational models in neuropsychological investigations of language: Reply to Ruml and Caramazza (2000). Psychological Review, 107, 635–645.
fit the exemplar model
fit the mixed model
does the exemplar model fit significantly worse than the mixed model?
this tests the hypothesis of whether people need to abstract a prototype on top of remembering specific exemplars
Likelihood (L) and log likelihood (ln L): used instead of SSE or r²

We will MAXIMIZE likelihood: “Maximum Likelihood Parameter Estimation”

ln L_F : ln L of the full model (e.g., the mixed model)
ln L_R : ln L of the restricted model (e.g., the exemplar model)
ln L_F − ln L_R : the difference between the fit of the full versus the restricted model
2 × [ln L_F − ln L_R] : need to multiply by 2 (because God said so)
G² = 2 × [ln L_F − ln L_R] : the log likelihood ratio statistic, distributed as χ² with df = Nparms_F − Nparms_R

If that statistic exceeds the critical χ² with the specified df (at the selected alpha level), then the restricted model fits significantly worse than the general model.
EXAMPLE
ln L_F = −263.12
ln L_R = −293.82
G² = 2 × [−263.12 − (−293.82)] = 61.40
df = 240 − 45 = 195
critical χ²(df = 195, α = .05) = 228.6
61.40 < 228.6, so the restricted model DOES NOT fit significantly worse …
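The decision rule above is simple arithmetic, so it can be checked directly. A minimal sketch in Python (the course uses Matlab; the helper name `g_squared` is mine, and the critical value 228.6 is taken from the slide rather than computed):

```python
def g_squared(lnL_full, lnL_restricted):
    """Log likelihood ratio statistic: G^2 = 2 * (ln L_F - ln L_R)."""
    return 2.0 * (lnL_full - lnL_restricted)

# numbers from the example slides
lnL_F = -263.12            # full (mixed) model
lnL_R = -293.82            # restricted (exemplar) model
G2 = g_squared(lnL_F, lnL_R)
df = 240 - 45              # Nparms_F - Nparms_R = 195
critical = 228.6           # chi^2 critical value for df = 195, alpha = .05 (from the slide)

fits_significantly_worse = G2 > critical
print(round(G2, 2), df, fits_significantly_worse)  # 61.4 195 False
```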
Likelihood Ratio Testing

G² = 2 × ln(L_F / L_R) = 2 × [ln L_F − ln L_R]
What is Likelihood? I’ll start with just giving you the equations … more later (so you can do the homework)
ln L We will stay concrete for now. There are n stimuli and m responses. For identification, each stimulus has a unique response. For categorization, groups of stimuli can have the same (category) response.
NOTE: I will be giving you a maximum likelihood equation that can be used with this kind of choice data. Maximum likelihood methods are extremely general (and can get rather complicated). Maximum Likelihood techniques are not only used for evaluating computational models, but are also used widely in statistics.
In order to use the following form of maximum likelihood statistic, we need to have data in the form of response frequencies, not response probabilities (that limitation is only true for this example).
f_ij : observed frequency with which stimulus i is given response j
N_i : number of presentations of stimulus i
P(R_j | S_i) : predicted probability with which stimulus i is given response j
ln L = Σ_i ln N_i! − Σ_i Σ_j ln f_ij! + Σ_i Σ_j f_ij ln P(R_j | S_i)

(the log turns products into sums: ln [Π_{i=1..N} N_i!] = Σ_{i=1..N} ln N_i!)
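As a concreteness check, the formula can be computed directly. This is a hypothetical sketch (the function name `ln_l` is mine, not from the slides), using `math.lgamma(n + 1)` to get ln n!:

```python
import math

def ln_l(freqs, probs):
    """ln L = sum_i ln N_i! - sum_ij ln f_ij! + sum_ij f_ij ln P(R_j|S_i).

    freqs[i][j]: observed frequency of response j to stimulus i
    probs[i][j]: model-predicted P(R_j | S_i)
    """
    total = 0.0
    for f_i, p_i in zip(freqs, probs):
        N_i = sum(f_i)
        total += math.lgamma(N_i + 1)           # ln N_i!
        for f_ij, p_ij in zip(f_i, p_i):
            total -= math.lgamma(f_ij + 1)      # ln f_ij!
            if f_ij > 0:
                total += f_ij * math.log(p_ij)  # f_ij ln P(R_j|S_i)
    return total

# one stimulus presented twice, each response observed once, model predicts .5/.5:
# ln L = ln 2! - (ln 1! + ln 1!) + 2 ln .5 = ln 2 - 2 ln 2 = -ln 2
print(round(ln_l([[1, 1]], [[0.5, 0.5]]), 4))  # -0.6931
```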
This log likelihood is the log of the multinomial likelihood:

L = Π_i [ N_i! / (f_i1! f_i2! ··· f_im!) ] × P(R_1 | S_i)^f_i1 × P(R_2 | S_i)^f_i2 ··· P(R_m | S_i)^f_im
![Page 39: Final Project...Final Project meet with me over the next couple of weeks to discuss possibilities READ FOR NEXT WEEK: Zhang, W., & Luck, S.J. (2008). Discrete fixed-resolution representations](https://reader034.vdocument.in/reader034/viewer/2022050610/5fb1a18d01e7a31f0f650298/html5/thumbnails/39.jpg)
∑ ∑∑ ∑∑+−=i i j i j
ijijiji SRPffNL )|(ln!ln!lnln
imii fim
fi
fi
i imii
i SRPSRPSRPfff
NL )|()|()|( 21
2121
⋅⋅⋅⎟⎟⎠
⎞⎜⎜⎝
⎛
⋅⋅⋅=∏
Why are we taking logs?
log(a × b) = log(a) + log(b)
log(a / b) = log(a) − log(b)
log(a^p) = p × log(a)
e^(log f(x)) = f(x)
log(f(x)) is a “monotonic function” of f(x), so the value that maximizes f(x) is the same value that maximizes log(f(x))
You want to MAXIMIZE the ln L … which is the same as MINIMIZING the −ln L …
First, let’s talk about probability
Prob(data | parm): probability of some data given the parameters of some model
knowing the parameters → predict some outcome
imagine an unfair coin that gives heads with probability p=.6 and tails with probability q=1-p=.4
obviously, the probability of getting a head is .6
what is the probability of getting two heads on two flips?
coin flips are independent
p(event A AND event B) = p(event A) × p(event B), if A and B are INDEPENDENT
p(head) × p(head) = .6 × .6 = .36
what is the probability of getting one head and one tail on two flips?
p(head) × p(tail) = .6 × .4 = .24
p(tail) × p(head) = .4 × .6 = .24
.24 + .24 = .48
what is the probability of getting x heads on N flips?
Prob(x | p) = (N choose x) p^x (1 − p)^(N − x) = [N! / (x! (N − x)!)] p^x (1 − p)^(N − x)

Binomial Distribution: f(x; p) gives the probability of observing x “successes” in N trials of a Bernoulli process with success probability p
Matlab Example
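The slide’s Matlab demo is not reproduced in this transcript; an equivalent sketch in Python (using `math.comb` for the binomial coefficient) recovers the hand-computed values above:

```python
import math

def binom_pmf(x, N, p):
    """Prob(x | p) = C(N, x) * p^x * (1 - p)^(N - x)."""
    return math.comb(N, x) * p**x * (1 - p)**(N - x)

# unfair coin with p(heads) = .6
print(round(binom_pmf(2, 2, 0.6), 2))   # 0.36  two heads on two flips
print(round(binom_pmf(1, 2, 0.6), 2))   # 0.48  one head and one tail
print(round(binom_pmf(4, 10, 0.6), 4))  # 0.1115  four heads on ten flips
```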
coin flips have only two outcomes (heads or tails) that is, a flip of the coin can have only two mutually exclusive events … what about a situation with more than just two possible outcomes? what if an event has three possible outcomes? or four possible outcomes?
Multinomial Distribution
probability of outcome 1 is p1, of outcome 2 is p2, of outcome 3 is p3
we want to know the probability of observing x1 events with outcome 1, x2 events with outcome 2, and x3 events with outcome 3

Prob(x1, x2, x3 | p1, p2, p3) = [N! / (x1! x2! x3!)] p1^x1 p2^x2 p3^x3
More generally, for m possible outcomes:

Prob(x1, x2, …, xm | p1, p2, …, pm) = [N! / (x1! x2! ··· xm!)] p1^x1 p2^x2 ··· pm^xm
Matlab Example
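Again, the Matlab demo itself is not in the transcript; a Python sketch of the multinomial probability (the helper name is mine):

```python
import math

def multinom_pmf(xs, ps):
    """Prob(x1..xm | p1..pm) = N!/(x1! x2! ... xm!) * p1^x1 * ... * pm^xm."""
    N = sum(xs)
    coeff = math.factorial(N)
    for x in xs:
        coeff //= math.factorial(x)   # multinomial coefficient
    prob = float(coeff)
    for x, p in zip(xs, ps):
        prob *= p**x
    return prob

# three outcomes with p = .5, .3, .2; observe counts 2, 1, 1 over N = 4 events
print(round(multinom_pmf([2, 1, 1], [0.5, 0.3, 0.2]), 4))  # 0.18
```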
What is Likelihood?
prob(data | parm): probability of some data given the known parameters of some model
L(data | parm): likelihood of known data given particular candidate parameters of the model
knowing the parameters → predict some outcome
observing some data → estimate the parameters that maximize the likelihood of the data
imagine we flip a coin 10 times and get 4 heads (N = 10, x = 4): what is the maximum likelihood estimate of p?

L(x | p) = Prob(x | p) = (N choose x) p^x (1 − p)^(N − x) = [N! / (x! (N − x)!)] p^x (1 − p)^(N − x)

now x (and N) are fixed; we want to find the value of p that maximizes the likelihood L of the data
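Before the calculus on the next slide, the maximization can be done by brute force. A sketch (grid search is my illustration, not the slides’ method) showing that the maximum lands on x/N = .4, and that taking the log does not move it:

```python
import math

def likelihood(p, x=4, N=10):
    """L(x | p) for x heads in N flips of a coin with p(heads) = p."""
    return math.comb(N, x) * p**x * (1 - p)**(N - x)

# evaluate the likelihood over a fine grid of candidate p values
grid = [i / 1000 for i in range(1, 1000)]
p_hat = max(grid, key=likelihood)
p_hat_log = max(grid, key=lambda p: math.log(likelihood(p)))

print(p_hat, p_hat_log)  # 0.4 0.4 -- the MLE is x/N, with or without the log
```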
imagine we flip a coin 10 times and get 4 heads (N = 10, x = 4): what is the maximum likelihood estimate of p? USING CALCULUS

L(x | p) = Prob(x | p) = (N choose x) p^x (1 − p)^(N − x)

ln L = ln(N choose x) + x ln p + (N − x) ln(1 − p)

d ln L / dp = 0 + x (1/p) + (N − x) (1/(1 − p)) (−1) = 0

x(1 − p) − (N − x)p = 0
x − xp − Np + xp = 0
p = x/N

so here the maximum likelihood estimate is p = 4/10 = .4