TRANSCRIPT
Introduction to the Rational Speech Acts Model
Christopher Potts, Stanford Linguistics
Linguistics Association of Great Britain, Annual Meeting, September 9, 2019
Overview
A companion to Goodman and Frank 2016, ‘Pragmatic language interpretation as probabilistic inference’
1. A bit of history
2. Conceptual motivation for the Rational Speech Acts model (RSA)
3. Example calculations using the model:
   a. Basic scalar implicature
   b. The role of message costs
   c. The role of the alpha parameter
   d. The role of the referent prior
4. A simple Python implementation
5. Extensions: joint inference over utterance and context
Origin story
1. Rosenberg and Cohen 1964 (Science): early Bayesian model of production and comprehension
2. Lewis 1969: signaling systems in his thesis/book Convention
3. Rabin 1990: recursive signaling in ‘Communication between rational agents’
4. Camerer and Ho 2004: cognitive hierarchy models for games of conflict and coordination
5. Michael Franke and Gerhard Jäger: iterated best response (IBR)
6. Frank and Goodman 2012 (Science): very sophisticated pragmatic agents and a new Bayesian foundation
Major achievements
Incremental implicatures Cohn-Gordon, Goodman, Potts, ‘An incremental, iterated response model of pragmatics’
Manner implicatures Bergen, Levy, Goodman, ‘Pragmatic reasoning through semantic inference’
I-implicatures and implicature blocking Potts & Levy, ‘Negotiating lexical uncertainty and speaker expertise with disjunction’
Embedded implicatures Potts, Lassiter, Levy, Frank, ‘Embedded implicatures as pragmatic inferences under compositional lexical uncertainty’
Hyperbole Kao, Wu, Bergen, Goodman, ‘Nonliteral understanding of number words’
Metaphor Kao, Bergen, Goodman, ‘Formalizing the pragmatics of metaphor understanding’
Politeness Yoon, Tessler, Goodman, Frank, ‘Polite speech emerges from competing social goals’
Irony Cohn-Gordon and Bergen, ‘Verbal irony, pretense, and the common ground’
Social meaning Work by E. Allyn Smith, Heather Burnett, Eric Acton, and others
Large-scale machine learning problems Work by Will Monroe, Jacob Andreas, Reuben Cohn-Gordon, and others
Pursuing a Gricean ideal
The cooperative principle: Make your contribution as is required, when it is required, by the conversation in which you are engaged.
1. Quality: Contribute only what you know to be true. Do not say false things. Do not say things for which you lack evidence.
2. Quantity: Make your contribution as informative as is required. Do not say more than is required.
3. Relation (Relevance): Make your contribution relevant.
4. Manner: (i) Avoid obscurity; (ii) avoid ambiguity; (iii) be brief; (iv) be orderly.
5. ...
Conversational implicature
Proposition q is a conversational implicature of an utterance U if and only if
1. The speaker S believes it is mutual, public knowledge of all discourse participants that S is obeying the cooperative principle.
2. S believes that, to maintain (1) given U, the discourse participants will assume S believes q.
3. S believes it is mutual, public knowledge of all discourse participants that (2) holds.

Alex: What city does Paul live in?
Ryan: Hmm, he lives in California.

A. Assume Ryan is cooperative.
B. Ryan supplied less information than was required, seemingly contradicting (A).
C. Assume Ryan doesn’t know what city Paul lives in = q.
D. Then Ryan’s answer is optimal given this evidence.
Conversational implicature
Speaker: My friend has glasses. My listener knows I’m cooperative in the Gricean sense. So they will be able to work out that I mean “not hat” as well.
Listener: The speaker’s utterance seems ambiguous or under-informative. But I’m assuming the speaker is cooperative in the Gricean sense! Ah, but if I assume they mean to convey “not hat” too, then all’s well!
The Rational Speech Acts Model (RSA)
A simple reference game
The semantics ⟦⟧, the prior P over referents, and the message costs C, here with 𝜶 = 1:

𝜶 = 1        r1    r2     C
‘hat’         0     1     0
‘glasses’     1     1     0
P           0.5   0.5

‘hat’ is true only of r2; ‘glasses’ is true of both referents; the prior is flat; both messages are cost-free.
A scalar implicature

⟦⟧           r1    r2
‘hat’         0     1
‘glasses’     1     1

Literal listener PLit: normalize each row of the semantics, giving referents r given messages m. Each cell divides the truth value of message m for referent r by the sum of all truth values of message m over all referents r′:

PLit(r1 | ‘hat’) = ⟦‘hat’⟧(r1) / (⟦‘hat’⟧(r1) + ⟦‘hat’⟧(r2)) = 0 / (0 + 1) = 0
PLit(r2 | ‘glasses’) = ⟦‘glasses’⟧(r2) / (⟦‘glasses’⟧(r1) + ⟦‘glasses’⟧(r2)) = 1 / (1 + 1) = 0.5

PLit         r1    r2
‘hat’         0     1
‘glasses’   0.5   0.5

Pragmatic speaker PS: normalize over messages m given referents r, reasoning about the literal listener, rather than truth conditions:

PS(‘hat’ | r2) = PLit(r2 | ‘hat’) / (PLit(r2 | ‘hat’) + PLit(r2 | ‘glasses’)) = 1 / (1 + 0.5) = 0.67

PS         ‘hat’  ‘glasses’
r1             0          1
r2          0.67       0.33

Pragmatic listener PL: normalize over referents again, now reasoning about the speaker:

PL           r1    r2
‘hat’         0     1
‘glasses’  0.75  0.25

On hearing ‘glasses’, the pragmatic listener favors r1 (0.75): for r2 the speaker had the more informative option ‘hat’. This is the scalar implicature.
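The literal-listener step just worked through can be sketched directly in Python (a minimal sketch; the dictionary layout and function name are my own, not from the talk):

```python
# Truth values: message -> {referent: 0 or 1}, matching the table above.
lexicon = {
    'hat':     {'r1': 0, 'r2': 1},
    'glasses': {'r1': 1, 'r2': 1},
}

def literal_listener(msg):
    """P_Lit(r | msg): normalize the truth values of msg over referents."""
    row = lexicon[msg]
    total = sum(row.values())
    return {r: v / total for r, v in row.items()}

print(literal_listener('hat'))      # {'r1': 0.0, 'r2': 1.0}
print(literal_listener('glasses'))  # {'r1': 0.5, 'r2': 0.5}
```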
The calculation in more detail

⟦⟧           r1    r2
‘hat’         0     1
‘glasses’     1     1

Normalize the rows:

PLit         r1    r2
‘hat’         0     1
‘glasses’   0.5   0.5

Transpose:

           ‘hat’  ‘glasses’
r1             0        0.5
r2             1        0.5

Normalize the rows:

PS         ‘hat’  ‘glasses’
r1             0          1
r2          0.67       0.33

Transpose:

             r1    r2
‘hat’         0  0.67
‘glasses’     1  0.33

Normalize the rows:

PL           r1    r2
‘hat’         0     1
‘glasses’  0.75  0.25
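The normalize/transpose pipeline is compact in matrix form. A minimal numpy sketch (the variable names are mine):

```python
import numpy as np

# Rows are messages ('hat', 'glasses'); columns are referents (r1, r2).
lex = np.array([[0., 1.],
                [1., 1.]])

def rownorm(m):
    """Divide each row by its sum."""
    return m / m.sum(axis=1, keepdims=True)

P_lit = rownorm(lex)      # literal listener: referents given messages
P_S = rownorm(P_lit.T)    # speaker: transpose, then normalize the rows
P_L = rownorm(P_S.T)      # pragmatic listener: transpose, normalize again

# P_L's 'glasses' row comes out [0.75, 0.25]: the scalar implicature.
```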
The role of message costs

𝜶 = 1        r1    r2     C
‘hat’         0     1    –6
‘glasses’     1     1     0

The speaker distribution is often written e^𝜶(log PLit(r | m) + C(m)), normalized over messages. The log takes us from probabilities to scores, so we can bring in real-valued costs; the exp returns us normalizable values; 𝜶 = 1, so we can ignore it for now. Note: exp(log(x)) = x, which is why we could leave this out when we didn’t have costs and 𝜶 = 1.

For example:

PS(‘hat’ | r2) = exp(log(PLit(r2 | ‘hat’)) + C(‘hat’)) / [exp(log(PLit(r2 | ‘hat’)) + C(‘hat’)) + exp(log(PLit(r2 | ‘glasses’)) + C(‘glasses’))]
               = exp(log(1) – 6) / (exp(log(1) – 6) + exp(log(0.5) – 0)) ≈ 0.0049

PLit         r1    r2
‘hat’         0     1
‘glasses’   0.5   0.5

PS         ‘hat’  ‘glasses’
r1             0          1
r2        0.0049     0.9951

PL           r1      r2
‘hat’         0       1
‘glasses’ 0.5012  0.4988

With ‘hat’ made expensive, the speaker almost never chooses it, and the listener’s scalar implicature from ‘glasses’ all but disappears.
The calculation with costs in more detail

⟦⟧           r1    r2
‘hat’         0     1
‘glasses’     1     1

Normalize:

PLit         r1    r2
‘hat’         0     1
‘glasses’   0.5   0.5

Transpose:

           ‘hat’  ‘glasses’
r1             0        0.5
r2             1        0.5

Add costs:

           ‘hat’               ‘glasses’
r1         exp(log(0) – 6)     exp(log(0.5) – 0)
r2         exp(log(1) – 6)     exp(log(0.5) – 0)

           ‘hat’   ‘glasses’
r1             0         0.5
r2        0.0025         0.5

Normalize:

PS         ‘hat’  ‘glasses’
r1             0          1
r2        0.0049     0.9951

Transpose:

             r1      r2
‘hat’         0  0.0049
‘glasses’     1  0.9951

Normalize:

PL           r1      r2
‘hat’         0       1
‘glasses’ 0.5012  0.4988
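The cost step above can be added to the matrix pipeline like so (a sketch under the table's assumptions, C(‘hat’) = –6; not the course implementation):

```python
import numpy as np

lex = np.array([[0., 1.],
                [1., 1.]])
costs = np.array([-6., 0.])   # C('hat') = -6, C('glasses') = 0
alpha = 1.0

def rownorm(m):
    return m / m.sum(axis=1, keepdims=True)

P_lit = rownorm(lex)

# Speaker: exp(alpha * (log P_Lit(r|m) + C(m))), normalized over messages.
with np.errstate(divide='ignore'):            # log(0) -> -inf, exp(-inf) -> 0
    scores = alpha * (np.log(P_lit.T) + costs)
P_S = rownorm(np.exp(scores))

P_L = rownorm(P_S.T)
# P_S's r2 row is about [0.0049, 0.9951];
# P_L's 'glasses' row is about [0.5012, 0.4988].
```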
The role of the alpha parameter

⟦⟧           r1    r2
‘hat’         0     1
‘glasses’     1     1

PLit         r1    r2
‘hat’         0     1
‘glasses’   0.5   0.5

𝜶 = 1.0:
PS         ‘hat’  ‘glasses’
r1             0          1
r2          0.67       0.33

𝜶 = 4.0:
PS         ‘hat’  ‘glasses’
r1             0          1
r2          0.94       0.06

exp(1 * log(0.75)) = 3 * exp(1 * log(0.25))
exp(4 * log(0.75)) = 81 * exp(4 * log(0.25))

Higher 𝜶 means stronger pragmatic inferences: raising probabilities to the power 𝜶 widens their ratios, so a 3-to-1 preference at 𝜶 = 1 becomes 81-to-1 at 𝜶 = 4.
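The sharpening effect of 𝜶 is easy to reproduce numerically (a sketch; the `speaker` helper is my own name for the cost-free speaker step):

```python
import numpy as np

lex = np.array([[0., 1.],
                [1., 1.]])

def rownorm(m):
    return m / m.sum(axis=1, keepdims=True)

def speaker(alpha):
    """Speaker with no costs: exp(alpha * log P_Lit), row-normalized."""
    P_lit = rownorm(lex)
    with np.errstate(divide='ignore'):   # log(0) -> -inf, exp(-inf) -> 0
        scores = alpha * np.log(P_lit.T)
    return rownorm(np.exp(scores))

# r2 row: [0.67, 0.33] at alpha = 1, but [0.94, 0.06] at alpha = 4.
print(np.round(speaker(1.0), 2))
print(np.round(speaker(4.0), 2))
```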
The role of the referent prior

             r1    r2
‘hat’         0     1
‘glasses’     1     1
P           0.3   0.7

The literal listener now weights each truth value by the referent prior:

PLit(r2 | ‘glasses’) = ⟦‘glasses’⟧(r2) * P(r2) / (⟦‘glasses’⟧(r1) * P(r1) + ⟦‘glasses’⟧(r2) * P(r2)) = 1*0.7 / (1*0.3 + 1*0.7) = 0.7

PLit         r1    r2
‘hat’         0     1
‘glasses’   0.3   0.7

PS         ‘hat’  ‘glasses’
r1             0          1
r2          0.59       0.41

PL           r1    r2
‘hat’         0     1
‘glasses’  0.51  0.49

Even with a prior favoring r2, the pragmatic listener still leans slightly toward r1 on hearing ‘glasses’.
The calculation with referent prior in more detail

⟦⟧           r1     r2
‘hat’         0      1
‘glasses’     1      1

Incorporate priors:

             r1     r2
‘hat’     0*0.3  1*0.7
‘glasses’ 1*0.3  1*0.7

             r1     r2
‘hat’         0    0.7
‘glasses’   0.3    0.7

Normalize:

PLit         r1     r2
‘hat’         0      1
‘glasses’   0.3    0.7

Transpose:

           ‘hat’  ‘glasses’
r1             0        0.3
r2             1        0.7

Normalize:

PS         ‘hat’  ‘glasses’
r1             0          1
r2          0.59       0.41

Transpose:

             r1     r2
‘hat’         0   0.59
‘glasses’     1   0.41

Incorporate priors:

             r1        r2
‘hat’     0*0.3  0.59*0.7
‘glasses’ 1*0.3  0.41*0.7

             r1      r2
‘hat’         0   0.413
‘glasses’   0.3   0.287

Normalize:

PL           r1     r2
‘hat’         0      1
‘glasses’  0.51   0.49
A simple Python implementation
http://web.stanford.edu/class/linguist130a/
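The course file linked above does more, but the core of the referent-prior calculation can be sketched in a few lines of plain Python. The variable names and dictionary layout here are my own, not those of the linked file:

```python
def normalize(row):
    """Divide a row by its sum (a zero row is returned unchanged)."""
    s = sum(row)
    return [x / s for x in row] if s else row

# Truth values: keys are messages, columns are referents r1, r2.
lexicon = {'hat': [0.0, 1.0], 'glasses': [1.0, 1.0]}
prior = [0.3, 0.7]                       # P(r1), P(r2)
messages = list(lexicon)

# Literal listener: P_Lit(r | m) proportional to [[m]](r) * P(r).
literal = {m: normalize([t * p for t, p in zip(lexicon[m], prior)])
           for m in messages}

# Pragmatic speaker (alpha = 1, zero costs): P_S(m | r) proportional to P_Lit(r | m).
speaker = [normalize([literal[m][r] for m in messages])
           for r in range(len(prior))]

# Pragmatic listener: P_L(r | m) proportional to P_S(m | r) * P(r).
listener = {m: normalize([speaker[r][i] * prior[r] for r in range(len(prior))])
            for i, m in enumerate(messages)}

print({m: [round(x, 2) for x in listener[m]] for m in messages})
# {'hat': [0.0, 1.0], 'glasses': [0.51, 0.49]}
```

Running it reproduces the pragmatic listener from the slides: hearing ‘glasses’, the listener assigns about 0.51 to r1 and 0.49 to r2, so the implicature-like preference for r1 survives the prior that favors r2.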
Major achievements

Incremental implicatures: Cohn-Gordon, Goodman, Potts, ‘An incremental, iterated response model of pragmatics’
Manner implicatures: Bergen, Levy, Goodman, ‘Pragmatic reasoning through semantic inference’
I-implicatures and implicature blocking: Potts & Levy, ‘Negotiating lexical uncertainty and speaker expertise with disjunction’
Embedded implicatures: Potts, Lassiter, Levy, Frank, ‘Embedded implicatures as pragmatic inferences under compositional lexical uncertainty’
Hyperbole: Kao, Wu, Bergen, Goodman, ‘Nonliteral understanding of number words’
Metaphor: Kao, Bergen, Goodman, ‘Formalizing the pragmatics of metaphor understanding’
Politeness: Yoon, Tessler, Goodman, Frank, ‘Polite speech emerges from competing social goals’
Irony: Cohn-Gordon and Bergen, ‘Verbal irony, pretense, and the common ground’
Social meaning: Work by E. Allyn Smith, Heather Burnett, Eric Acton, and others
Large-scale machine learning problems: Work by Will Monroe, Jacob Andreas, Reuben Cohn-Gordon, and others
Manner implicatures

Stop the car tends to signal a normal event.
Cause the car to stop tends to signal an unusual event.
Joint inference over utterance and lexicon
Our uncertain lexicons

1. synagogues and other churches
2. synagogues or churches
3. You may need angioplasty or surgery.
4. He’s a wine lover, or oenophile.
5. superb but not outstanding
6. outstanding but not superb
7. It’s a couch, not a sofa.
8. Does between 5 and 10 include 5 and 10?
9. Is a barbecue a machine?
10. Can a horse be an athlete?
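As a taste of how lexical uncertainty of this sort can enter the model, here is a toy joint-inference sketch in the style of lexical-uncertainty RSA: the pragmatic listener sums over candidate lexica, P_L(r | m) ∝ P(r) · Σ_L P(L) · P_S(m | r, L). The two candidate meanings for ‘glasses’ below are invented for illustration, not taken from the talk:

```python
def normalize(row):
    """Divide a row by its sum (a zero row is returned unchanged)."""
    s = sum(row)
    return [x / s for x in row] if s else row

prior = [0.5, 0.5]                       # uniform prior over referents r1, r2
messages = ['hat', 'glasses']
# Two hypothetical lexica (invented for illustration): in the first,
# ‘glasses’ is true of both referents; in the second, only of r1.
# P(L) is uniform over the two hypotheses.
lexica = [{'hat': [0.0, 1.0], 'glasses': [1.0, 1.0]},
          {'hat': [0.0, 1.0], 'glasses': [1.0, 0.0]}]

def speaker(lex):
    """P_S(m | r, L): the usual one-step RSA speaker for a fixed lexicon."""
    literal = {m: normalize([t * p for t, p in zip(lex[m], prior)])
               for m in messages}
    return [normalize([literal[m][r] for m in messages])
            for r in range(len(prior))]

# P_L(r | m) proportional to P(r) * average over lexica of P_S(m | r, L).
speakers = [speaker(lex) for lex in lexica]
listener = {m: normalize([prior[r] * sum(s[r][i] for s in speakers) / len(lexica)
                          for r in range(len(prior))])
            for i, m in enumerate(messages)}

print({m: [round(x, 2) for x in listener[m]] for m in messages})
# {'hat': [0.0, 1.0], 'glasses': [0.86, 0.14]}
```

Hearing ‘glasses’, the listener favors r1 (about 0.86): the candidate lexicon on which ‘glasses’ is specific to r1 pulls probability toward that referent, even though the other candidate makes the message true of both.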
Joint inference over utterance and something else

Incremental implicatures: Cohn-Gordon, Goodman, Potts, ‘An incremental, iterated response model of pragmatics’
Manner implicatures: Listener inference over the lexicon
I-implicatures and implicature blocking: Listener inference over the lexicon; speaker communication about the lexicon
Embedded implicatures: Listener inference over the lexicon
Hyperbole: Listener inference over a social/emotional goal
Metaphor: Listener inference over the question under discussion
Politeness: Speaker pursuing social and presentational goals
Irony: Speaker communicating about the presumed common ground
Social meaning: Work by E. Allyn Smith, Heather Burnett, Eric Acton, and others
Large-scale machine learning problems: Work by Will Monroe, Jacob Andreas, Reuben Cohn-Gordon, and others
Wrapping up

1. My primary goal was to motivate RSA and walk you through some example calculations.
2. My hope is that this leads you to think about joint inference versions of RSA that can model intricate pragmatic phenomena.
3. Additional materials:
   a. YouTube video of the core calculations: https://youtu.be/bPd6CNy5UqA
   b. Python implementation: https://web.stanford.edu/class/linguist130a/materials/rsa130a.py
   c. https://probmods.org
Thanks!