An introduction to the theory of coherent lower previsions

Enrique Miranda
Rey Juan Carlos University

Madrid, Spain

© E. Miranda, 2007

Overview, Part I

- Some considerations about probability.
- Coherent previsions and probabilities.
- Coherent lower and upper previsions.
- Sets of desirable gambles and linear previsions.
- Natural extension.

What is the goal of probability?

Probability seeks to determine the plausibility of the different outcomes of an experiment when these cannot be predicted beforehand.

- What is the probability of guessing the 6 winning numbers in the lottery?
- What is the probability of arriving in 30 minutes from the airport to the center of Prague by car?
- What is the probability of having a sunny day tomorrow?

Aleatory vs. epistemic probabilities

In some cases, the probability of an event A is a property of the event, meaning that it does not depend on the subject making the assessment. We talk then of aleatory probabilities.

However, and especially in the framework of decision making, we may need to assess probabilities that represent our beliefs. Hence, these may vary depending on the subject or on the amount of information he possesses at the time. We talk then of subjective (or epistemic) probabilities.

The behavioural interpretation

One of the possible interpretations of subjective probability is the behavioural interpretation. We interpret the probability of an event A in terms of our betting behaviour: we are disposed to bet at most P(A) on the event A.

If we consider the gamble I_A where we win 1 if A happens and 0 if it doesn't happen, then we accept the transaction I_A − P(A), because the expected gain is

(1 − P(A)) · P(A) + (0 − P(A)) · (1 − P(A)) = 0.
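A quick check of this identity (my own expansion, with p standing for P(A)):

```latex
% Expected gain of accepting the transaction I_A - P(A), writing p = P(A):
(1-p)\,p + (0-p)(1-p) \;=\; p - p^{2} - p + p^{2} \;=\; 0 .
```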

Gambles

More generally, we can consider our betting behaviour on gambles.

A gamble is a bounded real-valued variable on X, f : X → R.

It represents a reward that depends on the outcome of the experiment modelled by X.

We shall denote the set of all gambles by L(X).

Example

Who shall win the next Euro Cup of Basketball?

Consider the outcomes a = Greece, b = Spain, c = Lithuania, d = Other.

X = {a, b, c, d}.

Consider the gamble f(a) = 3, f(b) = −2, f(c) = 5, f(d) = 10.

Depending on how likely I consider each of the outcomes, I will accept the gamble or not.
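As an aside (not part of the slides), here is a minimal Python sketch of how such a gamble could be represented and evaluated; the probability values are invented purely for illustration:

```python
# A gamble on X = {a, b, c, d} as a dictionary of rewards.
f = {"a": 3, "b": -2, "c": 5, "d": 10}

# A hypothetical probability assessment over the outcomes (made up for illustration).
p = {"a": 0.35, "b": 0.40, "c": 0.15, "d": 0.10}

# Expected reward of f under this assessment.
expectation = sum(p[x] * f[x] for x in f)
print(expectation)   # 2.0: positive, so a subject with these beliefs would accept f
```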

Betting on gambles

Consider now a gamble f on X. We may consider the supremum value µ such that we are disposed to pay µ for f, i.e., such that the reward f − µ is desirable: it will be the expectation E(f).

- For any µ < E(f), we expect to have a gain.
- For any µ > E(f), we expect to have a loss.

Buying and selling prices

I may also give money in order to get the reward: if I am disposed to pay x for the gamble f, then the gamble f − x is desirable to me.

I may also sell a gamble f, meaning that if I am disposed to sell it at a price x, then the gamble x − f is desirable to me.

In the case of precise probabilities, the supremum buying price for a gamble f coincides with the infimum selling price, and we have a fair price for f.

Existence of indecision

When we don't have much information, it may be difficult (and unreasonable) to give a fair price P(f): there may be some prices µ for which we would not be disposed to buy or sell the gamble f.

In terms of desirable gambles, this means that we would be undecided between two gambles.

It is sometimes considered preferable to give two different values P(f) < P̄(f) than to give a precise (and possibly wrong) value.

Lower and upper previsions

The lower prevision for a gamble f, P(f), is my supremum acceptable buying price for f, meaning that I am disposed to buy it for P(f) − ε (or to accept the reward f − (P(f) − ε)) for any ε > 0.

The upper prevision for a gamble f, P̄(f), is my infimum acceptable selling price for f, meaning that I am disposed to sell f for P̄(f) + ε (or to accept the reward P̄(f) + ε − f) for any ε > 0.

Example (cont.)

Consider the previous gamble f(a) = 3, f(b) = −2, f(c) = 5, f(d) = 10.

- If I am certain that Spain is not going to win the Euro Cup, I should be disposed to accept this gamble, and even to pay as much as 3 for it. Hence, I would have P(f) ≥ 3.
- For the infimum selling price, if I think that the winner will be either Spain or Greece, I should sell f for anything greater than 3, because for such prices I will always win money with the transaction. Hence, I would have P̄(f) ≤ 3.

In the precise case we have P(f) = P̄(f).

Conjugacy of P and P̄

Under this interpretation,

P(−f) = sup{x : −f − x acceptable}
       = −inf{−x : −f − x acceptable}
       = −inf{y : −f + y acceptable}
       = −P̄(f).

Hence, it suffices to work with one of these two functions.

Important remark

The domain K of P:

- need not have any predefined structure.
- may contain indicators of events.

Lower probabilities of events

The lower probability of A, P(A),
= lower prevision P(I_A) of the indicator of A,
= supremum betting rate on A,
= measure of the evidence supporting A,
= measure of the strength of our belief in A.

Upper probabilities of events

- The upper probability of A, P̄(A),
  = upper prevision P̄(I_A) of the indicator of A,
  = measure of the lack of evidence against A,
  = measure of the plausibility of A.

- We have then a behavioural interpretation of upper and lower probabilities:

  evidence in favour of A ↑ ⇒ P(A) ↑
  evidence against A ↑ ⇒ P̄(A) ↓

Example (cont.)

- The lower probability we give to Spain being the winner would be the lower prevision of I_b, where we get a reward of 1 if Spain wins and 0 if it doesn't.
- The upper probability of Lithuania or Greece winning would be the upper prevision of I_{a,c}, or, equivalently, 1 minus the lower probability of Lithuania and Greece not winning.

Events or gambles?

In the case of probabilities, we are indifferent between betting on events or on gambles: our betting rates on events (a probability) determine our betting rates on gambles (its expectation).

However, in the imprecise case, the lower and upper previsions for events do not determine the lower and upper previsions for gambles uniquely.

Hence, lower and upper previsions are more informative than lower and upper probabilities.

Consistency requirements

The assessments made by a lower prevision on a set of gambles should satisfy a number of consistency requirements:

- A combination of the assessments should not produce a net loss, no matter the outcome: avoiding sure loss.
- Our supremum buying price for a gamble f should not depend on our assessments for other gambles: coherence.

Avoiding sure loss

I represent my beliefs about the possible winner of the Euro Cup by saying that

P̄(a) = 0.55, P̄(b) = 0.25, P̄(c) = 0.4, P̄(d) = 0.1

P(a) = 0.45, P(b) = 0.2, P(c) = 0.35, P(d) = 0.05

where {a, b, c, d} = {Greece, Spain, Lithuania, Other}.

This means that the gambles I_a − 0.44, I_b − 0.19, I_c − 0.34 and I_d − 0.04 are all desirable for me. But if I accept all of them I get the sum

[I_a + I_b + I_c + I_d] − 1.01 = 1 − 1.01 = −0.01,

which produces a net loss of 0.01, no matter the outcome of the Cup.
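A small Python check (mine, not from the slides) that the combined transaction loses 0.01 whatever the outcome:

```python
# Buying each indicator at a price just below its stated lower probability
# gives a strictly negative total reward for every possible winner.
outcomes = ["a", "b", "c", "d"]               # Greece, Spain, Lithuania, Other
prices = {"a": 0.44, "b": 0.19, "c": 0.34, "d": 0.04}

for winner in outcomes:
    total = sum((1 if x == winner else 0) - prices[x] for x in outcomes)
    print(winner, round(total, 2))            # -0.01 in every case: a sure loss
```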

Avoiding sure loss: general definition

Let P be a lower prevision defined on a set of gambles K. It avoids sure loss iff

sup_{ω∈X} Σ_{k=1}^n [f_k(ω) − P(f_k)] ≥ 0

for any f_1, ..., f_n ∈ K. Otherwise, there is some ε > 0 such that

Σ_{k=1}^n [f_k − (P(f_k) − ε)] < −ε

no matter the outcome.
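For a finite X and finitely many assessed gambles, this condition can be tested numerically. The following is a sketch of mine (assuming numpy and scipy are available): it searches for a convex combination of the gambles f_k − P(f_k) that is negative everywhere, which exists exactly when the assessments incur a sure loss.

```python
import numpy as np
from scipy.optimize import linprog

def avoids_sure_loss(gambles, lower_prev, tol=1e-9):
    """gambles: (n, m) array whose k-th row is the gamble f_k on the m outcomes.
    lower_prev: length-n sequence with the assessments P(f_k)."""
    n, m = gambles.shape
    G = gambles - np.asarray(lower_prev, float)[:, None]   # rows: f_k - P(f_k)
    # Variables: weights lambda_1..lambda_n >= 0 and a free scalar t.
    # Minimise t subject to sum_k lambda_k * G[k, w] <= t for every outcome w,
    # with the normalisation sum_k lambda_k = 1.
    c = np.r_[np.zeros(n), 1.0]
    A_ub = np.hstack([G.T, -np.ones((m, 1))])
    A_eq = np.r_[np.ones(n), 0.0].reshape(1, -1)
    res = linprog(c, A_ub=A_ub, b_ub=np.zeros(m), A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * n + [(None, None)])
    return res.fun >= -tol    # an optimum t < 0 means a sure loss exists

indicators = np.eye(4)        # the gambles I_a, I_b, I_c, I_d
print(avoids_sure_loss(indicators, [0.45, 0.20, 0.35, 0.05]))   # False: sure loss
print(avoids_sure_loss(indicators, [0.45, 0.15, 0.30, 0.05]))   # True
```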

Consequences of avoiding sure loss

- P(f) ≤ sup f.
- P(µ) ≤ µ ≤ P̄(µ) for all µ ∈ R.
- If f ≥ g + µ, then P̄(f) ≥ P(g) + µ.
- P(λf + (1 − λ)g) ≤ λP̄(f) + (1 − λ)P̄(g).
- P(f + g) ≤ P̄(f) + P̄(g).

Coherence

After reflecting a bit, I come up with the assessments:

P̄(a) = 0.55, P̄(b) = 0.25, P̄(c) = 0.4, P̄(d) = 0.1

P(a) = 0.45, P(b) = 0.15, P(c) = 0.30, P(d) = 0.05

These assessments avoid sure loss. However, they imply that the transaction

I_a − 0.44 + I_c − 0.29 + I_d − 0.04 = 0.23 − I_b

is acceptable for me, which means that I am disposed to bet against Spain at a rate of 0.23, smaller than P̄(b) = 0.25. This indicates that P̄(b) is too large.
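A quick Python check (mine) that the three purchases indeed add up to the transaction 0.23 − I_b:

```python
# The three purchases combine into a bet against b (Spain) at rate 0.23,
# below the stated upper probability of 0.25.
outcomes = ["a", "b", "c", "d"]
buys = {"a": 0.44, "c": 0.29, "d": 0.04}      # prices just under the lower probabilities

for winner in outcomes:
    combined = sum((1 if x == winner else 0) - price for x, price in buys.items())
    against_b = 0.23 - (1 if winner == "b" else 0)
    print(winner, round(combined, 2), round(against_b, 2))   # the two numbers coincide
```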

Coherence: general definition

A lower prevision P is called coherent when, given gambles f_0, f_1, ..., f_n in its domain and m ∈ N,

sup_{ω∈X} { Σ_{i=1}^n [f_i(ω) − P(f_i)] − m [f_0(ω) − P(f_0)] } ≥ 0.

Otherwise, there is some ε > 0 such that

Σ_{i=1}^n [f_i − (P(f_i) − ε)] < m [f_0 − P(f_0) − ε],

and P(f_0) + ε would be an acceptable buying price for f_0.

Coherence on linear spaces

Suppose the domain K is a linear space of gambles:

- If f, g ∈ K, then f + g ∈ K.
- If f ∈ K and λ ∈ R, then λf ∈ K.

Then P is coherent if and only if, for any f, g ∈ K and λ ≥ 0, the following hold (illustrated numerically in the sketch below):

- P(f) ≥ inf f.
- P(λf) = λP(f).
- P(f + g) ≥ P(f) + P(g).
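The following Python sketch (my own construction, with two invented probability mass functions) spot-checks these three conditions for a lower prevision defined as a lower envelope of expectation operators; the link between lower envelopes and coherence is made precise a few slides further on.

```python
import numpy as np

rng = np.random.default_rng(0)
# Two hypothetical probability mass functions on a 4-element space.
pmfs = np.array([[0.45, 0.15, 0.30, 0.10],
                 [0.50, 0.20, 0.25, 0.05]])

def lower_prev(f):
    """Lower envelope of the expectations of f: min over the pmfs of E[f]."""
    return float(np.min(pmfs @ f))

for _ in range(1000):
    f, g = rng.uniform(-10, 10, size=(2, 4))
    lam = rng.uniform(0, 5)
    assert lower_prev(f) >= f.min() - 1e-12                        # P(f) >= inf f
    assert np.isclose(lower_prev(lam * f), lam * lower_prev(f))    # P(lam f) = lam P(f)
    assert lower_prev(f + g) >= lower_prev(f) + lower_prev(g) - 1e-9   # superadditivity
print("all checks passed")
```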

Consequences of coherence

Whenever the gambles involved belong to the domain of P and P̄:

- P(∅) = P̄(∅) = 0, P(X) = P̄(X) = 1.
- A ⊆ B ⇒ P(A) ≤ P(B) and P̄(A) ≤ P̄(B).
- P(f) + P(g) ≤ P(f + g) ≤ P(f) + P̄(g) ≤ P̄(f + g) ≤ P̄(f) + P̄(g).
- P(λf) = λP(f) and P̄(λf) = λP̄(f) for λ ≥ 0.
- If f_n → f uniformly, then P(f_n) → P(f) and P̄(f_n) → P̄(f).

Consequences of coherence (II)

- λP(f) + (1 − λ)P(g) ≤ P(λf + (1 − λ)g) for all λ ∈ [0, 1].
- P(f + µ) = P(f) + µ for all µ ∈ R.
- The lower envelope of a set of coherent lower previsions is coherent.
- A convex combination of coherent lower previsions (with the same domain) is coherent.
- The point-wise limit (or limit inferior) of a sequence of coherent lower previsions is coherent.

Linear previsions

When K = −K := {−f : f ∈ K} and P(f) = P̄(f) for all f ∈ K, then P = P̄ and their common value P is called a linear or precise prevision on K. If K is a linear space, this is equivalent to

- P(f) ≥ inf f,
- P(f + g) = P(f) + P(g),

for all f, g ∈ K.

These are the previsions considered by de Finetti. We shall denote by P(X) the set of all linear previsions on X.

Linear previsions and probabilities

A linear prevision P defined on indicators of events only is a finitely additive probability.

Conversely, a linear prevision P defined on the set L(X) of all gambles is characterised by its restriction to the set of events, which is a finitely additive probability on the power set of X, through the expectation operator.

Coherence and precise previsions

Given a lower prevision P on K, we can consider the set of dominating linear previsions

M(P) := {Q ∈ P(X) : Q(f) ≥ P(f) for all f ∈ K}.

- P avoids sure loss ⇐⇒ M(P) ≠ ∅.
- P is coherent ⇐⇒ P = min M(P).

There is a one-to-one correspondence between coherent lower previsions and (closed and convex) sets of linear previsions.

This correspondence gives coherent lower previsions a sensitivity analysis interpretation.
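A computational sketch of this characterisation (mine; it assumes numpy and scipy and a finite X): the minimum of Q(f) over Q ∈ M(P) is a linear program.

```python
import numpy as np
from scipy.optimize import linprog

def lower_envelope(gambles, lower_prev, f):
    """min { E_Q[f] : Q a probability mass function with E_Q[f_k] >= P(f_k) for all k }.
    Returns None when M(P) is empty, i.e. when the assessments incur sure loss."""
    n, m = gambles.shape
    res = linprog(np.asarray(f, float),
                  A_ub=-np.asarray(gambles, float),    # encodes E_Q[f_k] >= P(f_k)
                  b_ub=-np.asarray(lower_prev, float),
                  A_eq=np.ones((1, m)), b_eq=[1.0],
                  bounds=[(0, None)] * m)
    return res.fun if res.success else None

ind = np.eye(4)                              # indicators of a, b, c, d
low = [0.45, 0.15, 0.30, 0.05]               # the coherent lower probabilities
print(lower_envelope(ind, low, ind[1]))      # 0.15: the minimum over M(P) recovers P(b),
                                             # as coherence (P = min M(P)) requires
```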

Example (cont.)

Consider the coherent assessments:

P̄(a) = 0.5, P̄(b) = 0.2, P̄(c) = 0.35, P̄(d) = 0.1

P(a) = 0.45, P(b) = 0.15, P(c) = 0.30, P(d) = 0.05

The equivalent set of linear previsions represents the possible models for the probabilities of each team being the winner:

M(P) := {(p_a, p_b, p_c, p_d) : p_a + p_b + p_c + p_d = 1, p_a ∈ [0.45, 0.5], p_b ∈ [0.15, 0.2], p_c ∈ [0.3, 0.35], p_d ∈ [0.05, 0.1]}.

To see that the bounds are attained, it suffices to consider the following elements of M(P): (0.45, 0.15, 0.3, 0.1), (0.45, 0.2, 0.3, 0.05), (0.5, 0.15, 0.3, 0.05), (0.45, 0.15, 0.35, 0.05).
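A short Python check (mine) that the four listed mass functions lie in M(P) and attain every bound:

```python
import numpy as np

pts = np.array([[0.45, 0.15, 0.30, 0.10],
                [0.45, 0.20, 0.30, 0.05],
                [0.50, 0.15, 0.30, 0.05],
                [0.45, 0.15, 0.35, 0.05]])
low = np.array([0.45, 0.15, 0.30, 0.05])
upp = np.array([0.50, 0.20, 0.35, 0.10])

assert np.allclose(pts.sum(axis=1), 1)                       # each row is a pmf
assert ((pts >= low - 1e-12) & (pts <= upp + 1e-12)).all()   # each respects the intervals
print(pts.min(axis=0))   # [0.45 0.15 0.3  0.05] -> every lower bound is attained
print(pts.max(axis=0))   # [0.5  0.2  0.35 0.1 ] -> every upper bound is attained
```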

Sets of desirable gambles

Given a lower prevision P, we can consider the set of gambles

D := {f ∈ K : P(f ) ≥ 0},

the set of associated desirable gambles. Conversely, given a set of gambles D we can define

P(f ) := sup{µ : f − µ ∈ D}

Rationality axioms for sets of desirable gambles

If we consider a set of gambles that we find desirable, there are a number of rationality requirements we can consider:

- A gamble that makes us lose money, no matter the outcome, should not be desirable, and a gamble which never makes us lose money should be desirable.
- A change of utility scale should not affect our assessments of desirability.
- If two transactions are desirable, so should be their sum.

These ideas define the notion of coherence for sets of desirable gambles.

Coherence of sets of desirable gambles

A set of desirable gambles is coherent if and only if

- If sup f < 0, then f ∉ D.
- If f ≥ 0, then f ∈ D.
- If f, g ∈ D, then f + g ∈ D.
- If f ∈ D and λ ≥ 0, then λf ∈ D.
- If f + ε ∈ D for all ε > 0, then f ∈ D.

Moreover:

- If D is a coherent set of desirable gambles, then the lower prevision it induces is coherent.
- Conversely, a coherent lower prevision P determines a coherent set of desirable gambles through the previous formula.

Hence, we have three equivalent representations of our beliefs:

- Coherent lower and upper previsions.
- Closed and convex sets of linear previsions.
- Coherent sets of desirable gambles,

and we can easily go from any of these formulations to the others.

Is coherence too strong?

Some criticisms of the property of coherence are:

- Descriptive decision theory shows that people's beliefs sometimes violate the notion of coherence.
- Coherent lower previsions may be difficult to assign for people not familiar with the behavioural theory of imprecise probabilities.
- Other rationality criteria may also be interesting.

Inference: natural extension

Consider the following gambles:

f(a) = 5, f(b) = 2, f(c) = −5, f(d) = −10

g(a) = 2, g(b) = −2, g(c) = 0, g(d) = 5,

and assume we make the assessments P(f) = 2, P(g) = 0. Can we deduce anything about how much we should pay for the gamble

h(a) = 7, h(b) = 4, h(c) = −5, h(d) = 0?

For instance, since h ≥ f + g, we should be disposed to pay at least P(f) + P(g) = 2. But can we be more specific?

Definition

Consider a coherent lower prevision P with domain K. We seek to determine the consequences of the assessments on K for gambles outside the domain.

The natural extension of P to all gambles is given by

E(f) := sup{µ : ∃ f_k ∈ K, λ_k ≥ 0, k = 1, ..., n, such that f − µ ≥ Σ_{k=1}^n λ_k [f_k − P(f_k)]}.

E(f) is the supremum acceptable buying price for f that can be derived from the assessments on the gambles in the domain.

Example

Applying this definition, we obtain that E(h) = 3.4, by considering

h − 3.4 ≥ 1.2 (f − P(f)).

Hence, the coherent assessments P(f) = 2, P(g) = 0 imply that we should pay at least 3.4 for the gamble h, but not more.
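The value 3.4 can be reproduced from the definition of the natural extension by solving a small linear program; the following Python sketch (mine, assuming numpy and scipy) maximises µ subject to h − µ ≥ λ_f (f − P(f)) + λ_g (g − P(g)) pointwise:

```python
import numpy as np
from scipy.optimize import linprog

f = np.array([5, 2, -5, -10], float)
g = np.array([2, -2, 0, 5], float)
h = np.array([7, 4, -5, 0], float)
G = np.vstack([f - 2.0, g - 0.0])         # the gambles f - P(f) and g - P(g)

# Variables: lambda_f, lambda_g >= 0 and mu (free); maximise mu subject to
# lambda_f (f - 2) + lambda_g g + mu <= h in every outcome.
c = np.array([0.0, 0.0, -1.0])
A_ub = np.hstack([G.T, np.ones((4, 1))])
res = linprog(c, A_ub=A_ub, b_ub=h,
              bounds=[(0, None), (0, None), (None, None)])
print(-res.fun)    # 3.4, matching E(h) above (attained with lambda_f = 1.2, lambda_g = 0)
```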

Natural extension: properties

- If P does not avoid sure loss, then E(f) = +∞ for any gamble f.
- If P avoids sure loss, then E is the smallest coherent lower prevision on L(X) that dominates P on K.
- P is coherent if and only if E coincides with P on K.
- E is then the least-committal extension of P: if there are other extensions, they reflect stronger assessments than those in P.

In terms of sets of linear previsions

Given a lower prevision P and its set of dominating linear previsions M(P), the natural extension E of P is the lower envelope of M(P). This provides the natural extension with a sensitivity analysis interpretation.

P is coherent if and only if M(E) = M(P).

In terms of sets of gambles

Consider a coherent set of desirable gambles D. Its natural extension E is the set of gambles

E := {g ∈ L(X) : (∀δ > 0)(∃ n ≥ 0, λ_k ∈ R+, f_k ∈ D) g ≥ Σ_{k=1}^n λ_k f_k − δ}.

It is the smallest coherent set of desirable gambles that contains D. It is the smallest closed convex cone including D and all non-negative gambles.

All these procedures of natural extension agree with one another: if we consider a coherent lower prevision P, its set of desirable gambles D_P, the natural extension E_{D_P} of this set, and then the coherent lower prevision associated to E_{D_P}, we obtain the natural extension of P.

Hence, we have three equivalent ways of representing our behavioural dispositions:

- Coherent lower previsions.
- Sets of linear previsions.
- Sets of desirable gambles.

Particular cases

The natural extension coincides with some familiar extension procedures for particular cases of coherent lower previsions:

- Lebesgue integration with respect to a probability measure.
- Choquet integration with respect to 2-monotone lower probabilities.
- Bayes' rule for probability measures.
- Robust Bayesian analysis.
- Logical deduction.

Conditional lower previsions

- Definition.
- Consistency requirements.
- Natural extension.

Updating information

So far, we have assumed that all we know about the outcome of the experiment is that it belongs to the set X.

But we may have some additional information about this outcome, for instance that it belongs to a set B.

We then need to update our assessments by means of a conditional lower prevision.

Example (cont.)

We are in the semifinals of the Euro Cup of basketball, and the remaining teams are Spain, Lithuania, France, and Serbia.

For the gamble f on {a, b, c, d} = {Greece, Spain, Lithuania, Other} given by f(a) = 5, f(b) = 2, f(c) = −5, f(d) = −10, I had given the supremum buying price P(f) = 2.

But now I should probably lower this supremum buying price, unless I am certain that Spain will be the winner!

Definition

Consider a subset B of X and a gamble f on X.

P(f|B) represents our supremum acceptable buying price for f, after we come to know that the outcome of the experiment belongs to B.

If we consider a partition ℬ of X, we define P(f|ℬ) as the gamble that takes the value P(f|B) on each element B of ℬ. It is called a conditional lower prevision.

Separate coherence

A first consistency requirement is that the updated assessments are separately coherent. This means that:

- P(B|B) = 1 for any B ∈ ℬ.
- P(·|B) is a coherent lower prevision.
- Consequence: P(·|B) is determined by its values on B: for any B ∈ ℬ,

  I_B h = I_B h′ ⇒ P(h|B) = P(h′|B).

Example (cont.)

The assessments P(a) = 0.45, P(b) = 0.15, P(c) = 0.3, P(d) = 0.05 that I gave before the championships started are not coherent any more: separate coherence implies that

P(a|{Spain, Lithuania, France, Serbia}) = 0.

I should have P({b, c, d}|{Spain, Lithuania, France, Serbia}) = 1.

Separate coherence: equivalent formulation

If the domain K of P(·|ℬ) is a linear space, this holds if and only if, for any λ ≥ 0, f, g ∈ K and B ∈ ℬ:

- P(f|B) ≥ inf_{x∈B} f(x).
- P(f + g|B) ≥ P(f|B) + P(g|B).
- P(λf|B) = λ P(f|B).

Consistency with the initial assessments

Not only do our updated lower previsions have to be coherent; we also need them to be coherent with the initial assessments.

For instance, if we consider a gamble f on {a, b, c, d} given by f(a) = −1, f(b) = 0, f(c) = 1, f(d) = 2 and we make P(f) = 1.5, it does not make sense that, if we learn that the outcome of the experiment is either c or d, we then make P(f|{c, d}) = 1.

The connection between unconditional and conditional lower previsions follows from the updating principle:

- We are disposed to accept a gamble f after observing B if and only if the gamble I_B f is desirable.

Coherence of conditional and unconditional previsions

Consider an unconditional lower prevision P on a linear space of gambles K and a conditional lower prevision P(·|ℬ) with linear domain H. They are coherent if and only if, for any f_1, f_2 ∈ K, g_1, g_2 ∈ H and B ∈ ℬ:

- sup_x [f_1 − P(f_1) + g_1 − P(g_1|ℬ) − (f_2 − P(f_2))](x) ≥ 0.
- sup_x [f_1 − P(f_1) + g_1 − P(g_1|ℬ) − I_B(g_2 − P(g_2|B))](x) ≥ 0.

Interpretation

The interpretation of this condition is that the supremum acceptable buying price for a gamble (either conditional or unconditional) should not be raised by considering other (conditional or unconditional) gambles.

A similar condition can be given for non-linear domains.

Equivalent conditions

It follows that if P and P(·|ℬ) are coherent and the domains are rich enough, then

P(I_B(f − P(f|B))) = 0

for any B ∈ ℬ. This is called the Generalised Bayes Rule.

- If P(B) > 0, then P(f|B) is the unique value that satisfies the Generalised Bayes Rule.
- In that case, P(f|B) can be calculated as the lower envelope of the values Q(f|B), where Q ranges over the linear previsions dominating P and Q(f|B) is obtained using Bayes' rule.

Example (cont.)

Given our initial coherent assessments

P̄(a) = 0.5, P̄(b) = 0.2, P̄(c) = 0.35, P̄(d) = 0.1

P(a) = 0.45, P(b) = 0.15, P(c) = 0.30, P(d) = 0.05,

and if we know that the outcome will belong to {b, c, d}, we can update them using the envelope theorem, obtaining

P(b|{b, c, d}) = 3/11, P(c|{b, c, d}) = 6/11, P(d|{b, c, d}) = 1/11.
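These numbers can be reproduced computationally. The sketch below (mine, assuming numpy and scipy) evaluates lower expectations over M(P) by linear programming and then solves the Generalised Bayes Rule introduced above by bisection:

```python
import numpy as np
from scipy.optimize import linprog

low = np.array([0.45, 0.15, 0.30, 0.05])   # lower probabilities of a, b, c, d
upp = np.array([0.50, 0.20, 0.35, 0.10])   # upper probabilities

def nat_ext(f):
    """E(f) = min { E_p[f] : p a pmf with low <= p <= upp }."""
    res = linprog(f, A_eq=np.ones((1, 4)), b_eq=[1.0], bounds=list(zip(low, upp)))
    return res.fun

def conditional_lower(f, B, n_iter=60):
    """P(f|B) via the Generalised Bayes Rule, i.e. the beta with E(I_B (f - beta)) = 0.
    Valid here because E(B) > 0; the root is found by bisection."""
    I_B = np.isin(np.arange(4), B).astype(float)
    lo, hi = f[B].min(), f[B].max()
    for _ in range(n_iter):
        mid = (lo + hi) / 2
        if nat_ext(I_B * (f - mid)) >= 0:
            lo = mid                        # beta is still an acceptable buying price
        else:
            hi = mid
    return lo

B = [1, 2, 3]                               # the event {b, c, d}
for i, name in zip(B, "bcd"):
    ind = np.zeros(4); ind[i] = 1.0
    print(name, conditional_lower(ind, B))  # 0.2727... (3/11), 0.5454... (6/11), 0.0909... (1/11)
```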

GBR for linear previsions

When P and P(·|ℬ) are linear and the partition ℬ is finite, the GBR becomes

P(f|B) = P(f I_B) / P(B)   if P(B) > 0.

More generally, if ℬ is infinite, coherence is equivalent to P(f) = P(P(f|ℬ)) for any gamble f, which is stronger than the GBR.
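To see why the GBR reduces to Bayes' rule in this case (a short derivation of mine, using only the linearity of P):

```latex
0 \;=\; P\bigl(I_B\,(f - P(f\mid B))\bigr)
  \;=\; P(f\,I_B) - P(f\mid B)\,P(I_B)
  \;=\; P(f\,I_B) - P(f\mid B)\,P(B),
\qquad\text{hence}\qquad
P(f\mid B) \;=\; \frac{P(f\,I_B)}{P(B)} \quad\text{whenever } P(B) > 0 .
```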

Natural extension

As in the unconditional case, we can study the behavioural consequences of the assessments given by coherent conditional and unconditional previsions. Given coherent P on K and P(·|ℬ) on H, their natural extension is

E(f) = sup{µ : ∃ f_1 ∈ K, f_2 ∈ H such that f − µ ≥ f_1 − P(f_1) + f_2 − P(f_2|ℬ)},

and

E(f|B) = max{β : E(I_B(f − β)) ≥ 0}                                  if E(B) > 0,
E(f|B) = sup{β : I_B(f − β) ≥ I_B(g − P(g|B)) for some g ∈ H}        otherwise,

for all f ∈ L(X).

Problems with the natural extension

The definition of the natural extension does not always have the properties of the natural extension from the unconditional case:

- In some cases there are no coherent extensions.
- If there are, the natural extension may only be a lower bound of the smallest coherent extensions.
- It provides the smallest coherent extensions when the partition ℬ is finite.

Other types of extensions

- Regular extension: we compute the lower envelope of the conditional linear previsions dominating P, P(·|ℬ).
- Marginal extension: we have a conditional lower prevision P(·|ℬ) on L(X) and a lower prevision P on the gambles that are constant on the elements of ℬ.
