
Multi-Agent Belief Dynamics

Alexandru Baltag, Oxford University
website: http://alexandru.tiddlyspot.com

Sonja Smets, University of Groningen
website: http://sonja.tiddlyspot.com


Combining two paradigms: DEL and BR

TOPIC: Logics for reasoning about multi-agent belief revision and knowledge updates induced by learning, communication and interaction.

Methodology: Extend the DEL (Dynamic Epistemic Logic) setting and methods to integrate ideas from classical BR (Belief Revision theory).

Applications: AI (multi-agent systems), CS (distributed computation), Philosophy (Epistemology), Economics (Game Theory), Social-Political Sciences (Social Choice Theory), Linguistics (pragmatics of natural language), etc.


Material: slides and papers

Core Material:

A. Baltag, L.S. Moss and H.P. van Ditmarsch, Epistemic Logic and Information Update. In P. Adriaans and J. van Benthem (eds.), Philosophy of Information, part of Handbook of the Philosophy of Science, vol. 8, pp. 361-465, Elsevier, 2008.

A. Baltag and S. Smets, A Qualitative Theory of Dynamic Interactive Belief Revision. In G. Bonanno, W. van der Hoek and M. Wooldridge (eds.), Logic and the Foundations of Game and Decision Theory, Texts in Logic and Games, vol. 3, pp. 9-58, Amsterdam University Press, 2008.


Optional:

H.P. van Ditmarsch, W. van der Hoek and B.P. Kooi, Dynamic Epistemic Logic, Synthese Library 337, Springer, 2007.

R. Fagin, J.Y. Halpern, Y. Moses and M.Y. Vardi, Reasoning about Knowledge, MIT Press, 1995.

P. Gärdenfors, Knowledge in Flux: Modelling the Dynamics of Epistemic States, MIT Press, 1988.

P. Gärdenfors (ed.), Belief Revision, Cambridge University Press, 1992.

P. Gochet and P. Gribomont, Epistemic Logic. In D.M. Gabbay and J. Woods (eds.), Handbook of the History of Logic, vol. 7, pp. 99-195, Elsevier, 2006.

J.Y. Halpern, Reasoning about Uncertainty, MIT Press, 2003.

S.O. Hansson, A Textbook of Belief Dynamics, Kluwer Academic Publishers, Dordrecht, 1999.


J. Hintikka, Knowledge and Belief, Cornell University Press, 1962

J.-J.Ch. Meyer and W. van der Hoek, Epistemic Logic for AI and Computer Science, Cambridge Tracts in Theoretical Computer Science, no. 41, Cambridge University Press, 1995.


Plan of Lecture 1

1.1 Introduction: Examples, Stories and Puzzles.

1.2 Kripke Models, Epistemic Models, Doxastic Models. Logics: S5, S4, KD45.

1.3 Dynamics: (Single-Agent) Updates. The Problem of Belief Revision.


1.1. Examples, Stories and Puzzles

Examples of Multi-agent Systems:

1. Computation: a network of communicating computers; the Internet

2. Games: players in a game, e.g. chess or poker

3. AI: a team of robots exploring their environment and interacting with each other

4. Cryptographic Communication: some communicating agents (“principals”) following a cryptographic protocol to communicate in a private and secret way


5. Economics: economic agents engaged in transactions in a market

6. Society: people engaged in social activities

7. Politics: “political games”, diplomacy, war.


“Dynamic” and “informational” systems

Such multi-agent systems are dynamic: agents “do” some actions, changing the system by interacting with each other. Examples of actions: moves in a game, communicating (sending, receiving or intercepting) messages, buying/selling, etc.

On the other hand, these systems are also informational systems: agents acquire, store, process and exchange information about each other and the environment. This information may be truthful, and then it’s called knowledge. Or the information may be only plausible (or probable), well-justified, but still possibly false; then it’s called (justified) belief.


Nested Knowledge in Chinese Philosophy

Chuangtze and Hueitse had strolled onto a bridge over the Hao, when the former observed,

“See how the small fish are darting about! That is the happiness of the fish.”

“You are not fish yourself,” said Hueitse, “so how can you know the happiness of the fish?”

“You are not me,” retorted Chuangtse, “so how can you know that I do not know?”

Chuangtze, c. 300 B.C.


Self-Nesting: (Lack of) Introspection

As we know,
There are known knowns.
There are things we know we know.
We also know
There are known unknowns.
That is to say
We know there are some things
We do not know.
But there are also unknown unknowns,
The ones we don’t know
We don’t know.

Donald Rumsfeld, Feb. 12, 2002, Department of Defense news briefing


... And Belief?

“Man is made by his belief. As he believes, so he is.”

(Bhagavad Gita, part of the epic poem Mahabharata)

“Myths which are believed in tend to become true.”

(George Orwell)

“To succeed, we must first believe that we can.”

(Michael Korda)

“By believing passionately in something that does not yet exist, we create it.”

(Nikos Kazantzakis)


“The thing always happens that you really believe in; and the belief in a thing makes it happen.”

(Frank Lloyd Wright)


Oh, really?! But this is a Lie!

So what?

Everyone lies online. In fact, readers expect you to lie. If you don’t, they’ll think you make less than you actually do. So the only way to tell the truth is to lie.

(Brad Pitt’s thoughts on lying about how much money you make on your online dating profile; August 2009 interview with “Wired” magazine)


Well, but all these believing, lying and cheating interactions can only end up in extreme skepticism:

“I don’t even believe the truth anymore.”

(J. Edgar Hoover, the founder of the FBI)

Though even this was already anticipated centuries ago by the most famous pirate of the Caribbean:


Mullroy: What’s your purpose in Port Royal, Mr. Smith?

Murtogg: Yeah, and no lies.

Jack Sparrow: Well, then, I confess, it is my intention to commandeer one of these ships, pick up a crew in Tortuga, raid, pillage, plunder and otherwise pilfer my weaselly black guts out.

Murtogg: I said no lies.

Mullroy: I think he’s telling the truth.

Murtogg: Don’t be stupid: if he were telling the truth, he wouldn’t have told it to us.

Jack Sparrow: Unless, of course, he knew you wouldn’t believe the truth even if he told it to you.


Back to the real world!

The only escape from these infinite loops seems to be the solid ground of the real world.

“Reality is that which, when you stop believing in it, doesn’t go away.”

(Philip K. Dick)

But how to get back to reality, from the midst of our mistaken beliefs?


Answer: Belief Revision!

Dare to confront your mistakes! Learn to give up!

“It is not bigotry to be certain we are right; but it is bigotry to be unable to imagine how we might be wrong.”

(G. K. Chesterton)


Belief revision is action!

True belief revision is “dynamic”: a sustained, self-correcting, truth-tracking action. True knowledge can only be recovered by effort.

So, finally, we get to what we could call the “Motto” of Dynamic-Epistemic Logic:

“The wise sees action and knowledge as one. They see truly.”

(Bhagavad Gita, once again)


Knowledge and Uncertainty

Uncertainty is a corollary of imperfect knowledge (or “imperfect information”): a game of imperfect information is one in which some moves are hidden, so that the players don’t know all that is going on: they only have a partial view of the situation. Example: poker (in contrast to chess).

A player may be uncertain about the real situation of the game at a given time: e.g. they simply cannot distinguish between a situation in which another player has a winning hand and a situation in which this is not the case. For all our player knows, these situations are both “possible”.


Evolving Knowledge

The knowledge a player has may change in time, due to his or other players’ actions.

For instance, he can make some move that allows him to learn some of the cards of the other player. As a general rule, players try to minimize their uncertainty and increase their knowledge.


Wrong Beliefs: Cheating

In their drive for more knowledge and less uncertainty, players may be induced to acquire a false “certainty”: they will “know” things that are not true.

Example: bluffing (in poker) may induce your opponent to believe you have a winning hand, when in fact you don’t.

Notice that such a wrong belief, once it becomes “certainty”, might look just like knowledge (to the believer):

your opponent may really think he “knows” you have a winning hand.


“Everybody knows...”

Suppose that, in fact, everybody knows the road rules in France. For instance, everybody knows that a red light means “stop” and a green light means “go”. And suppose everybody respects the rules that (s)he knows.

Question: Is this enough for you to feel safe, as a driver?

Answer: NO.

Why? Think about it!


Common Knowledge

Suppose the road rules (and the fact they are respected) are common knowledge: everybody knows (and respects) the rules, and everybody knows that everybody knows (and respects) the rules, and... etc.

Now, you can drive safely!


Epistemic Puzzle no. 1: To learn is to falsify

Our starting example concerns a “love triangle”: suppose that Alice and Bob are a couple, but Alice has just started an affair with Charles.

At some point, Alice sends Charles an email, saying:

“Don’t worry, Bob doesn’t know about us”.

But suppose now that Bob accidentally reads the message (by, say, secretly breaking into Alice’s email account).

Then, paradoxically enough, after seeing (and believing) the message which says he doesn’t know..., he will know!


So, in this case, learning the message is a way to falsify it.

As we’ll see, this example shows that standard belief-revision postulates may fail to hold in such complex learning actions, in which the message to be learned refers to the knowledge of the hearer.


Epistemic Puzzle no. 2: Self-fulfilling falsehoods

Suppose Alice becomes somehow convinced that Bob knows everything (about the affair).

This is false (Bob doesn’t have a clue), but nevertheless she’s so convinced that she makes an attempt to warn Charles by sending him a message:

“Bob knows everything about the affair!”

As before, Bob secretly reads (and believes) the message. While false at the moment of its sending, the message becomes true: now he knows.


So, communicating a false belief (i.e. Alice’s action) might be a self-fulfilling prophecy: Alice’s false belief, once communicated, becomes true.

At the same time, the action of (reading and) believing a falsehood (i.e. Bob’s action) can be self-fulfilling: the false message, once believed, becomes true.


Epistemic Puzzle no. 3: Self-enabling falsehoods

Suppose that in fact Alice was faithful, despite all the attempts made by Charles to seduce her.

Out of despair, Charles comes up with a “cool” plan of how to break up the marriage:

he sends an email which is identical to the one in the second puzzle (bearing Alice’s signature and warning Charles that Bob knows about their affair). Moreover, he makes sure somehow that Bob will have the opportunity to read the message.

Knowing Bob’s quick temper, Charles expects him to sue for a divorce; knowing Alice’s fragile, volatile sensitivity, he also expects that, while on the rebound, she’d be open to a possible relationship with himself (Charles).


The plan works: as a result, Bob is misled into “knowing” that he has been cheated on.

He promptly sends Alice a message saying: “I’ll see you in court”.

After the divorce, Charles makes his seductive move, playing the friend-in-need. Again, the original message becomes true: now, Alice does have an affair with Charles, and Bob knows it.

Sending a false message has enabled its validation.


Movie: The Men Who Stare at Goats

About the U.S. Army’s exploration of psi research and military applications of the paranormal.

General Brown: When did the Soviets begin this type of research?

Brigadier General Dean Hopgood: Well, Sir, it looks like they found out about our attempt to telepathically communicate with one of our nuclear subs, the Nautilus, while it was under the polar cap.

General Brown: What attempt?

Dean: There was no attempt. It seems the story was a French hoax.


Dean: But the Russians think the story about the story being a French hoax is just a story, Sir.

General Brown: So they started doing psi research because they thought we were doing psi research, when in fact we weren’t doing psi research?

Dean: Yes sir. But now that they *are* doing psi research, we’re gonna have to do psi research, sir.

Dean: We can’t afford to have the Russians leading the field in the paranormal.


Epistemic Puzzle no. 4: Muddy Children

Suppose there are 4 children, all of them good logicians, exactly 3 of them having dirty faces. Each can see the faces of the others, but doesn’t see his/her own face.

The father publicly announces:

“At least one of you is dirty”.

Then the father does another paradoxical thing: he starts repeating over and over the same question: “Do you know if you are dirty or not, and if so, which of the two?”


After each question, the children have to answer publicly, sincerely and simultaneously, based only on their knowledge, without taking any guesses. No other communication is allowed and nobody can lie.

One can show that, after 2 rounds of questions and answers, all the dirty children will come to know they are dirty! So they give this answer in the 3rd round, after which the clean child also comes to know she’s clean, giving the correct answer in the 4th round.
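This round-by-round reasoning can also be checked mechanically. Below is a minimal sketch (not from the slides) that encodes the puzzle as repeated elimination of possible worlds; the names (worlds, knows_own_state, the answer-profile filtering) are illustrative choices, not the authors’ notation.

```python
# Sketch: Muddy Children as repeated world elimination (illustrative, not from the slides).
from itertools import product

n, k = 4, 3                                            # 4 children, exactly 3 of them dirty
actual = tuple([1] * k + [0] * (n - k))                # 1 = dirty, 0 = clean
# Father's announcement "at least one of you is dirty" removes the all-clean world:
worlds = [w for w in product([0, 1], repeat=n) if sum(w) >= 1]

def knows_own_state(i, w, worlds):
    # Child i knows her own face in world w if every world she cannot distinguish
    # from w (same faces of the others) agrees on her own face.
    indist = [v for v in worlds if all(v[j] == w[j] for j in range(n) if j != i)]
    return len({v[i] for v in indist}) == 1

rnd = 0
while not all(knows_own_state(i, actual, worlds) for i in range(n)):
    rnd += 1
    answers = [knows_own_state(i, actual, worlds) for i in range(n)]   # public, truthful answers
    # Keep only the worlds compatible with everybody's announced (non-)knowledge:
    worlds = [w for w in worlds
              if all(knows_own_state(i, w, worlds) == answers[i] for i in range(n))]
    print(f"Round {rnd}: answers {answers}, {len(worlds)} worlds remain")

print(f"After the answers of round {rnd}, every child knows their own state.")
```

Running it reports "I don’t know" from everybody in the first two rounds, the three dirty children knowing (and saying so) in round 3, after which the clean child knows as well, exactly as described above.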


Muddy Children Puzzle continued

First Question: What’s the point of the father’s first announcement (“At least one of you is dirty”)?

Apparently, this message is not informative to any of the children: the statement was already known to everybody! But the puzzle wouldn’t work without it: in fact this announcement adds information to the system! The children implicitly learn some new fact, namely the fact that what each of them used to know in private is now public knowledge.


Second Question: What’s the point of the father’s repeated questions?

If the father knows that his children are good logicians, then at each step the father already knows the answer to his question, before even asking it! However, the puzzle wouldn’t work without these questions. In a way, it seems the father’s questions are “abnormal”, in that they don’t actually aim at filling a gap in the father’s knowledge; instead they are part of a Socratic strategy of teaching-through-questions.

Third Question: How can the children’s statements of ignorance lead them to knowledge?


Puzzle no. 5: Sneaky Children

Let us modify the last example a bit.

Suppose the children are somehow rewarded for answering as quickly as possible, but they are punished for incorrect answers; thus they are interested in getting to the correct conclusion as fast as possible.

Suppose also that, after the first round of questions and answers, two of the dirty children “cheat” on the others by secretly telling each other that they’re dirty, while none of the others suspects this can happen.


Honest Children Always Suffer

As a result, they both will answer truthfully “I know I am dirty” in the second round.

One can easily see that the third dirty child will be totally deceived, coming to the “logical” conclusion that... she is clean!

So, after giving this wrong answer in the third round, she ends up being punished for her credulity, despite her impeccable logic.


Clean Children Always Go Crazy

What happens to the clean child?

Well, assuming she doesn’t suspect any cheating, she is facing a contradiction: two of the dirty children answered too quickly, coming to know they’re dirty before they were supposed to know!

If the clean child simply updates her knowledge monotonically with this new information (and uses classical logic), then she ends up believing everything: she goes crazy!


Puzzle no. 6: Surprised Children

The students in a high-school class know for sure that the date of the exam has been fixed on one of the five (working) days of next week: it’ll be the last week of the term, and there’s got to be an exam, and only one exam.

But they don’t know on which day.

Now the Teacher announces to her students that the exam’s date will be a surprise: even on the evening before the exam, the students will still not be sure that the exam is tomorrow.


Paradoxical Argumentation

Intuitively, one can prove (by backward induction, starting with Friday) that, IF this announcement is true, then the exam cannot take place on any day of the week.

So, using this argument, the students come to “know” that the announcement is false: the exam CANNOT be a surprise.

GIVEN THIS, they feel entitled to dismiss the announcement, and... THEN, surprise: whenever the exam comes (say, on Tuesday), it WILL indeed be a complete surprise!


1.2. Single-Agent Epistemic-Doxastic Logics

Epistemic Logic was first formalized by Hintikka (1962), who also sketched the first steps in formalizing doxastic logic.

They were further developed and studied by philosophers and logicians (Parikh, Stalnaker, van Benthem etc.), computer scientists (Halpern, Vardi, Fagin etc.) and economists (Aumann, Brandenburger, Samet etc.).


Syntax of Epistemic-Doxastic Logic

ϕ ::= p | ¬ϕ | ϕ ∧ ϕ | Kϕ | Bϕ


Models for Single-Agent Information

We are given a set of “possible worlds”, meant to represent all the relevant epistemic/doxastic possibilities in a certain situation.

EXAMPLE 1: a coin is on the table, but the (implicit) agent doesn’t know (nor believe he knows) which face is up.

(H)    (T)


Knowledge or Belief

The universal quantifier over the domain of possibilities is interpreted as knowledge, or belief, by the implicit agent.

So we say the agent knows, or believes, a sentence ϕ if ϕ is true in all the possible worlds of the model.

The specific interpretation (knowledge or belief) depends on the context.

In the previous example, the agent doesn’t know (nor believe) that the coin lies Heads up, nor that it lies Tails up.


Learning: Update

EXAMPLE 2:

Suppose now the agent looks at the upper face of the coin and he sees it’s Heads up.

The model of the new situation is now:

(H)

Only one epistemic possibility has survived: the agent now knows/believes that the coin lies Heads up.


Update as World Elimination

In general, updating corresponds to world elimination: an update with a sentence ϕ is simply the operation of deleting all the non-ϕ possibilities.

After the update, the worlds not satisfying ϕ are no longer possible: the actual world is known not to be among them.


Truth and Reality

But is ϕ “really” true (in the “real” world), apart from the agent’s knowledge or beliefs?

For this, we need to specify which of the possible worlds is the actual world.


Real World

Suppose that, in the original situation (before learning), the coin indeed lay Heads up (though the agent didn’t know, or believe, this).

We represent this situation by marking the actual (“real” state of the) world with a red star:

(*H)    (T)


Mistaken Updates

But what if the real world is not among the “possible” ones? What if the agent’s sight was so bad that she only thought she saw the coin lying Heads up, when in fact it lay Tails up?

After the “update”, her epistemically-possible worlds are just:

(H)

but we cannot mark the actual world here, since it doesn’t belong to the agent’s model!


False Beliefs

Clearly, in this case, the model only represents the agent’s beliefs, but NOT her “knowledge” (in any meaningful sense): the agent believes that the coin lies Heads up, but this is wrong!

Knowledge is usually assumed to be truthful, but in this case the agent’s belief is false.

But still, how can we talk about “truth” in a model in which the actual world is not represented?!


Third-person Models

The solution is to go beyond the agent’s own model, by taking an “objective” (third-person) perspective: the real possibility is always in the model, even if the agent believes it to be impossible.

To point out which worlds are believed to be possible by the agent, we encircle them: these worlds form the “sphere of beliefs”.

“Belief” now quantifies ONLY over the worlds in this sphere, while “knowledge” still quantifies over ALL possible worlds.

EXAMPLE 3:

[ (H) ]    (*T)


Example 4

In the Surprise Exam story, a possible initial situation (BEFORE the Teacher’s announcement) might be given by:

[ (1)  (2)  (3)  (4)  (5) ]

where i means that the exam takes place on the i-th (working) day of the week.

This encodes an initial situation in which the student knows that there will be an exam on (exactly) one of the days, but he doesn’t know the day, and moreover he doesn’t have any special belief about this: he considers all days as being possible.

We are not told when the exam will take place: no red star.


Beliefs

EXAMPLE 5:

If, however, the Student believes (for some reason or another) that the exam will take place either on Monday or on Tuesday, then the correct representation is:

[ (1)  (2) ]    (3)  (4)  (5)

Again, we are not told when the exam is, so no red star.


However, if we are told that the exam is in fact on Thursday (though the student still doesn’t know this), then the model is:

[ (1)  (2) ]    (3)  (*4)  (5)

In this model, some of the student’s beliefs are false, since the real world does NOT belong to his “sphere of beliefs”.


Simple Models for Knowledge and Belief

For a set Φ of facts, a (single-agent, pointed) epistemic-doxastic model is a tuple

S = (S, S0, ‖.‖, s∗), consisting of:

1. A set S of “possible worlds” (or possible “states of the world”, also known as “ontic states”). S defines the agent’s epistemic state: these are the states that are “epistemically possible”.

2. A non-empty subset S0 ⊆ S, S0 ≠ ∅, called the “sphere of beliefs”, or the agent’s doxastic state: these are the states that are “doxastically possible”.

3. A map ‖.‖ : Φ → P(S), called the valuation, assigning to each p ∈ Φ a set ‖p‖S of states.

4. A designated world s∗ ∈ S, called the “actual world”.
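For concreteness, here is one possible encoding of such a pointed model in Python, instantiated with the mistaken-update coin scenario (the agent believes Heads while the real world is Tails); the class name EDModel and its field names are illustrative, not part of the slides.

```python
# Sketch: a single-agent pointed epistemic-doxastic model (illustrative encoding).
from dataclasses import dataclass

@dataclass
class EDModel:
    worlds: set        # S: all epistemically possible worlds
    sphere: set        # S0 ⊆ S, non-empty: the doxastically possible worlds
    val: dict          # ||.||: maps each atomic fact to the set of worlds where it holds
    actual: object     # s* ∈ S: the actual world

    def __post_init__(self):
        # The definition requires a non-empty sphere inside S and an actual world in S.
        assert self.sphere and self.sphere <= self.worlds and self.actual in self.worlds

# Example 3: the coin lies Tails up, but the agent believes it lies Heads up.
coin = EDModel(worlds={"H", "T"},
               sphere={"H"},
               val={"heads": {"H"}, "tails": {"T"}},
               actual="T")
```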


Interpretation

• The epistemic state S gives us an (implicit) agent’s state of knowledge: he knows the real world belongs to S, but cannot distinguish between the states in S, so cannot know which of them is the real one.

• The doxastic state S0 gives us the agent’s state of belief: he believes that the real world belongs to S0, but his beliefs are consistent with any world in S0.

• The valuation tells us which ontic facts hold in which world: we say that p is true at s if s ∈ ‖p‖.

• The actual world s∗ gives us the “real state” of the world: what really is the case.


Truth

For any world w in a model S and any sentence ϕ, we write

w |=S ϕ

if ϕ is true in the world w. When the model S is fixed, we skip the subscript and simply write w |= ϕ.

For atomic sentences, this is given by the valuation map:

w |= p iff w ∈ ‖p‖,

while for other propositional formulas it is given by the usual truth clauses:


w |= ¬ϕ iff w ⊭ ϕ,

w |= ϕ ∧ ψ iff w |= ϕ and w |= ψ,

w |= ϕ ∨ ψ iff either w |= ϕ or w |= ψ.

(We take ϕ ⇒ ψ to be just an abbreviation for ¬ϕ ∨ ψ, and ϕ ⇔ ψ to be an abbreviation for (ϕ ⇒ ψ) ∧ (ψ ⇒ ϕ).)


Interpretation Map

We can extend the valuation ‖p‖S to an interpretation map ‖ϕ‖S for all propositional formulas ϕ:

‖ϕ‖S := {w ∈ S : w |=S ϕ}.

Obviously, this has the property that

‖¬ϕ‖S = S \ ‖ϕ‖S,

‖ϕ ∧ ψ‖S = ‖ϕ‖S ∩ ‖ψ‖S,

‖ϕ ∨ ψ‖S = ‖ϕ‖S ∪ ‖ψ‖S.

We now want to extend the interpretation to all the sentences in doxastic-epistemic logic.


Knowledge and Belief

Knowledge is defined as “truth in all epistemically possible worlds”, while belief is “truth in all doxastically possible worlds”:

w |= Kϕ iff t |= ϕ for all t ∈ S,

w |= Bϕ iff t |= ϕ for all t ∈ S0.
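These truth clauses translate directly into a small evaluator. The sketch below (illustrative, continuing the EDModel encoding from above) represents formulas as nested tuples, an arbitrary choice rather than anything prescribed by the slides.

```python
# Sketch: evaluating formulas on an epistemic-doxastic model (continues the EDModel sketch).
def holds(m, w, phi):
    if isinstance(phi, str):                       # atomic sentence p
        return w in m.val[phi]
    op = phi[0]
    if op == "not":
        return not holds(m, w, phi[1])
    if op == "and":
        return holds(m, w, phi[1]) and holds(m, w, phi[2])
    if op == "K":                                  # truth in ALL epistemically possible worlds
        return all(holds(m, t, phi[1]) for t in m.worlds)
    if op == "B":                                  # truth in all worlds of the belief sphere
        return all(holds(m, t, phi[1]) for t in m.sphere)
    raise ValueError(f"unknown formula: {phi!r}")

# In the coin example the agent (wrongly) believes Heads, but does not know it:
print(holds(coin, coin.actual, ("B", "heads")))    # True  (a false belief)
print(holds(coin, coin.actual, ("K", "heads")))    # False (knowledge requires truth in T too)
```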


Validity

A sentence is valid over epistemic-doxastic models if it is true at every state in every epistemic-doxastic model.

A sentence is satisfiable (over epistemic-doxastic models) if it is true at some state in some epistemic-doxastic model.


Consequences

For all sentences ϕ, ψ, etc., the following are valid over epistemic-doxastic models:

1. Veracity of Knowledge:

Kϕ ⇒ ϕ

2. Positive Introspection of Knowledge:

Kϕ ⇒ KKϕ

3. Negative Introspection of Knowledge:

¬Kϕ ⇒ K¬Kϕ

4. Consistency of Belief:

¬B(ϕ ∧ ¬ϕ)


5. Positive Introspection of Belief:

Bϕ ⇒ BBϕ

6. Negative Introspection of Belief:

¬Bϕ ⇒ B¬Bϕ

7. Strong Positive Introspection of Belief:

Bϕ ⇒ KBϕ

8. Strong Negative Introspection of Belief:

¬Bϕ ⇒ K¬Bϕ

9. Knowledge implies Belief:

Kϕ ⇒ Bϕ
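As a quick (and by no means exhaustive) sanity check, one can test particular instances of some of these principles on the coin model with the evaluator sketched earlier; this only spot-checks instances, it does not prove validity.

```python
# Sketch: spot-checking instances of some listed principles on the coin model.
def implies(m, w, a, b):                     # material implication, evaluated at world w
    return (not holds(m, w, a)) or holds(m, w, b)

phi = "heads"
checks = [
    ("Veracity of Knowledge",         ("K", phi), phi),
    ("Strong Positive Introspection", ("B", phi), ("K", ("B", phi))),
    ("Knowledge implies Belief",      ("K", phi), ("B", phi)),
]
for name, ant, cons in checks:
    ok = all(implies(coin, w, ant, cons) for w in coin.worlds)
    print(name, "holds at every world of the coin model:", ok)   # True in each case
```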


Epistemic-Doxastic Logic: Sound and Complete Proof System

In fact, a sound and complete proof system for single-agent epistemic-doxastic logic can be obtained by taking as axioms the validities (1)-(4) and (7)-(9) above, together with “Kripke’s axioms” for knowledge and belief

K(ϕ ⇒ ψ) ⇒ (Kϕ ⇒ Kψ)

B(ϕ ⇒ ψ) ⇒ (Bϕ ⇒ Bψ)

together with the following inference rules:

Modus Ponens: From ϕ and ϕ ⇒ ψ infer ψ.

Necessitation: From ϕ infer Kϕ.


Generalization

Many philosophers deny that knowledge is introspective, and some philosophers deny that belief is introspective. In particular, both common usage and Platonic dialogues suggest that people may believe they know things that they don’t actually know.

Others of the above validities may also be debatable: e.g. some “crazy” agents may have inconsistent beliefs.

So it is convenient to have a more general semantics, in which the above principles do not necessarily hold, so that one can pick whichever principles one considers true.


Kripke Semantics

For a set Φ of facts, a Φ-Kripke model is a tuple

S = (S, (Ri)i∈I, ‖.‖, s∗)

consisting of

1. a set S of “possible worlds”

2. an (indexed) family of binary accessibility relations Ri ⊆ S × S

3. a valuation ‖.‖ : Φ → P(S), assigning to each p ∈ Φ a set ‖p‖S of states

4. a designated world s∗: the “actual” one.


Kripke Semantics: Modalities

For atomic sentences and for Boolean connectives, we use the same semantics (and notations) as on epistemic-doxastic models.

For each relation R in the indexed family and every sentence ϕ, we can define a new sentence [R]ϕ by (universally) quantifying over R-accessible worlds:

s |= [R]ϕ iff t |= ϕ for all t such that sRt.

The operator [R] is called a “(universal) Kripke modality”. When the relation R is unique, we can leave it implicit and abbreviate [R]ϕ as □ϕ.

The dual existential modality is given by

⟨R⟩ϕ := ¬[R]¬ϕ.

Again, when R is unique, we can abbreviate ⟨R⟩ϕ as ◇ϕ.
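A Kripke-style evaluator differs from the sphere-based one only in how the modality quantifies: it follows an accessibility relation instead of a fixed set of worlds. A minimal sketch (relations encoded as sets of pairs; all names are illustrative):

```python
# Sketch: the universal Kripke modality [R] over a relation given as a set of (s, t) pairs.
def box(R, truth_worlds, s):
    """[R]phi holds at s iff phi holds at every t with s R t.
    `truth_worlds` is the set of worlds where phi holds."""
    return all(t in truth_worlds for (u, t) in R if u == s)

def diamond(R, truth_worlds, s):        # <R>phi := not [R] not phi
    return any(t in truth_worlds for (u, t) in R if u == s)

# The coin scenario as a Kripke model for knowledge: ~ is the total relation on {H, T}.
sim = {("H", "H"), ("H", "T"), ("T", "H"), ("T", "T")}
heads_worlds = {"H"}
print(box(sim, heads_worlds, "T"))      # K heads fails at the real world T -> False
print(diamond(sim, heads_worlds, "T"))  # but <~> heads holds: H is considered possible
```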


Kripke Models for Knowledge and Belief

In a context where we interpret the modality □ϕ as knowledge, we use the notation Kϕ instead, and we denote by ∼ the underlying binary relation R.

When we interpret the modality □ϕ as belief, we use the notation Bϕ instead, and we denote by → the underlying binary relation R.

So a Kripke model for (single-agent) knowledge and belief is of the form (S, ∼, →, ‖.‖, s∗), with K interpreted as the modality [∼] for the epistemic relation ∼, and B as the modality [→] for the doxastic relation →.

Page 70: Multi-Agent Belief Dynamics - Vrije Universiteit BrusselA. Baltag, L.S. Moss and H.P. van Ditmarsch, Epistemic Logic and Information Update. In (Eds) P. Adriaans and J. van Benthem,

NASSLLI 2010 70

Example 3, again: knowledge

The agent’s knowledge in the concealed coin scenario can now be represented as:

(H) ⇄ (T)    (with a loop at each world)

The arrows represent the epistemic relation ∼, which captures the agent’s uncertainty about the state of the world. An arrow from state s to state t means that, if s were the real state, then the agent wouldn’t distinguish it from state t: for all he knows, the real state might be t.


Knowledge properties

The fact that K in this model satisfies our validities (1)-(3) is now reflected in the fact that ∼ is an equivalence relation in this model:

• Veracity (known as axiom T in modal logic), Kϕ ⇒ ϕ, corresponds to the reflexivity of the relation ∼.

• Positive Introspection (known as axiom 4 in modal logic), Kϕ ⇒ KKϕ, corresponds to the transitivity of the relation ∼.

• Negative Introspection (known as axiom 5 in modal logic), ¬Kϕ ⇒ K¬Kϕ, corresponds to the Euclideanness of the relation ∼:

if s ∼ t and s ∼ w then t ∼ w.

In the context of the other two, Euclideanness is equivalent to symmetry:

if s ∼ t then t ∼ s.


Epistemic Models

An epistemic model (or S5-model) is a Kripke model in which all the accessibility relations are equivalence relations, i.e. reflexive, transitive and symmetric (or, equivalently: reflexive, transitive and Euclidean).


S4 Models for weak types of knowledge

But we can see that, in the generalized setting of Kripke models, these properties are NOT automatically satisfied.

So one can use Kripke semantics to interpret weaker notions of “knowledge”, e.g. a type of knowledge that is truthful (factive) and positively introspective, but NOT necessarily negatively introspective.

An S4-model for knowledge is a Kripke model satisfying only reflexivity and transitivity (but not necessarily symmetry or Euclideanness).


Example 3, again: beliefs

The agent’s beliefs after the mistaken update are now representable as:

(H) ← (T)    (plus a loop at H: every doxastic arrow points to H)

In both worlds (i.e. irrespective of which world is the real one), the agent believes that the coin lies Heads up.


Belief properties

The fact that belief in this model satisfies our validities (4)-(6) is now reflected in the fact that the doxastic accessibility relation in the above model has the following properties:

• Consistency of Beliefs (known as axiom D in modal logic), ¬B(ϕ ∧ ¬ϕ), corresponds to the seriality of the relation →:

∀s ∃t such that s → t.

• Positive Introspection for Beliefs (axiom 4), Bϕ ⇒ BBϕ, corresponds to the transitivity of the relation →.

• Negative Introspection for Beliefs (axiom 5), ¬Bϕ ⇒ B¬Bϕ, corresponds to the Euclideanness of the relation →.


Doxastic Models

A doxastic model (or KD45-model) is a Φ-Kripke model satisfying the following properties:

• (D) Seriality: for every s there exists some t such that s → t;

• (4) Transitivity: if s → t and t → w then s → w;

• (5) Euclideanness: if s → t and s → w then t → w.
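These frame conditions are easy to test mechanically. Here is a small sketch of checkers for seriality (D), transitivity (4) and Euclideanness (5), applied to the doxastic relation of the mistaken-update coin example (every arrow points to H); the function names are illustrative.

```python
# Sketch: checking the KD45 frame conditions for a relation R given as a set of pairs.
def serial(R, S):   return all(any((s, t) in R for t in S) for s in S)
def transitive(R):  return all((s, w) in R for (s, t) in R for (u, w) in R if t == u)
def euclidean(R):   return all((t, w) in R for (s, t) in R for (u, w) in R if s == u)

S = {"H", "T"}
arrow = {("H", "H"), ("T", "H")}            # both worlds' doxastic arrows point to H
print(serial(arrow, S), transitive(arrow), euclidean(arrow))   # True True True
```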


Properties connecting Knowledge and Belief

The fact that knowledge and belief in this model satisfy our validities (7)-(9) is now reflected in the fact that the accessibility relations ∼ and → in the above model have the following properties:

• Strong Positive Introspection of Beliefs, Bϕ ⇒ KBϕ, corresponds to:

if s ∼ t and t → w then s → w.

• Strong Negative Introspection of Beliefs, ¬Bϕ ⇒ K¬Bϕ, corresponds to:

if s ∼ t and s → w then t → w.

• Knowledge implies Belief, Kϕ ⇒ Bϕ, corresponds to:

if s → t then s ∼ t.


Epistemic-Doxastic Kripke Models

A Kripke model satisfying all the above conditions on the relations ∼ and → is called an epistemic-doxastic Kripke model.

There are two important observations to be made about these models:

first, they are completely equivalent to our simple, sphere-based epistemic-doxastic models;

second, the epistemic relation is completely determined by the doxastic relation.


Equivalence of Models

EXERCISE: For every epistemic-doxastic model S = (S, S0, ‖.‖, s∗) there exists a doxastic-epistemic Kripke model S′ = (S, ∼, →, ‖.‖, s∗) (having the same set of worlds S, the same valuation ‖.‖ and the same real world s∗), such that the same sentences of doxastic-epistemic logic are true at the real world s∗ in model S as in model S′:

s |=S ϕ iff s |=S′ ϕ,

for every sentence ϕ.

Conversely, for every doxastic-epistemic Kripke model S′ = (S, ∼, →, ‖.‖, s∗) there exists an epistemic-doxastic model S = (S, S0, ‖.‖, s∗) such that, for every sentence ϕ, we have:

s |=S ϕ iff s |=S′ ϕ.
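One direction of this exercise is easy to sketch in code, in the simple single-agent setting of these slides: read off a total epistemic relation and a doxastic arrow from every world into the sphere, and belief comes out the same either way. An illustrative construction (reusing the coin model and holds from the earlier sketches):

```python
# Sketch: turning a sphere-based model into Kripke relations (single-agent case).
def to_kripke_relations(m):
    sim   = {(s, t) for s in m.worlds for t in m.worlds}    # ~ : total relation on S
    arrow = {(s, t) for s in m.worlds for t in m.sphere}    # -> : every world points into S0
    return sim, arrow

sim, arrow = to_kripke_relations(coin)
# B heads via the sphere model vs. via the doxastic arrows: both come out True.
print(holds(coin, coin.actual, ("B", "heads")))
print(all(t in coin.val["heads"] for (s, t) in arrow if s == coin.actual))
```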


Doxastic Relations Uniquely Determine Epistemic Ones

EXERCISE:

Given a doxastic Kripke model (S, →, ‖.‖, s∗) (i.e. one in which → is serial, transitive and Euclidean), there is a unique relation ∼ ⊆ S × S such that (S, ∼, →, ‖.‖, s∗) is a doxastic-epistemic Kripke model.

This means that, to encode an epistemic-doxastic model as a Kripke model, we only need to draw the arrows for the doxastic relation.


Dynamics: updates, again

When some new information ϕ is learned with absolute certainty, we represent the resulting situation by performing an update !ϕ on the original model. This corresponds to simply deleting the non-ϕ worlds from the original sphere-based model S = (S, S0, ‖.‖, s∗): the new set of worlds

‖ϕ‖S = {w ∈ S : w |=S ϕ}

consists of all the worlds that satisfied ϕ in the old model; the real world s∗ is kept the same; the new sphere of beliefs is the restriction

S0 ∩ ‖ϕ‖S = {w ∈ S0 : w |=S ϕ}

of the old sphere to the surviving worlds; and the new valuation is the restriction of the old valuation:

‖p‖ = ‖p‖S ∩ ‖ϕ‖S.
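On the EDModel encoding sketched earlier, this update operation is a few lines of code; truth_set and update are illustrative names, and the assertion mirrors the truthfulness assumption stated just below.

```python
# Sketch: update !phi on a sphere-based model = deleting all the non-phi worlds.
def truth_set(m, phi):
    return {w for w in m.worlds if holds(m, w, phi)}

def update(m, phi):
    keep = truth_set(m, phi)
    assert m.actual in keep, "an update requires phi to be true in the actual world"
    return EDModel(worlds=keep,
                   sphere=m.sphere & keep,   # may become empty; EDModel then rejects the result
                   val={p: ws & keep for p, ws in m.val.items()},
                   actual=m.actual)
```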


Updates on Kripke Models

In terms of Kripke models, the definition of update is the same, except that we have to erase all the doxastic or epistemic arrows that come from or go to a deleted world:

in other words, the new epistemic and doxastic relations are the restrictions of the old ones to the new set of states.

The underlying assumption is that the announced sentence is TRUE in the real world: an update !ϕ can be performed on a model S only if ϕ is true at the real world s∗ in S.

This is because absolute certainty (in our sense) implies truth: to perform an update, ϕ must be “hard information”, beyond any doubt, obtained from an infallible source.


Example

Given the original model in Example 5

[ (1)  (2) ]    (3)  (4)  (5)

suppose no announcement is made by the Teacher. After the passing of Monday (with no exam!), the new model can be obtained by the update !(¬1):

[ (2) ]    (3)  (4)  (5)


Another Day Passes

After the passing of Tuesday (again, with no exam happening), let us try to model the result by performing an update !(¬2) on the previous model:

(3)  (4)  (5)

The sphere of beliefs disappeared! Or, in terms of Kripke models, all the doxastic arrows have been erased!
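Replaying the two updates on an encoding of the Example 5 model shows exactly this failure: the second update would need a model with an empty belief sphere, which the earlier EDModel sketch (rightly) refuses to build.

```python
# Sketch: the two updates of the surprise-exam example, using the EDModel/update sketches.
exam = EDModel(worlds={1, 2, 3, 4, 5},
               sphere={1, 2},                            # believes: exam on Monday or Tuesday
               val={str(d): {d} for d in range(1, 6)},   # atomic sentence "d": exam is on day d
               actual=4)                                 # in fact the exam is on Thursday

after_monday = update(exam, ("not", "1"))                # Monday passes with no exam
print(after_monday.sphere)                               # {2}: now he believes it's Tuesday

try:
    update(after_monday, ("not", "2"))                   # Tuesday passes with no exam...
except AssertionError:
    print("empty sphere: not a legal epistemic-doxastic model, beliefs must be REVISED")
```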


The Problem of Belief Revision

In this last “model”, the sphere of beliefs is empty.

Technically, this goes against our requirement that S0 ≠ ∅. So this is not really a correct epistemic-doxastic model!

Indeed, if we applied our definition of belief to this “model”, it would follow that the agent has come to believe everything: he has inconsistent beliefs!

This goes against our axiom (D) of “Consistency of Beliefs”.

Our student has gone crazy!


The Problem of Belief Revision (continued)

Well, the only solution seems to be to draw a second sphere of beliefs, encoding a kind of doxastic “contingency plan”: when the previous beliefs are shown to be wrong, the agent weakens his belief to the second sphere.

But... what worlds should belong to the second sphere?

What is the appropriate belief of the Student about the exam, after Tuesday has passed?


The Problem of Belief Revision: syntactic version

What happens if I learn a new fact ϕ (e.g. that the exam is neither on Monday nor on Tuesday) that is in contradiction with my old beliefs?

If I accept the fact ϕ, and put it together with the set T of all the sentences I used to believe, the resulting set T ∪ {ϕ} is logically inconsistent.

So I have to give up some of my old beliefs. But which of them?

Maybe all of them?! No, I should rather try to maintain as much as possible of my old beliefs, while still accepting the new fact ϕ (without arriving at a contradiction).


Example

Suppose I believe two facts p and q and (by logical closure) their conjunction p ∧ q. So my belief base is the following:

{p, q, p ∧ q}.

Suppose now that I learn that the last sentence was actually false.

Obviously, I have to revise my belief base, eliminating the sentence p ∧ q and replacing it with its negation: ¬(p ∧ q).


But the base

{p, q, ¬(p ∧ q)}

is inconsistent!

So I have to do more!

Obviously, to accommodate the new fact ¬(p ∧ q), I have to give up either my belief in p or my belief in q.

But which one?
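The dilemma can be made concrete with a brute-force consistency check over the four truth-value assignments: the naively revised base is inconsistent, while dropping either p or q restores consistency; choosing between the two is exactly what a belief-revision policy has to settle. A minimal illustrative sketch:

```python
# Sketch: checking consistency of small propositional belief bases by brute force.
from itertools import product

def consistent(base):
    # base: a list of functions from a valuation {"p": bool, "q": bool} to bool
    return any(all(f(v) for f in base)
               for v in ({"p": a, "q": b} for a, b in product([True, False], repeat=2)))

p       = lambda v: v["p"]
q       = lambda v: v["q"]
not_p_q = lambda v: not (v["p"] and v["q"])    # the newly learned fact ¬(p ∧ q)

print(consistent([p, q, not_p_q]))    # False: the naively revised base is inconsistent
print(consistent([p, not_p_q]))       # True:  giving up q works
print(consistent([q, not_p_q]))       # True:  giving up p works as well
```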