CSM6120 Introduction to Intelligent Systems
Propositional and Predicate Logic 2


Page 1: CSM6120 Introduction to Intelligent Systems

[email protected]

CSM6120 Introduction to Intelligent Systems
Propositional and Predicate Logic 2

Page 2: CSM6120 Introduction to Intelligent Systems

Syntax and semantics

Syntax
Rules for constructing legal sentences in the logic
Which symbols we can use (English: letters, punctuation)
How we are allowed to combine symbols

Semantics
How we interpret (read) sentences in the logic
Assigns a meaning to each sentence

Example: “All lecturers are seven foot tall”
A valid sentence (syntax)
And we can understand the meaning (semantics)
This sentence happens to be false (there is a counterexample)

Page 3: CSM6120 Introduction to Intelligent Systems

Propositional logic

Syntax
Propositions, e.g. “it is wet”
Connectives: and, or, not, implies, iff (equivalent)
Brackets (), T (true) and F (false)

Semantics (classical/Boolean)
Define how connectives affect truth
“P and Q” is true if and only if P is true and Q is true
Use truth tables to work out the truth of statements

Page 4: CSM6120 Introduction to Intelligent Systems

Important concepts
Soundness, completeness, validity (tautologies)
Logical equivalences
Models/interpretations
Entailment
Inference
Clausal forms (CNF, DNF)

Page 5: CSM6120 Introduction to Intelligent Systems

Resolution algorithm
Given a formula in conjunctive normal form, repeat:
Find two clauses with complementary literals
Apply resolution
Add the resulting clause (if not already there)

If the empty clause results, the formula is not satisfiable
It must have been obtained from P and NOT(P)

Otherwise, if we get stuck (and we will eventually), the formula is guaranteed to be satisfiable
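A minimal Python sketch of this loop (not from the slides): clauses are represented here as frozensets of string literals, and a literal is negated by prefixing "~" — both representation choices are assumptions made for the example.

def resolve(c1, c2):
    """Return all resolvents of two clauses (frozensets of string literals)."""
    resolvents = []
    for lit in c1:
        comp = lit[1:] if lit.startswith("~") else "~" + lit
        if comp in c2:
            resolvents.append(frozenset((c1 - {lit}) | (c2 - {comp})))
    return resolvents

def resolution_satisfiable(clauses):
    """Repeatedly resolve pairs of clauses.
    Returns False if the empty clause appears (unsatisfiable),
    True once no new clauses can be added (satisfiable)."""
    clauses = set(clauses)
    while True:
        new = set()
        for a in clauses:
            for b in clauses:
                if a == b:
                    continue
                for r in resolve(a, b):
                    if not r:              # empty clause derived: unsatisfiable
                        return False
                    new.add(r)
        if new <= clauses:                 # stuck: nothing new, so satisfiable
            return True
        clauses |= new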

Page 6: CSM6120 Introduction to Intelligent Systems

Example
Our knowledge base:
1) FriendWetBecauseOfSprinklers
2) NOT(FriendWetBecauseOfSprinklers) OR SprinklersOn

Can we infer SprinklersOn? We add:
3) NOT(SprinklersOn)

From 2) and 3), we get:
4) NOT(FriendWetBecauseOfSprinklers)

From 4) and 1), we get the empty clause, so SprinklersOn is entailed
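Using the sketch above (same "~" convention), this knowledge base plus the negated goal resolves to the empty clause:

kb = [frozenset({"FriendWetBecauseOfSprinklers"}),
      frozenset({"~FriendWetBecauseOfSprinklers", "SprinklersOn"}),
      frozenset({"~SprinklersOn"})]          # clause 3: the negated goal
print(resolution_satisfiable(kb))            # False, so SprinklersOn is entailed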

Page 7: CSM6120 Introduction to Intelligent Systems

Horn clauses
A Horn clause is a CNF clause with at most one positive literal
The positive literal is called the head; the negative literals are the body
Prolog: head :- body1, body2, body3 …
English: “To prove the head, prove body1, …”
Implication: If (body1, body2, …) then head

Horn clauses form the basis of forward and backward chaining
The Prolog language is based on Horn clauses
Modus ponens: complete for Horn KBs
Deciding entailment with Horn clauses is linear in the size of the knowledge base
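As a concrete illustration (our own representation, not the slides'), the same Horn clause can be written three equivalent ways; the (head, body) pair is the form used by the chaining sketches further below:

# Three equivalent views of one Horn clause (names are illustrative):
#   CNF clause:   ~body1 OR ~body2 OR head
#   Prolog:       head :- body1, body2.
#   Implication:  IF body1 AND body2 THEN head
clause = frozenset({"~body1", "~body2", "head"})   # clause form, as used by resolution
rule = ("head", ["body1", "body2"])                # (head, body) pair, as used by chaining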

Page 8: CSM6120 Introduction to Intelligent Systems

Reasoning with Horn clauses
Chaining: simple methods used by most inference engines to produce a line of reasoning

Forward chaining (FC)
For each new piece of data, generate all new facts, until the desired fact is generated
Data-directed reasoning

Backward chaining (BC)
To prove the goal, find a clause that contains the goal as its head, and prove the body recursively
(Backtrack when the wrong clause is chosen)
Goal-directed reasoning

Page 9: CSM6120 Introduction to Intelligent Systems

Forward chaining algorithm
Read the initial facts
Begin
  Filter phase: find the fired rules (those that match the facts)
  While fired rules not empty AND not end DO
    Choice phase: analyse the conflicts, choose the most appropriate rule
    Apply the chosen rule
  End do
End
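A compact Python sketch of this loop for propositional Horn rules, each written as a (head, body) pair as above; for brevity the choice phase is reduced to "fire any applicable rule", so there is no real conflict resolution here.

def forward_chain(rules, facts, goal=None):
    """Fire rules whose bodies are satisfied until nothing new can be added
    (or the optional goal is derived). Returns the final set of facts."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for head, body in rules:
            if head not in facts and all(b in facts for b in body):
                facts.add(head)        # the rule fires: add its conclusion
                changed = True
                if head == goal:
                    return facts
    return facts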

Page 10: CSM6120 Introduction to Intelligent Systems

Forward chaining example (1)
Suppose we have three rules:
R1: If A and B then D (= A ∧ B → D)
R2: If B then C
R3: If C and D then E

If facts A and B are present, we infer D from R1 and infer C from R2
With D and C inferred, we now infer E from R3
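Running the forward-chaining sketch above on these three rules reproduces the inference:

rules = [("D", ["A", "B"]),   # R1
         ("C", ["B"]),        # R2
         ("E", ["C", "D"])]   # R3
print(forward_chain(rules, {"A", "B"}))   # the returned facts include C, D and E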

Page 11: CSM6120 Introduction to Intelligent Systems

Forward chaining example (2)

Rules:
R1: IF hot AND smoky THEN fire
R2: IF alarm-beeps THEN smoky
R3: IF fire THEN switch-sprinkler

Facts:
• alarm-beeps
• hot

First cycle: R2 holds → smoky
Second cycle: R1 holds → fire
Third cycle: R3 holds → switch-sprinkler (action)
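The same sketch reproduces the three cycles:

rules = [("fire", ["hot", "smoky"]),          # R1
         ("smoky", ["alarm-beeps"]),          # R2
         ("switch-sprinkler", ["fire"])]      # R3
facts = forward_chain(rules, {"alarm-beeps", "hot"})
print("switch-sprinkler" in facts)            # True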

Page 12: CSM6120 Introduction to Intelligent Systems

Forward chaining example (3)
Fire any rule whose premises are satisfied in the KB
Add its conclusion to the KB, until the query is found

Page 13: CSM6120 Introduction to Intelligent Systems

Forward chaining AND-OR Graph

Multiple links joined by an arc indicate conjunction – every link must be proved

Multiple links without an arc indicate disjunction – any link can be proved

Page 14: CSM6120 Introduction to Intelligent Systems

Forward chaining
If we want to apply forward chaining to this graph, we first process the known leaves A and B and then allow inference to propagate up the graph as far as possible

Page 15: CSM6120 Introduction to Intelligent Systems

Forward chaining

Page 16: CSM6120 Introduction to Intelligent Systems

Forward chaining

Page 17: CSM6120 Introduction to Intelligent Systems

Forward chaining

Page 18: CSM6120 Introduction to Intelligent Systems

Forward chaining

Page 19: CSM6120 Introduction to Intelligent Systems

Forward chaining

Page 20: CSM6120 Introduction to Intelligent Systems

Forward chaining

Page 21: CSM6120 Introduction to Intelligent Systems

Backward chaining
Idea: work backwards from the query q

To prove q by BC:
Check if q is known already, or
Prove by BC all premises of some rule concluding q (i.e. try to prove each of that rule’s conditions)

Avoid loops: check if a new subgoal is already on the goal stack
Avoid repeated work: check if a new subgoal has already been proved true, or has already failed
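A recursive Python sketch of this procedure over the same (head, body) rules; the stack argument is the goal stack used to avoid loops (caching of already-proved or already-failed subgoals is omitted for brevity):

def backward_chain(rules, facts, goal, stack=frozenset()):
    """Return True if goal is a known fact, or if the body of some rule
    concluding goal can itself be proved recursively."""
    if goal in facts:
        return True
    if goal in stack:                 # subgoal already on the goal stack: avoid looping
        return False
    for head, body in rules:
        if head == goal and all(backward_chain(rules, facts, b, stack | {goal})
                                for b in body):
            return True               # every subgoal of this rule was proved
    return False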

Page 22: CSM6120 Introduction to Intelligent Systems

Backward chaining example
The same three rules:
R1: If A and B then D
R2: If B then C
R3: If C and D then E

If E is our hypothesis (the goal), then by R3 we must establish C and D
R2 reduces the subgoal C to B, and R1 reduces the subgoal D to A and B

Page 23: CSM6120 Introduction to Intelligent Systems

Example

Rules:
R1: IF hot AND smoky THEN fire
R2: IF alarm-beeps THEN smoky
R3: IF fire THEN switch-sprinkler

Hypothesis: Should I switch the sprinklers on?

Use R3: need to prove fire
Use R1: need to prove hot and smoky
Use R2: need to prove alarm-beeps

Evidence (facts):
• alarm-beeps
• hot
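Asking the backward-chaining sketch the same question:

rules = [("fire", ["hot", "smoky"]),          # R1
         ("smoky", ["alarm-beeps"]),          # R2
         ("switch-sprinkler", ["fire"])]      # R3
print(backward_chain(rules, {"alarm-beeps", "hot"}, "switch-sprinkler"))  # True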

Page 24: CSM6120 Introduction to Intelligent Systems

Backward chaining

Page 25: CSM6120 Introduction to Intelligent Systems

Backward chaining

Page 26: CSM6120 Introduction to Intelligent Systems

Backward chaining

Page 27: CSM6120 Introduction to Intelligent Systems

Backward chaining

Page 28: CSM6120 Introduction to Intelligent Systems

Backward chaining

Page 29: CSM6120 Introduction to Intelligent Systems

Backward chaining

Page 30: CSM6120 Introduction to Intelligent Systems

Backward chaining

Page 31: CSM6120 Introduction to Intelligent Systems

Backward chaining

Page 32: CSM6120 Introduction to Intelligent Systems

Backward chaining

Page 33: CSM6120 Introduction to Intelligent Systems

Backward chaining

Page 34: CSM6120 Introduction to Intelligent Systems

Comparison

Backward chaining
From hypotheses to relevant facts
Good when:
• Limited number of (clear) hypotheses
• Determining the truth of facts is expensive
• Large number of possible facts, mostly irrelevant

Forward chaining
From facts to valid conclusions
Good when:
• Less clear hypothesis
• Very large number of possible conclusions
• True facts known at start

Page 35: CSM6120 Introduction to Intelligent Systems

Forward vs. backward chaining
FC is data-driven
Automatic, unconscious processing, e.g. routine decisions
May do lots of work that is irrelevant to the goal

BC is goal-driven
Appropriate for problem solving, e.g. “Where are my keys?”, “How do I start the car?”
The complexity of BC can be much less than linear in the size of the KB

Page 36: CSM6120 Introduction to Intelligent Systems

Application
Wide use in expert systems

Backward chaining: diagnosis systems
Start with a set of hypotheses and try to prove each one, asking additional questions of the user when a fact is unknown

Forward chaining: design/configuration systems
See what can be done with the available components

Page 37: CSM6120 Introduction to Intelligent Systems

Checking models
Sometimes we just want to find any model that satisfies the KB

Propositional satisfiability: determine if an input propositional logic sentence (in CNF) is satisfiable
We use a backtracking search to find a model
DPLL, WalkSAT, etc. – lots of algorithms out there!

This is similar to finding solutions in constraint satisfaction problems
More about CSPs in a later module
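A bare-bones, DPLL-style backtracking search over the clause representation used earlier ("~" marks negation); real DPLL implementations add unit propagation, pure-literal elimination and branching heuristics, all omitted here.

def dpll(clauses, assignment=None):
    """Return a satisfying assignment (dict symbol -> bool) or None."""
    assignment = dict(assignment or {})
    def value(lit):
        sym = lit.lstrip("~")
        if sym not in assignment:
            return None
        v = assignment[sym]
        return (not v) if lit.startswith("~") else v
    # Simplify: drop satisfied clauses, fail on falsified ones
    remaining = []
    for clause in clauses:
        vals = [value(l) for l in clause]
        if True in vals:
            continue
        if all(v is False for v in vals):
            return None                      # clause falsified under this assignment
        remaining.append(clause)
    if not remaining:
        return assignment                    # every clause satisfied: a model
    sym = next(l.lstrip("~") for l in remaining[0] if value(l) is None)
    for choice in (True, False):             # branch on an unassigned symbol
        result = dpll(remaining, {**assignment, sym: choice})
        if result is not None:
            return result
    return None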

Page 38: CSM6120 Introduction to Intelligent Systems

Predicate logic
Predicate logic is an extension of propositional logic that permits concise reasoning about whole classes of entities
Also termed predicate calculus or first-order logic
The language Prolog is built on a subset of this

Propositional logic treats simple propositions (sentences) as atomic entities
In contrast, predicate logic distinguishes the subject of a sentence from its predicate…


Page 39: CSM6120 Introduction to Intelligent Systems

Subjects and predicates
In the sentence “The dog is sleeping”:
The phrase “the dog” denotes the subject – the object or entity that the sentence is about
The phrase “is sleeping” denotes the predicate – a property that is true of the subject

In predicate logic, a predicate is modelled as a function P(·) from objects to propositions, i.e. a function that returns TRUE or FALSE
P(x) = “x is sleeping” (where x is any object), or Is_sleeping(dog)
Tree(a) is true if a = oak, false if a = daffodil
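In code terms, a predicate is just a Boolean-valued function; a toy illustration (the tiny universes below are made up for the example):

def is_sleeping(x):                 # P(x) = "x is sleeping"
    return x in {"dog", "cat"}      # purely illustrative universe

def tree(x):                        # Tree(a)
    return x in {"oak", "ash"}

print(is_sleeping("dog"))    # True  -> the proposition P(dog)
print(tree("daffodil"))      # False -> Tree(daffodil) is false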


Page 40: CSM6120 Introduction to Intelligent Systems

More about predicates
Convention: lowercase variables x, y, z... denote objects/entities; uppercase variables P, Q, R… denote propositional functions (predicates)

Keep in mind that the result of applying a predicate P to an object x is the proposition P(x)
But the predicate P itself (e.g. P = “is sleeping”) is not a proposition (not a complete sentence)
E.g. if P(x) = “x is a prime number”, P(3) is the proposition “3 is a prime number”


Page 41: CSM6120 Introduction to Intelligent Systems

Propositional functions
Predicate logic generalizes the grammatical notion of a predicate to include propositional functions of any number of arguments, each of which may take any grammatical role that a noun can take
E.g. let P(x,y,z) = “x gave y the grade z”; then if x = “Mike”, y = “Mary”, z = “A”,
P(x,y,z) = “Mike gave Mary the grade A”


Page 42: CSM6120 Introduction to Intelligent Systems

Reasoning
KB:
(1) student(S) ∧ studies(S,ai) → studies(S,prolog)
(2) student(T) ∧ studies(T,expsys) → studies(T,ai)
(3) student(joe)
(4) studies(joe,expsys)

(1) and (2) are rules, (3) and (4) are facts

With the information in (3) and (4), rule (2) can fire (but rule (1) can't), by matching (unifying) joe with T
This gives a new piece of knowledge, studies(joe, ai)
With this new knowledge, rule (1) can now fire: joe is unified with S
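A small Python sketch of the matching step, assuming atoms are tuples such as ("studies", "joe", "expsys") and, as in the slides, that uppercase names (S, T) are variables; this is one-way pattern matching against ground facts rather than full unification.

def is_var(term):
    return term[0].isupper()                  # S, T, ... are variables (slide convention)

def match(pattern, fact, subst):
    """Extend subst so that pattern equals fact, or return None if impossible."""
    if len(pattern) != len(fact) or pattern[0] != fact[0]:
        return None                           # different predicate or arity
    subst = dict(subst)
    for p, f in zip(pattern[1:], fact[1:]):
        if is_var(p):
            if subst.setdefault(p, f) != f:
                return None                   # variable already bound to something else
        elif p != f:
            return None
    return subst

facts = {("student", "joe"), ("studies", "joe", "expsys")}

# Rule (2): student(T) ∧ studies(T, expsys) → studies(T, ai)
body = [("student", "T"), ("studies", "T", "expsys")]
head = ("studies", "T", "ai")

subst = {}                                    # greedy matching, no backtracking
for atom in body:
    for fact in facts:
        binding = match(atom, fact, subst)
        if binding is not None:
            subst = binding
            break

print(tuple(subst.get(t, t) for t in head))   # ('studies', 'joe', 'ai')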

Page 43: CSM6120 Introduction to Intelligent Systems

Reasoning
KB:
(1) student(S) ∧ studies(S,ai) → studies(S,prolog)
(2) student(T) ∧ studies(T,expsys) → studies(T,ai)
(3) student(joe)
(4) studies(joe,expsys)

We can apply modus ponens to this twice (FC), to get studies(joe, prolog)
This can then be added to our knowledge base as a new fact

Page 44: CSM6120 Introduction to Intelligent Systems

Clause form
We can express any predicate calculus statement in clause form

This enables us to work with just simple OR (disjunction: ∨) operators, rather than having to deal with implication (→) and AND (∧), thus allowing us to work towards a resolution proof

Page 45: CSM6120 Introduction to Intelligent Systems

Example
Let's put our previous example in clause form:
(1) ¬student(S) ∨ ¬studies(S,ai) ∨ studies(S,prolog)
(2) ¬student(T) ∨ ¬studies(T,expsys) ∨ studies(T,ai)
(3) student(joe)
(4) studies(joe,expsys)

Is there a solution to studies(S, prolog)? = “Is there someone who studies Prolog?”
Negate it... ¬studies(S, prolog)

Page 46: CSM6120 Introduction to Intelligent Systems

Example
(1) ¬student(S) ∨ ¬studies(S,ai) ∨ studies(S,prolog)
(2) ¬student(T) ∨ ¬studies(T,expsys) ∨ studies(T,ai)
(3) student(joe)
(4) studies(joe,expsys)

Start with the negated goal: ¬studies(S, prolog)
Resolve with clause (1): ¬student(S) ∨ ¬studies(S,ai)
Resolve with clause (2) (S = T): ¬student(S) ∨ ¬studies(S,expsys)
Resolve with clause (4) (S = joe): ¬student(joe)
Finally, resolve this with clause (3), and we have nothing left – the empty clause

Page 47: CSM6120 Introduction to Intelligent Systems

Example
These are all the same logically...

(1) student(S) ∧ studies(S,ai) → studies(S,prolog)
(2) student(T) ∧ studies(T,expsys) → studies(T,ai)
(3) student(joe)
(4) studies(joe,expsys)

(1) ¬student(S) ∨ ¬studies(S,ai) ∨ studies(S,prolog)
(2) ¬student(T) ∨ ¬studies(T,expsys) ∨ studies(T,ai)
(3) student(joe)
(4) studies(joe,expsys)

(1) studies(S,prolog) ← student(S) ∧ studies(S,ai)
(2) studies(T,ai) ← student(T) ∧ studies(T,expsys)
(3) student(joe) ←
(4) studies(joe,expsys) ←

Page 48: CSM6120 Introduction to Intelligent Systems

Note the last two...

Joe is a student, and is studying expsys, so these are not dependent on anything – each is simply a statement/fact
So there is nothing to the right of the ←

(3) student(joe) ←
(4) studies(joe,expsys) ←

Page 49: CSM6120 Introduction to Intelligent Systems

Universes of discourse
The power of distinguishing objects from predicates is that it lets you state things about many objects at once

E.g., let P(x) = “x+1 > x”. We can then say “For any number x, P(x) is true” instead of
(0+1>0) ∧ (1+1>1) ∧ (2+1>2) ∧ ...

The collection of values that a variable x can take is called x’s universe of discourse (or simply ‘universe’)


Page 50: CSM6120 Introduction to Intelligent Systems

Quantifier expressions
Quantifiers provide a notation that allows us to quantify (count) how many objects in the universe of discourse satisfy a given predicate

“∀” is the FORALL or universal quantifier
∀x P(x) means that for all x in the universe, P holds

“∃” is the EXISTS or existential quantifier
∃x P(x) means there exists an x in the universe (that is, 1 or more) such that P(x) is true
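Over a finite universe the two quantifiers correspond directly to Python's all() and any(); a toy illustration with a made-up universe:

universe = range(10)                 # a small, purely illustrative universe
P = lambda x: x + 1 > x              # P(x) = "x+1 > x"
Q = lambda x: x % 2 == 0             # Q(x) = "x is even"

print(all(P(x) for x in universe))   # ∀x P(x) -> True
print(all(Q(x) for x in universe))   # ∀x Q(x) -> False
print(any(Q(x) for x in universe))   # ∃x Q(x) -> True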


Page 51: CSM6120 Introduction to Intelligent Systems

The universal quantifier
Example:
Let the universe of x be parking spaces at AU
Let P(x) be the predicate “x is full”
Then the universal quantification of P(x), ∀x P(x), is the proposition:
“All parking spaces at AU are full”
i.e., “Every parking space at AU is full”
i.e., “For each parking space at AU, that space is full”


Page 52: CSM6120 Introduction to Intelligent Systems

The existential quantifier
Example:
Let the universe of x be parking spaces at AU
Let P(x) be the predicate “x is full”
Then the existential quantification of P(x), ∃x P(x), is the proposition:
“Some parking space at AU is full”
“There is a parking space at AU that is full”
“At least one parking space at AU is full”


Page 53: CSM6120 Introduction to Intelligent Systems

Nesting of quantifiers
Example: Let the universe of x & y be people
Let L(x,y) = “x likes y”

Then ∃y L(x,y) = “There is someone whom x likes”
Then ∀x (∃y L(x,y)) = “Everyone has someone whom they like”


Page 54: CSM6120 Introduction to Intelligent Systems

What does this mean?
∀C (owned-by(C,O) ∧ cat(C) → contented(C))

∀P (person(P) ∧ lives-in(P, Wales) → ∃H (harp(H) ∧ plays(P,H)))

Page 55: CSM6120 Introduction to Intelligent Systems

Quantifier exercise
If R(x,y) = “x relies upon y” (x and y are people), express the following in unambiguous English:

∀x(∃y R(x,y)) = Everyone has someone to rely on

∃y(∀x R(x,y)) = There’s a person whom everyone relies upon (including himself)!

∃x(∀y R(x,y)) = There’s some needy person who relies upon everybody (including himself)

∀y(∃x R(x,y)) = Everyone has someone who relies upon them

∀x(∀y R(x,y)) = Everyone relies upon everybody (including themselves)!


Page 56: CSM6120 Introduction to Intelligent Systems

More fun with sentences
“Every dog chases its own tail”
– ∀d, Chases(d, Tail-of(d))
– Alternative statement: ∀d, ∀t, Tail-of(t, d) → Chases(d, t)

“Every dog chases its own (unique) tail”
– ∀d, ∃1 t, Tail-of(t, d) → Chases(d, t)
– ∀d, ∃t, Tail-of(t, d) ∧ Chases(d, t) ∧ [∀t’, Chases(d, t’) → t’ = t]

“Only the wicked flee when no one pursues”
– ∀x, Flees(x) ∧ [¬∃y, Pursues(y, x)] → Wicked(x)

Page 57: CSM6120 Introduction to Intelligent Systems

Propositional vs first-order inference
Inference rules for quantifiers

From ∀x King(x) ∧ Greedy(x) → Evil(x), infer the following sentences:
King(John) ∧ Greedy(John) → Evil(John)
King(Richard) ∧ Greedy(Richard) → Evil(Richard)
King(Father(John)) ∧ Greedy(Father(John)) → Evil(Father(John))
…

Page 58: CSM6120 Introduction to Intelligent Systems

Inference in First-Order Logic
We need to add new inference rules beyond those of propositional logic

Universal Elimination
From ∀x Likes(x, Muse), infer Likes(Liz, Muse) (substitute Liz for x)

Existential Elimination
From ∃x Likes(x, Muse), infer Likes(Person1, Muse) (Person1 does not exist elsewhere in the KB)

Existential Introduction
From Likes(Glenn, Muse), infer ∃x Likes(x, Muse)

Page 59: CSM6120 Introduction to Intelligent Systems

Example of inference rules
“It is illegal for students to copy music”
“Joe is a student”
“Every student copies music”
Is Joe a criminal?

Knowledge base:
(1) ∀x ∀y Student(x) ∧ Music(y) ∧ Copies(x,y) → Criminal(x)
(2) Student(Joe)
(3) ∀x ∃y Student(x) ∧ Music(y) ∧ Copies(x,y)

Page 60: CSM6120 Introduction to Intelligent Systems

Example cont...

From (3), ∀x ∃y Student(x) ∧ Music(y) ∧ Copies(x,y):

∃y Student(Joe) ∧ Music(y) ∧ Copies(Joe, y)   (Universal Elimination)
Student(Joe) ∧ Music(SomeSong) ∧ Copies(Joe, SomeSong)   (Existential Elimination)
Criminal(Joe)   (Modus Ponens, using (1))

Page 61: CSM6120 Introduction to Intelligent Systems

Human reasoning
Analogical
We solved one problem one way – so maybe we can solve another one the same (or similar) way

Nonmonotonic/default
Handling contradictions, retracting previous conclusions

Temporal
Things may change over time

Commonsense
E.g., we know that humans are generally less than 2.2m tall
We have lots of this knowledge – computers don’t!

Inductive
Induce new knowledge from observations

Page 62: CSM6120 Introduction to Intelligent Systems

Beyond true and false
Multi-valued logics
More than two truth values, e.g. true, false & unknown
Fuzzy logic – truth value in [0,1]

Modal logics
Modal operators define a mode for propositions, e.g. necessity, possibility
Epistemic logics (belief)
Temporal logics (time), e.g. always, eventually

Page 63: CSM6120 Introduction to Intelligent Systems

Types of Logic

Language             | What exists                       | Belief of agent
Propositional Logic  | Facts                             | True/False/Unknown
First-Order Logic    | Facts, Objects, Relations         | True/False/Unknown
Temporal Logic       | Facts, Objects, Relations, Times  | True/False/Unknown
Probability Theory   | Facts                             | Degree of belief 0..1
Fuzzy Logic          | Degree of truth                   | Degree of belief 0..1