
Expert Systems

Presented by

Mohammad Saniee
December 2, 2003

Department of Computer Engineering
Sharif University of Technology

Expert Systems

A branch of Artificial Intelligence that makes extensive use of specialized knowledge to solve problems at the level of a human expert.

[Diagram: branches of Artificial Intelligence — Natural Language Understanding, Robotics, Vision, Expert Systems, Neural Networks]

Why do we need Expert Systems?

• Increased availability
• Permanence
• Reduced danger
• Reduced cost
• Multiple expertise
• Increased reliability
• Explanation facility
• Fast response
• Steady, unemotional & complete response
• Intelligent tutor

Expert System Building Process

• Selecting a specific domain
• Scoping the project – the purpose/functionality of the expert system
• Identifying human resources such as the domain expert, knowledge engineer, etc.
• Knowledge acquisition
• Designing the user interface
• Implementing the expert system
• Maintenance and update of the knowledge base/system

Expert System Components

• Working Memory
– A global database of facts used by the system
• Knowledge Base
– Contains the domain knowledge
• Inference Engine
– The brain of the expert system; makes logical deductions based upon the knowledge in the KB
• User Interface
– A facility for the user to interact with the expert system
• Explanation Facility
– Explains the reasoning of the system to the user
• Knowledge Acquisition Facility
– An automatic way to acquire knowledge

Expert System Structure

[Diagram: Knowledge Base, Inference Engine, Working Memory, Explanation Facility, Knowledge Acquisition Facility, User Interface]

Knowledge Types

• The knowledge base of an expert system contains both factual and heuristic knowledge.
– Factual knowledge is knowledge of the task domain that is widely shared, typically found in textbooks or journals, and commonly agreed upon by those knowledgeable in the particular field.
• The capital of Italy is Rome
• A day consists of 24 hours
• Bacteria type A causes flu type B
– Heuristic knowledge is the less rigorous, more experiential, more judgmental knowledge of performance.
• For instance, in a medical expert system: if the patient has spots, it's probably chickenpox
• In a mechanical troubleshooting system: if the engine doesn't turn over, check the battery

Knowledge Representation

• Knowledge representation formalizes and organizes the knowledge. The two most widely used representations are:
– Production rules: A rule consists of an IF part and a THEN part (also called a condition and an action). If the IF part of the rule is satisfied, the THEN part can be concluded, or its problem-solving action taken. Rule-based expert systems use this representation, e.g.,

IF the stain of the organism is gram-negative AND the morphology of the organism is rod AND the aerobicity of the organism is anaerobic
THEN there is strongly suggestive evidence (0.8) that the class of the organism is Enterobacteriaceae.

– Frames or units: A unit is an assemblage of associated symbolic knowledge about the represented entity. Typically, a unit consists of a list of properties of the entity and associated values for those properties.
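A frame can be sketched as a dictionary of property/value pairs with an inheritance link to a parent unit. This is a minimal illustration, not a real frame language; the `vehicle`/`car` units and the `is_a` slot name are hypothetical.

```python
# A hypothetical frame ("unit"): each slot maps to a value, and "is_a"
# links to a parent frame from which unlisted slots are inherited.
vehicle = {"is_a": None, "wheels": 4, "powered": True}
car = {"is_a": vehicle, "wheels": 4, "doors": 2}

def get_slot(frame, slot):
    """Look up a slot, falling back to the parent frame (inheritance)."""
    while frame is not None:
        if slot in frame:
            return frame[slot]
        frame = frame.get("is_a")
    return None

print(get_slot(car, "doors"))    # 2 (stored on the car unit itself)
print(get_slot(car, "powered"))  # True (inherited from vehicle)
```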

Rule-Based Expert Systems

Expert systems that represent domain knowledge using production rules. Two types of rule-based systems:

– Forward chaining systems

– Backward chaining systems

Forward Chaining Systems

Forward chaining systems support chaining of IF-THEN rules to form a line of reasoning. The chaining starts from a set of conditions and moves toward a conclusion.

Question: Does employee John get a computer?
Rule: If John is an employee, he gets a computer.
Fact: John is an employee.
Conclusion: John gets a computer.

Forward Chaining

• The rules are of the form:
left hand side (LHS) ==> right hand side (RHS)
• The execution cycle is:
– Select a rule whose left hand side conditions match the current state as stored in the working storage.
– Execute the right hand side of that rule, thus changing the current state.
– Repeat until no rules apply.
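The execution cycle can be sketched as a small loop. This is a toy sketch, not a production engine: a rule is a hypothetical (condition, action) pair of functions over a fact set, each rule fires at most once, and conflict resolution is simply "first match wins".

```python
# Minimal forward-chaining loop over a working memory of string facts.
def forward_chain(rules, facts):
    facts = set(facts)
    fired = set()                      # loop guard: fire each rule once
    while True:
        for i, (cond, action) in enumerate(rules):
            if i not in fired and cond(facts):
                facts = action(facts)  # execute the RHS, changing the state
                fired.add(i)
                break                  # re-scan from the top (first match wins)
        else:
            return facts               # no rule applies: stop

# Hypothetical rule: IF employee(john) THEN ADD computer(john)
rules = [(lambda f: "employee(john)" in f,
          lambda f: f | {"computer(john)"})]
print(forward_chain(rules, {"employee(john)"}))  # both facts now present
```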

Forward Chaining

• Facts are represented in a working memory which is continually updated.
• Rules represent possible actions to take when specified conditions hold on items in the working memory.
• The conditions are usually patterns that must match items in the working memory, while the actions usually involve adding or deleting items from the working memory.

Forward Chaining (example)

First we'll look at a very simple set of rules:

1. IF (lecturing X) AND (marking-practicals X)
   THEN ADD (overworked X)
2. IF (month February)
   THEN ADD (lecturing Alison)
3. IF (month February)
   THEN ADD (marking-practicals Alison)
4. IF (overworked X) OR (slept-badly X)
   THEN ADD (bad-mood X)
5. IF (bad-mood X)
   THEN DELETE (happy X)
6. IF (lecturing X)
   THEN DELETE (researching X)

Here we use capital letters to indicate variables.

• Initial working memory:
(month February) (happy Alison) (researching Alison)
• Rules 2 & 3 apply; let's assume rule 2 is chosen:
(lecturing Alison) (month February) (happy Alison) (researching Alison)
• Rules 3 & 6 apply; assume rule 3 is chosen. This cycle continues and we end up with:
(bad-mood Alison) (overworked Alison) (marking-practicals Alison) (lecturing Alison) (month February)
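The trace above can be reproduced with a short script. For brevity the variable X is pre-bound to Alison (a real engine would do pattern matching), each rule is a (conditions, additions, deletions) triple, rule 4's OR is reduced to the branch used here, and conflict resolution is "first rule in numeric order".

```python
# The Alison rules, hard-coded with X = Alison.
rules = [
    ({"lecturing Alison", "marking-practicals Alison"}, {"overworked Alison"}, set()),  # 1
    ({"month February"}, {"lecturing Alison"}, set()),                                  # 2
    ({"month February"}, {"marking-practicals Alison"}, set()),                         # 3
    ({"overworked Alison"}, {"bad-mood Alison"}, set()),                                # 4 (OR branch)
    ({"bad-mood Alison"}, set(), {"happy Alison"}),                                     # 5
    ({"lecturing Alison"}, set(), {"researching Alison"}),                              # 6
]

facts = {"month February", "happy Alison", "researching Alison"}
fired = set()
while True:
    applicable = [i for i, (conds, adds, dels) in enumerate(rules)
                  if i not in fired and conds <= facts]
    if not applicable:
        break
    i = applicable[0]                 # conflict resolution: lowest-numbered rule
    conds, adds, dels = rules[i]
    facts = (facts | adds) - dels
    fired.add(i)

print(sorted(facts))
# ['bad-mood Alison', 'lecturing Alison', 'marking-practicals Alison',
#  'month February', 'overworked Alison']
```

The final working memory matches the slide's trace: Alison is overworked and in a bad mood, and the (happy Alison) and (researching Alison) facts have been deleted.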

Example of a Forward Chaining System

• XCON
– Developed by DEC to configure computers.
– Starts with the data about the customer order and works forward toward a configuration based on that data.
– Written in the OPS5 (forward chaining rule based) language.

Backward Chaining Systems

If the conclusion is known (the goal to be achieved) but the path to that conclusion is not known, then reasoning backwards is called for, and the method is backward chaining.

• The consequent part of a rule specifies combinations of facts (goals) to be matched against working memory.
• The condition part of the rule is then used as a set of further sub-goals to be proven/satisfied.

Backward Chaining example

Question: Does employee John get a computer?

Statement: John gets a computer.

Rule: If an employee is a programmer, then he gets a computer.

Backward chaining:

Check the rule base to see what has to be "true" for John to get a computer: he must be a programmer. Is it a fact that John is a programmer? If true, then he gets a computer.

Backward Chaining

• Start with a goal state.

• The system will first check if the goal matches the initial facts given. If it does, the goal succeeds. If it doesn't, the system looks for rules whose conclusions match the goal.

• One such rule will be chosen, and the system will then try to prove any facts in the preconditions of the rule using the same procedure, setting these as new goals to prove.

• The system needs to keep track of what goals it must prove for its main hypothesis.

Backward Chaining (example)Backward Chaining (example)

1.1. IF (lecturing X) IF (lecturing X) AND (marking- practicals X) AND (marking- practicals X) THEN (overworked X) THEN (overworked X)

2.2. IF (month February) IF (month February) THEN (lecturing Alison) THEN (lecturing Alison)

3.3. IF (month February) IF (month February) THEN (marking- practicals Alison) THEN (marking- practicals Alison)

4.4. IF (overworked X) IF (overworked X) THEN (bad-mood X) THEN (bad-mood X)

5.5. IF (slept-badly X) IF (slept-badly X) THEN (bad-mood X) THEN (bad-mood X)

6.6. IF (month February) IF (month February) THEN (weather cold) THEN (weather cold)

7.7. IF (year 1993) IF (year 1993) THEN (economy bad )THEN (economy bad )

• initial facts: initial facts: (month February) (month February) (year 1993) (year 1993)

• Goal that has to be proved: Goal that has to be proved: (bad-mood Alison)(bad-mood Alison) • The goal is not satisfied by initial The goal is not satisfied by initial

facts.facts.• Rules 4 & 5 apply. Assume 4 chosenRules 4 & 5 apply. Assume 4 chosen• New Goal( overworked Alison)New Goal( overworked Alison)• Rule 1 appliesRule 1 applies• New Goal (lecturing Alison)New Goal (lecturing Alison)
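The goal reduction above can be sketched as a recursive prover. As before, X is pre-bound to Alison (no general unification); each rule is a (premises, conclusion) pair, and a goal succeeds if it is an initial fact or some rule concluding it has provable premises.

```python
# The rules above, with X = Alison.
rules = [
    ({"lecturing Alison", "marking-practicals Alison"}, "overworked Alison"),  # 1
    ({"month February"}, "lecturing Alison"),                                  # 2
    ({"month February"}, "marking-practicals Alison"),                         # 3
    ({"overworked Alison"}, "bad-mood Alison"),                                # 4
    ({"slept-badly Alison"}, "bad-mood Alison"),                               # 5
    ({"month February"}, "weather cold"),                                      # 6
    ({"year 1993"}, "economy bad"),                                            # 7
]
facts = {"month February", "year 1993"}

def prove(goal, seen=frozenset()):
    if goal in facts:
        return True
    if goal in seen:                   # guard against circular goal chains
        return False
    return any(all(prove(p, seen | {goal}) for p in premises)
               for premises, conclusion in rules
               if conclusion == goal)

print(prove("bad-mood Alison"))   # True, via rule 4, then 1, then 2 and 3
```

Rule 5's branch fails (nothing establishes slept-badly), so the prover backtracks into rule 4, exactly as in the hand trace.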

Conflict Resolution (I)

• Conflict resolution is the method used when more than one rule matches the facts asserted. There are several approaches:
– First in, first served
• Fire the first rule that matches the content of the working memory or the facts asserted.
– Last in, first served
• The rule applied will be the last rule that is matched.
– Prioritization
• The rule to apply is selected based on priorities set on rules, with priority information usually provided by an expert or knowledge engineer.

Conflict Resolution (II)

• Specificity - The rule applied is usually the most specific rule, or the rule that matches the most facts.

• Recency - The rule applied is the rule that matches the most recently derived facts.

• Fired rules - Involves not applying rules that have already been used.
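Three of these strategies can be sketched as selection functions over the current set of matches. The representation is hypothetical: each match records the rule number and the facts it matched, facts carry an integer "derived at" timestamp, and priorities come from a dict supplied by the knowledge engineer.

```python
def by_specificity(matches):
    # Most specific rule: the one matching the most facts.
    return max(matches, key=lambda m: len(m["facts"]))

def by_recency(matches, timestamp):
    # Prefer the rule whose matched facts were derived most recently.
    return max(matches, key=lambda m: max(timestamp[f] for f in m["facts"]))

def by_priority(matches, priority):
    # Expert-assigned rule priorities; highest wins.
    return max(matches, key=lambda m: priority[m["rule"]])

matches = [{"rule": 2, "facts": ["month February"]},
           {"rule": 1, "facts": ["lecturing Alison", "marking-practicals Alison"]}]
timestamp = {"month February": 0, "lecturing Alison": 1,
             "marking-practicals Alison": 2}

print(by_specificity(matches)["rule"])         # 1 (matches two facts)
print(by_recency(matches, timestamp)["rule"])  # 1 (its facts are newest)
print(by_priority(matches, {1: 5, 2: 9})["rule"])  # 2 (higher priority)
```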

Conflict Resolution (example)

First we'll look at a very simple set of rules:

1. IF (lecturing X) AND (marking-practicals X)
   THEN ADD (overworked X)
2. IF (month February)
   THEN ADD (lecturing Alison)
3. IF (month February)
   THEN ADD (marking-practicals Alison)
4. IF (overworked X) OR (slept-badly X)
   THEN ADD (bad-mood X)
5. IF (bad-mood X)
   THEN DELETE (happy X)
6. IF (lecturing X)
   THEN DELETE (researching X)
7. IF (marking-practicals X)
   THEN ADD (needs-rest X)

Here we use capital letters to indicate variables.

• Working memory:
(month February) (researching Alison) (overworked Alison)

• First in, first served: apply rule 2.
• Last in, first served: apply rule 3, giving

(month February) (researching Alison) (overworked Alison) (marking-practicals Alison)

• Recency: apply the rule that matches the most recent fact, i.e., rule 7.

• Fired rules: don't fire the same rule again.

• Specificity: if we had two rules but one of them matched more facts, we'd choose that rule.

• Prioritization: if we add priorities to these rules, then the higher-priority rule will be fired.

Uncertainty

• The expert system must deal with the uncertainty that comes from the individual rules, conflict resolution, and incompatibilities among the rules. Certainty factors can be assigned to the rules, as in the case of MYCIN.

Uncertainty in MYCIN

• Rules contain certainty factors (CFs).

– They make inexact inferences on a confidence scale of -1.0 to 1.0.
– 1.0 represents complete confidence that something is true.
– -1.0 represents complete confidence that it is false.
– The CFs are measurements of the association between the premise and action clauses of each rule.

When a production rule succeeds because its premise clauses are true in the current context, the CFs of the component clauses, which indicate how strongly each clause is believed, are combined, and the resulting CF is used to modify the CF specified in the action clause.
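This combination can be sketched numerically (a simplified rendering of the usual MYCIN-style formulas: a rule's premise CF is the minimum over its clauses, the conclusion CF scales the rule's own CF by that minimum, and two pieces of evidence for the same hypothesis are merged with the standard combining function). The 0.8 rule CF mirrors the Enterobacteriaceae rule earlier; the clause CFs are hypothetical.

```python
def conclude(clause_cfs, rule_cf):
    # Premise CF = min over clauses; conclusion CF = premise CF * rule CF.
    return min(clause_cfs) * rule_cf

def combine(cf1, cf2):
    # Merge two CFs supporting the same hypothesis.
    if cf1 >= 0 and cf2 >= 0:
        return cf1 + cf2 * (1 - cf1)
    if cf1 < 0 and cf2 < 0:
        return cf1 + cf2 * (1 + cf1)
    return (cf1 + cf2) / (1 - min(abs(cf1), abs(cf2)))

# Premise clauses believed at 0.9 and 0.8, rule CF 0.8:
cf = conclude([0.9, 0.8], 0.8)
print(round(cf, 2))                  # 0.64
# A second rule concludes the same hypothesis with CF 0.5:
print(round(combine(cf, 0.5), 2))    # 0.82: evidence accumulates toward 1.0
```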

Explanation Facilities

• Explain the reasoning process used to arrive at a conclusion.

– Provide the user with a means of understanding the system's behavior.

– This is important because a consultation with a human expert will often require some explanation.

– Many people would not accept the answers of an expert without some form of justification.

– E.g., a medical expert providing a diagnosis/treatment for a patient is expected to explain the reasoning behind his/her conclusions: the uncertain nature of this type of decision may demand a detailed explanation so that the patient concerned is aware of any risks, alternative treatments, etc.

Expert System Tools (I)

• PROLOG
– A programming language that uses backward chaining.
• ART (Inference Corporation)
– In 1984, Inference Corporation developed the Automated Reasoning Tool (ART), a forward chaining system.
• CLIPS
– NASA took the forward chaining capabilities and syntax of ART and introduced the "C Language Integrated Production System" (i.e., CLIPS) into the public domain.
• ART-IM (Inference Corporation)
– Following the distribution of NASA's CLIPS, Inference Corporation implemented a forward-chaining-only derivative of ART/CLIPS called ART-IM.
• OPS5 (Carnegie Mellon University)
– The first AI language used for a production system (XCON).
• Eclipse (The Haley Enterprise, Inc.)
– Eclipse is the only C/C++ inference engine that supports both forward and backward chaining.

Expert System Tools (II)

• Expert System Shells
– Provide mechanisms for knowledge representation, reasoning, and explanation, e.g. EMYCIN.

• Knowledge Acquisition Tools
– Programs that interact with experts to extract domain knowledge. They support inputting knowledge and maintaining knowledge base consistency and completeness. E.g., MOLE, SALT.

Expert System Examples

• MYCIN (1972-80)
– An interactive program that diagnoses certain infectious diseases, prescribes antimicrobial therapy, and can explain its reasoning in detail.

• PROSPECTOR
– Provides advice on mineral exploration.

• XCON
– Configures VAX computers.

• DENDRAL (1965-83)
– A rule-based expert system that analyzes molecular structure. Using a plan-generate-test search paradigm and data from mass spectrometry and other sources, DENDRAL proposes plausible candidate structures for new or unknown chemical compounds.

LIMITATIONS

• NARROW DOMAIN
• LIMITED FOCUS
• INABILITY TO LEARN
• MAINTENANCE PROBLEMS
• DEVELOPMENTAL COST