
Page 1: Representing Epistemic Uncertainty by means of Dialectical Argumentation

Peter McBurney and Simon Parsons

Agent Applications, Research and Technology (Agent ART) Group
Department of Computer Science
University of Liverpool, Liverpool UK
{p.j.mcburney,s.d.parsons}@csc.liv.ac.uk

Presentation to: Department of Computer Science, University of Liverpool

6 February 2001

Page 2: Nature of the problem

Problem: Assessing the health risks of new chemicals and technologies.

Classical decision theory methods require:
• Explicit delineation of all outcomes
• Quantification of uncertainties and consequences.

But for most domains:
• Scientific knowledge is often limited (especially at the outset)
• Experimental evidence is ambiguous and conflicting
• There is no agreement on quantification.

Page 3: Types of evidence for chemical carcinogenicity

• Chemical structure comparison
• Mutagenic tests on tissue cultures
• Animal bioassays
• Human epidemiological studies
• Explication of biomedical causal pathways.

These different sources of evidence may conflict.
• E.g. formaldehyde.

Page 4: Risk Assessment for chemical X

[Diagram] Three linked questions:
• Are there adverse health effects from exposure to chemical X?
• What is the likelihood and size of impact?
• What should be done about chemical X?

Page 5: Argumentation to represent uncertainty

Two meanings of "argument":
• A case for a claim (a tentative proof)
• A debate between people about a claim.

Our degree of certainty in a claim depends on the cases for and against it.
• The more and stronger the cases against, the less the certainty.
• A consensus in favour of a claim indicates the greatest certainty.

We can therefore represent uncertainty by means of dialectical argumentation.

We also require a mechanism for generating inferences from the dialectical status of a claim.

Page 6: Philosophical underpinning

We have adopted an explicit philosophy of science:
• Pera's (1994) model of science as a 3-person game:
  – The Experimenter + Nature + The Scientific Community.
• Feyerabend's (1971) philosophy of science as epistemological anarchism:
  – There are no absolute standards which distinguish science from non-science
  – Standards differ by time, by discipline and by context.

We see two principles as necessary for an activity to be called "science":
• All claims are contestable by anyone (in the community)
• All claims are defeasible, with reasoning always to the best explanation.

Page 7: Pera's Philosophy of Science

[Diagram] Science as a three-person game:
• The Experimenter proposes and undertakes an experiment
• Nature responds to the experiment
• The Scientific Community interprets the results of the experiment.

Page 8: To model these, we need:

A theory of rational discourse between reasonable, consenting participants:
• Hitchcock's (1991) principles of rational mutual inquiry
• The discourse ethics of Habermas and Alexy (1978).

A model for an argument:
• Toulmin's (1958) argument schema.

A means to formalize complex dialogues:
• Walton and Krabbe's (1995) characterization of different types of dialogues
• Formal dialogue-games of Hamblin (1970, 1971) and MacKenzie (1979, 1990).

Page 9: Hitchcock's Principles

18 principles of rational mutual discourse, for example:
• Dialectification: the content and methods of dialogue should be decided by the participants.
• Mutuality: no statement becomes a commitment of a participant unless he or she specifically accepts it.
• Orderliness: one issue is raised and discussed at a time.
• Logical pluralism: both deductive and non-deductive inference is permitted.
• Rule-consistency: there should be no situation where the rules prohibit all acts, including the null act.
• Realism: the rules must make agreement between participants possible.
• Retraceability: participants must be free at all times to supplement, change or withdraw previous tentative commitments.
• Role reversibility: the rules should permit the responsibility for initiating suggestions to shift between participants.

Page 10: Alexy's Discourse Rules

Rules for discourse over moral and ethical questions, for example:
• Freedom of assembly
• Common language
• Freedom of speech
• Freedom to challenge claims
• Arguments required for claims
• Freedom to challenge arguments
• Freedom to disagree over modalities
• Requirement for clarification and precization
• Proportionate defence
• No self-contradictions permitted.

Page 11: Toulmin's Argument Schema

[Diagram] Example instantiation of the schema (a data-structure sketch follows below):
• Data: X is a chemical of type T
• Warrant: Most other Type T chemicals are carcinogenic to humans
• Backing: Epidemiological evidence for others
• Modality: probably
• Claim: X is carcinogenic to humans
• Rebuttal: X is not carcinogenic to rats
• Undercut (Pollock): Epidemiological evidence not unambiguous.
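A minimal sketch of the schema above as a data structure, in Python. This is our own illustration, not the authors' implementation; all names are hypothetical.

from dataclasses import dataclass, field
from typing import List

@dataclass
class ToulminArgument:
    data: str          # grounds, e.g. "X is a chemical of type T"
    warrant: str       # rule licensing the step from data to claim
    claim: str         # conclusion, e.g. "X is carcinogenic to humans"
    modality: str      # qualifier, e.g. "probably"
    backing: str = ""  # support for the warrant
    rebuttals: List[str] = field(default_factory=list)  # attacks on the claim
    undercuts: List[str] = field(default_factory=list)  # attacks on the warrant or backing

example = ToulminArgument(
    data="X is a chemical of type T",
    warrant="Most other Type T chemicals are carcinogenic to humans",
    claim="X is carcinogenic to humans",
    modality="probably",
    backing="Epidemiological evidence for others",
    rebuttals=["X is not carcinogenic to rats"],
    undercuts=["Epidemiological evidence not unambiguous"],
)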

Page 12: Walton and Krabbe's Typology of Dialogues

• Information-seeking dialogues: one participant seeks the answer to a question.
• Inquiries: all participants collaborate to find the answer to a question.
• Persuasions: one participant seeks to persuade other(s) of the truth of a proposition.
• Negotiations: participants seek to divide a scarce resource.
• Deliberations: participants collaborate to decide a course of action in some situation.
• Eristic dialogues: participants quarrel verbally as a substitute for physical fighting.

Page 13: Risk Assessment for chemical X

[Diagram] The same three questions, now labelled by dialogue type:
• Scientific dialogues: Are there adverse health effects from exposure to chemical X? What is the likelihood and size of impact?
• Regulatory dialogue: What should be done about chemical X?

Page 14: Risk Assessment Dialogues

Scientific dialogues:
• Does exposure (in a certain way at certain dose levels) to chemical X lead to adverse health effects? If so, what is the likelihood and magnitude of impact?
• A mix of:
  – Inquiries
  – Persuasion dialogues.

A regulatory dialogue:
• What regulatory actions (if any) should be taken regarding chemical X?
• A mix of:
  – Inquiries
  – Deliberations
  – Negotiations
  – Persuasion dialogues.

Page 15: Dialogue Games

Games between 2+ players where each "moves" by uttering a locution.
• Developed by philosophers to study fallacious reasoning.
• Used in: agent dialogues (Parsons & Amgoud), software development (Stathis), modeling legal reasoning (Bench-Capon et al., Prakken).

Rules define the circumstances of (see the sketch below):
• Commencement of the dialogue
• Permitted locutions
• Combinations of locutions
  – e.g. a participant cannot assert a proposition and its negation
• Commitment
  – When does a player commit to some claim?
• Termination of the dialogue.
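A minimal sketch of how these five kinds of rules might be encoded, given purely as our own illustration (not the authors' specification); all class and method names are hypothetical.

class DialogueGame:
    """Skeleton of a dialogue game; each method corresponds to one rule type above."""
    LOCUTIONS = {"propose", "assert", "show_arg", "query", "contest", "accept"}

    def __init__(self):
        self.moves = []        # history of (speaker, locution, content) triples
        self.commitments = {}  # speaker -> set of claims the speaker is committed to

    def commencement_allowed(self, locution):
        # Commencement rule: e.g. the dialogue opens with a proposal or assertion.
        return not self.moves and locution in {"propose", "assert"}

    def locution_permitted(self, locution):
        # Permitted-locution rule.
        return locution in self.LOCUTIONS

    def combination_permitted(self, speaker, locution, content):
        # Combination rule: e.g. a speaker may not assert a claim whose negation
        # is already among his or her commitments.
        if locution == "assert":
            return ("not " + content) not in self.commitments.get(speaker, set())
        return True

    def move(self, speaker, locution, content):
        if not self.locution_permitted(locution):
            raise ValueError("locution not permitted")
        if not self.combination_permitted(speaker, locution, content):
            raise ValueError("combination of locutions not permitted")
        self.moves.append((speaker, locution, content))
        # Commitment rule: asserting or accepting a claim commits the speaker to it.
        if locution in {"assert", "accept"}:
            self.commitments.setdefault(speaker, set()).add(content)

    def terminated(self, claim):
        # Termination rule (placeholder condition): the dialogue ends once every
        # participant who has spoken is committed to the claim under discussion.
        return bool(self.commitments) and all(claim in c for c in self.commitments.values())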

Page 16: The Risk Agora

A formal framework for representing dialogues concerning the carcinogenic risk of chemicals.
• Represent the arguments for and against a chemical being a carcinogen.
• Represent the current state of scientific knowledge, including epistemic uncertainty.
• Enable contestation and defence of claims and arguments.
• Enable comparison and synthesis of arguments for specific claims.
• Enable summary "snapshots" of the debate at any time.

We have fully specified the locutions and rules for a dialogue-game for scientific discourses.

Page 17: Speaking in the Agora

Participants can:
• Propose or assert claims, arguments, grounds, inference-rules, consequences
• Modify each with modalities
• Question or contest others' proposals or assertions
• Accept others' proposals or assertions.

Examples of locutions (see the sketch below):
• propose ( participant 1: (claim, modality) )
• assert ( participant 1: (claim, modality) )
• show_arg ( participant 1: (arg_for_claim, modalities) )
• contest ( participant 2: propose ( participant 1: (claim, modality) ) )
• etc.

Page 18: Representing uncertainty in the Agora

We represent the degree of uncertainty in a claim by means of its dialectical argument status in the Agora.

We use a dictionary of labels due to Krause, Fox et al. (1998).
• We have modified the definitions slightly to allow for counter-counter-arguments.
• This is one example; other modality dictionaries could be defined.

A claim is (see the sketch after this list):
• Open - no arguments presented yet for it or against it.
• Supported - at least one grounded argument presented for it.
• Plausible - at least one consistent, grounded argument presented for it.
• Probable - at least one consistent, grounded argument presented and no rebuttals or undercuts presented.
• Accepted - at least one consistent, grounded argument presented for it and any rebuttals or undercuts have been attacked with counter-arguments.
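A minimal sketch of the label assignment, following the dictionary above; the argument representation is our own hypothetical encoding, not the authors' formalism.

def claim_label(args_for, attacks, countered_attacks):
    """Strongest label for a claim, given the dialectical state so far.
    args_for: arguments presented for the claim, each a dict with boolean
              'grounded' and 'consistent' flags.
    attacks:  rebuttals and undercuts presented against those arguments.
    countered_attacks: the subset of attacks that have themselves been attacked."""
    if not any(a["grounded"] for a in args_for):
        return "Open"        # no grounded argument presented yet
    if not any(a["grounded"] and a["consistent"] for a in args_for):
        return "Supported"
    # At least one consistent, grounded argument has been presented.
    if all(att in countered_attacks for att in attacks):
        # Vacuously true when no rebuttals or undercuts have been presented,
        # which is how the example on Page 27 reaches "Accepted" after a
        # single unchallenged argument.
        return "Accepted"
    # ("Probable" - no rebuttals or undercuts presented - is subsumed by the
    #  vacuous "Accepted" case above in this reading.)
    return "Plausible"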

Page 19: Debating experimental tests of claims

We also permit debate on:
• The validity of experiments to test scientific claims.
• The results of valid experiments.

An experimental test of a claim is:
• Open - no evidence either way.
• Invalid test - the scientific experiment is not accepted by the participants as a valid test of the claim.
• Inconclusive test - the test is accepted as valid, but the results are not accepted as statistically significant support for the claim or against it.
• Disconfirming instance - the test is accepted as evidence against the claim.
• Confirming instance - the test is accepted as evidence for the claim.

Page 20: Experimental status of claims

Claims are then assigned labels according to the extent that debate in the Agora accepts experimental evidence for and against them. A claim is:
• Untested
• Inconclusive
• Refuted
• Confirmed.

Experimental evidence in favour of a claim can be presented as an argument for the claim.
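The slides list these labels but not the exact mapping from test outcomes (Page 19) to claim status; the rule below is only one plausible reading, given as an illustration rather than the authors' definition.

def experimental_status(test_labels):
    """test_labels: Page-19 labels of the experimental tests debated for a claim."""
    valid = [t for t in test_labels if t not in {"Open", "Invalid test"}]
    decisive = [t for t in valid if t in {"Confirming instance", "Disconfirming instance"}]
    if not valid:
        return "Untested"
    if not decisive:
        return "Inconclusive"
    if all(t == "Confirming instance" for t in decisive):
        return "Confirmed"
    if all(t == "Disconfirming instance" for t in decisive):
        return "Refuted"
    return "Inconclusive"   # mixed confirming and disconfirming evidence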

Page 21: Inference from the Agora

We define a claim as (defeasibly) true at time t
• if and only if it is Accepted in the Agora at time t.
• Otherwise, it is not (defeasibly) true at time t.

This notion of "truth" depends on the opinions of the participants in the Agora, which may change over time.
• As more evidence is obtained and further arguments are presented to the Agora, the truth status of a claim may change.
• Such changes may be non-monotonic.
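Continuing the sketch from Page 18, defeasible truth at time t is then just a label check (hypothetical code, not the authors' formalism):

def defeasibly_true(args_for, attacks, countered_attacks):
    # A claim is (defeasibly) true at time t iff it is currently labelled "Accepted".
    return claim_label(args_for, attacks, countered_attacks) == "Accepted"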

Page 22: Formal properties of the Agora

The Agora dialogue-game rules comply with:
• Alexy's discourse rules
• 15 of Hitchcock's 18 principles.

Acceptability of claims is a game-theoretic semantics (Hintikka 1968):
• "Truth" of a proposition depends on a participant in the Agora having a strategy to defeat any opponent in the dialogue-game associated with the proposition.

Inference from finite snapshots to the long run is well-founded:
• We can place probabilistic bounds on the possibility of errors of inference from finite snapshots to values at infinity.
• This is analogous to the Neyman-Pearson (1928) theory of statistical inference.

Page 23: Inference from snapshots to infinite status

[Diagram] The status of claim P in an Agora debate fluctuates over time (e.g. Open, Probable, Plausible, Accepted); snapshots are taken at particular times along the debate. (With apologies to Jackson Pollock.)

Page 24: Theorem: Stability of labels in the absence of new information

Let P be a claim. Suppose that:
• A(P) is a consistent argument for P such that all rebuttals and undercuts against A(P) are themselves attacked by other arguments,
• All arguments pertaining to P using the initial information and inference rules are eventually articulated by participants within the Agora, and
• No new information concerning P is received by participants following commencement.

Then: the uncertainty label for P converges to "Accepted" as time goes to infinity.
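In symbols (our own transcription, writing $L_t(P)$ for the uncertainty label of $P$ at time $t$): under the three suppositions, the label is eventually constant,
\[
\exists\, t_0 \;\; \forall\, t \ge t_0 : \; L_t(P) = \text{Accepted},
\qquad\text{i.e.}\qquad
\lim_{t \to \infty} L_t(P) = \text{Accepted}.
\]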

Page 25: Key Theorem: Probability of inference errors is bounded

Consider a claim P. Suppose that:
• The uncertainty label for P converges to a limit at infinity,
• A snapshot is taken at a time t after all relevant arguments related to P have been presented,
• The uncertainty label of P at time t is "Accepted", and
• The probability of new information relevant to P arising after time t is less than ε, for some 0 < ε < 1.

Then: the probability that the uncertainty label for claim P at infinity is also "Accepted" is at least 1 - ε.
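In the same notation as on Page 24 (our transcription): if $L_t(P) = \text{Accepted}$ and the probability of new information relevant to $P$ arriving after the snapshot time $t$ is below $\epsilon$, then
\[
\Pr\Big[\, \lim_{s \to \infty} L_s(P) = \text{Accepted} \,\Big] \;\ge\; 1 - \epsilon .
\]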

Page 26: Example

Assumptions:
• K1: The chemical X is produced by the human body naturally (it is endogenous).
• K2: X is endogenous in rats.
• K3: An endogenous chemical is not carcinogenic.
• K4: Bioassays of X on rats show significant carcinogenic effects.

Rules of inference:
• R1 (And Introduction): From P and Q, infer (P & Q).
• R2 (Modus Ponens): From P and (P implies Q), infer Q.
• R3: If a chemical is carcinogenic in an animal species, infer that it is also carcinogenic in humans.

Page 27: Example (cont.): A dialogue concerning the statement P = "X is carcinogenic to humans"

• Snapshot status of Claim P: Open
  assert (Participant 1: (P, confirmed))
  query (Participant 2: assert (Participant 1: (P, confirmed)))
  show_arg (Participant 1: (K4, R3, P, (Confirmed, Valid, Confirmed)))

• Snapshot status of Claim P: Accepted
  contest (Participant 2: assert (Participant 1: (P, confirmed)))
  query (Participant 3: contest (Participant 2: assert (Participant 1: (P, confirmed))))
  propose (Participant 2: (not-P, Plausible))
  query (Participant 1: propose (Participant 2: (not-P, Plausible)))
  show_arg (Participant 2: ((K1, K3), R2, not-P, (Confirmed, Probable, Valid, Plausible)))

• Snapshot status of Claim P: Plausible
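Running these three snapshots through the claim_label sketch from Page 18 reproduces the statuses above (our illustration; the argument encodings are hypothetical):

# Snapshot 1: nothing presented yet.
claim_label([], [], [])                          # -> "Open"

# Snapshot 2: Participant 1's argument (K4, R3 |- P); no attacks presented.
arg_for_p = {"grounded": True, "consistent": True}
claim_label([arg_for_p], [], [])                 # -> "Accepted"

# Snapshot 3: Participant 2's argument ((K1, K3), R2 |- not-P) rebuts P
# and has not itself been attacked.
rebuttal = "((K1, K3), R2) argument for not-P"
claim_label([arg_for_p], [rebuttal], [])         # -> "Plausible"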

Page 28: What's next

A model of a deliberation dialogue:
• Dialogues about what action(s) to take.
• We have proposed a model based on Wohlrapp's (1998) retroflexive argumentation, a model of non-deductive inference (joint work with David Hitchcock).

Locutions specific to the regulatory domain:
• We have proposed a first set using Habermas' (1981) Theory of Communicative Action.

A means to combine different types of dialogue:
• We have proposed a formalism using Parikh's (1985) Game Logic, a version of Dynamic Modal Logic (the modal logic of processes).

A qualitative decision theory:
• Will draw on Fox and Parsons (1998).

Page 29: Other formal properties under exploration

• Can we automate these dialogues?
• Will automated dialogues ever terminate?
  – Under what circumstances?
  – After how many moves? (Computational complexity.)
• When are two dialogues the same?
• How do we assess the quality of a dialogue system?
• How sensitive is the framework to changes in the game rules?

Page 30: Thanks to

EPSRC:
• Grant GR/L84117: Qualitative Decision Theory
• Grant GR/N35441/01: Symposium on Argument and Computation
• PhD studentship.

European Union Information Society Technologies Programme (IST):
• Sustainable Lifecycles in Information Ecosystems (SLIE) (IST-1999-10948).

Trevor Bench-Capon, Computer Science Dept, University of Liverpool.
John Fox, Advanced Computation Laboratory, Imperial Cancer Research Fund, London.
David Hitchcock, Philosophy Dept, McMaster University, Hamilton, Ontario.
Anonymous referees (UAI, GTDT, AMAI).