Bayesian Nets in Student Modeling ITS- Sept 30, 2004


Page 1: Bayesian Nets in Student Modeling ITS- Sept 30, 2004

Bayesian Nets in Student Modeling

ITS- Sept 30, 2004

Page 2: Bayesian Nets in Student Modeling ITS- Sept 30, 2004

Sources of Uncertainty

• Incomplete and/or incorrect knowledge
• Slips and/or guesses
• Multiple derivations
• Invisible inferences
• Not showing all work
• Help messages
• Self-explaining ahead

Page 3: Bayesian Nets in Student Modeling ITS- Sept 30, 2004

Andes student model

• Knowledge tracing
• Plan recognition
• 1st to use the student's domain knowledge
• Action prediction
• Andes first to support all three

Page 4: Bayesian Nets in Student Modeling ITS- Sept 30, 2004

Goals of Andes

• Students work as much as possible alone
• React to the student's incorrect action: signal the error, explain
• React to the student's impasse: provide procedural help
• Assure the student understands examples: prompt self-explaining

Page 5: Bayesian Nets in Student Modeling ITS- Sept 30, 2004

Types of help

• Error help
• Procedural help (ask for hints)
• Unsolicited help (for non-physics errors)
• Different levels of hints 'til the "bottom-out hint"

Page 6: Bayesian Nets in Student Modeling ITS- Sept 30, 2004

Usage of student model

• Plan recognition: recognize and support goals (requires prediction)
• Assess knowledge: help presentation (reminder vs. minilesson)
• Assess mastery level: prompt self-explanation or not

Page 7: Bayesian Nets in Student Modeling ITS- Sept 30, 2004

Self-Explaining Coach

• Step correctness (domain)
  • Rule Browser
  • E.g.: using force or acceleration
• Step utility (role in solution plan)
  • Plan Browser
  • Recognize goals

Page 8: Bayesian Nets in Student Modeling ITS- Sept 30, 2004

Bayesian network

• Solution graph: a map of all solutions, with no variables (propositional)

Page 9: Bayesian Nets in Student Modeling ITS- Sept 30, 2004

Types of nodes

• Domain-general: rules
  • Two values indicating mastery
• Task-specific: fact, goal, rule-application, and strategy nodes (sketched below)
  • Doable (done already, or knows everything needed)
  • Not-doable
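A minimal sketch (in Python, not the actual Andes code) of how the propositional solution graph from the previous slide and the node types above could be represented; the class names, fields, and the example fragment are illustrative assumptions.

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class RuleNode:
        """Domain-general node: a physics rule with two values (mastered / not mastered)."""
        name: str
        p_mastered: float = 0.5      # prior, e.g. from pretest scores (assumed value)

    @dataclass
    class PropositionNode:
        """Task-specific fact or goal: doable (derived/known) or not-doable."""
        name: str
        parents: List["RuleApplicationNode"] = field(default_factory=list)

    @dataclass
    class RuleApplicationNode:
        """Task-specific application of one rule to specific preconditions."""
        rule: RuleNode
        preconditions: List[PropositionNode]
        result: PropositionNode

    # Example fragment: applying Newton's second law once the body has been chosen.
    a_is_body = PropositionNode("A-is-body")
    newton2 = RuleNode("Newtons-2nd-law")
    net_force = PropositionNode("sum-F = m*a for A")
    app = RuleApplicationNode(newton2, [a_is_body], net_force)
    net_force.parents.append(app)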

Page 10: Bayesian Nets in Student Modeling ITS- Sept 30, 2004

Knowledge evolution

• Dynamic Bayesian network
• Analyze each exercise alone
• Roll-up: prior probabilities set to the marginal probabilities from the previous exercise (sketched below)
• Improvements: could model dependencies & knowledge decay
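A small sketch, assuming the roll-up works as summarized above (this is not Andes code): the posterior marginals of the rule nodes from the finished exercise become the priors for the next exercise's network. The optional decay factor is a purely hypothetical stand-in for the "knowledge decay" improvement mentioned in the last bullet; all names and numbers are made up.

    from typing import Dict

    def roll_up(posteriors: Dict[str, float],
                pretest_priors: Dict[str, float],
                decay: float = 0.0) -> Dict[str, float]:
        """Priors for the rule nodes in the next exercise's network.

        decay = 0.0 gives plain roll-up: next prior = previous posterior marginal.
        decay > 0.0 mixes the posterior back toward the pretest prior, one simple
        (hypothetical) way to model forgetting between sessions.
        """
        return {rule: (1.0 - decay) * p + decay * pretest_priors[rule]
                for rule, p in posteriors.items()}

    # Example: marginals at the end of exercise k seed exercise k+1.
    next_priors = roll_up({"Newtons-2nd-law": 0.72}, {"Newtons-2nd-law": 0.50})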

Page 11: Bayesian Nets in Student Modeling ITS- Sept 30, 2004

Intention or ability?

• Probability that the student can and IS implementing a certain goal
• A decision-theoretic tutor keeps probabilities of the "focus of attention"

Page 12: Bayesian Nets in Student Modeling ITS- Sept 30, 2004

Problem creation

• Givens
• Goals
• Problem-solver applies rules, generating subgoals until done
• Solution graph created (see the sketch below)
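A rough sketch of the generation step described above, assuming a purely propositional backward-chainer (the actual Andes problem solver is not shown in the slides); the rule names, givens, and goal used here are invented for illustration.

    from typing import Dict, List, Set, Tuple

    # rule name -> (preconditions, effect); propositional, no variables
    Rules = Dict[str, Tuple[List[str], str]]

    def build_solution_graph(givens: Set[str], goals: List[str], rules: Rules):
        graph = []                       # (rule, preconditions, effect) applications
        pending = list(goals)
        seen = set()
        while pending:
            goal = pending.pop()
            if goal in givens or goal in seen:
                continue
            seen.add(goal)
            for rule, (pre, effect) in rules.items():
                if effect == goal:       # this rule can derive the pending goal
                    graph.append((rule, pre, effect))
                    pending.extend(pre)  # its preconditions become new subgoals
        return graph

    # Tiny invented example problem
    rules = {"Newtons-2nd-law": (["A-is-body", "forces-on-A"], "sum-F=m*a"),
             "draw-free-body":  (["A-is-body"], "forces-on-A")}
    print(build_solution_graph({"A-is-body"}, ["sum-F=m*a"], rules))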

Page 13: Bayesian Nets in Student Modeling ITS- Sept 30, 2004

Andes assessor

• Dynamic belief network for domain-general nodes
• Rules: priors set by test scores
• Context-Rules (restated in code below):
  • P(CR = true | R = true) = 1
  • P(CR = true | R = false) = difficulty
  • When one context changes, adjust the rest
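The Context-Rule table above can be written out directly; this is just a restatement in code of the two conditional probabilities on the slide. The marginalization step and the function names are my own, and "difficulty" is simply the slide's name for the second parameter.

    def p_cr_true(r_mastered: bool, difficulty: float) -> float:
        """P(CR = true | R): 1 if the general rule is mastered, else 'difficulty'."""
        return 1.0 if r_mastered else difficulty

    def p_cr_marginal(p_r: float, difficulty: float) -> float:
        """Marginalize over R: P(CR=T) = P(R=T)*1 + (1 - P(R=T)) * difficulty."""
        return p_r * p_cr_true(True, difficulty) + (1.0 - p_r) * p_cr_true(False, difficulty)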

Page 14: Bayesian Nets in Student Modeling ITS- Sept 30, 2004

Task-specific nodes

• Fact, goal, rule-application, strategy
• Context-Rule nodes link task-specific nodes to domain-general rules

Page 15: Bayesian Nets in Student Modeling ITS- Sept 30, 2004

Fact & Goal Nodes

• A.k.a. propositional nodes
• One parent for each way to derive the proposition
• Leaky-OR: true if at least one parent is true, and sometimes true even if none is (sketched below)
• Reasons for the leak: guessing, analogy, etc.
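A minimal sketch of the leaky-OR combination just described, assuming the parents are independent and that "leak" is the small probability the student produces the fact anyway (guessing, analogy); this is illustrative, not the Andes implementation.

    from math import prod
    from typing import List

    def leaky_or(p_parents: List[float], leak: float) -> float:
        """P(fact/goal = true): true if at least one derivation (parent) holds,
        and true with probability `leak` even if none does."""
        p_no_derivation = prod(1.0 - p for p in p_parents)   # assumes independent parents
        return 1.0 - (1.0 - leak) * p_no_derivation

    # Example: two possible derivations, each available with probability 0.6,
    # plus a 5% leak for guessing.
    print(leaky_or([0.6, 0.6], leak=0.05))   # ≈ 0.848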

Page 16: Bayesian Nets in Student Modeling ITS- Sept 30, 2004

Rule-Application Nodes

• Connect Context-Rule, Strategy & Prop nodes to newly derived Prop nodes
• Values: doable or not-doable
• Parents: one Context-Rule, the precondition Props, and sometimes one Strategy node
• Noisy-AND: true if ALL parents are true, but only with probability 1 - alpha (sketched below)
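A matching sketch of the noisy-AND combination above, again assuming independent parents; 1 - alpha is the probability that the application is doable when the Context-Rule, every precondition, and (if present) the strategy all hold. Illustrative only, with invented numbers.

    from math import prod
    from typing import List, Optional

    def noisy_and(p_context_rule: float,
                  p_preconditions: List[float],
                  alpha: float,
                  p_strategy: Optional[float] = None) -> float:
        """P(rule application = doable), assuming independent parents."""
        parents = [p_context_rule] + p_preconditions
        if p_strategy is not None:
            parents.append(p_strategy)
        return (1.0 - alpha) * prod(parents)   # all parents true, minus the noise

    # Example: CR known with 0.8, one precondition with 0.9, alpha = 0.1.
    print(noisy_and(0.8, [0.9], alpha=0.1))   # 0.648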

Page 17: Bayesian Nets in Student Modeling ITS- Sept 30, 2004

Strategy Nodes

• Used when there is more than one way to reach a Goal
• Paired with a Goal node
• Values are mutually exclusive
• No parents in the network
• Priors = frequency with which students use each strategy (illustrated below)
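A tiny illustration of a strategy node as described above: one multi-valued variable per goal, values mutually exclusive, priors set to observed usage frequencies. The goal, strategies, and frequencies here are invented.

    strategy_node = {
        "goal": "find-acceleration-of-A",
        "values": {"Newtons-2nd-law": 0.7, "kinematics": 0.3},  # mutually exclusive, sum to 1
    }
    assert abs(sum(strategy_node["values"].values()) - 1.0) < 1e-9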

Page 18: Bayesian Nets in Student Modeling ITS- Sept 30, 2004

Compare Figures

• Figure 9: before observing A-is-body
• Figure 10: after observing A-is-body

Page 19: Bayesian Nets in Student Modeling ITS- Sept 30, 2004

Hints

• Giving a hint adds a new parent to the Prop node
• Accounts for guessing: the correct entry may follow from the hint rather than from mastery (worked example below)
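A worked example of why the extra hint parent accounts for guessing, under simplifying assumptions of my own (P(entry | rule mastered) = 1, invented leak and hint-uptake values): with the hint available as an alternative explanation for the correct entry, the posterior on rule mastery rises much less.

    def posterior_rule_mastery(prior_r: float, p_entry_given_not_r: float) -> float:
        """P(rule mastered | correct entry), assuming P(entry | mastered) = 1."""
        return prior_r / (prior_r + (1.0 - prior_r) * p_entry_given_not_r)

    leak, hint_uptake, prior = 0.05, 0.7, 0.5
    no_hint = posterior_rule_mastery(prior, leak)                                        # ≈ 0.95
    with_hint = posterior_rule_mastery(prior, 1.0 - (1.0 - hint_uptake) * (1.0 - leak))  # ≈ 0.58
    print(no_hint, with_hint)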

Page 20: Bayesian Nets in Student Modeling ITS- Sept 30, 2004

SE-Coach

• Adds nodes for Read actions
  • Linked to Prop nodes
  • Longer read time, higher probability the student knows the Prop (p. 26; hypothetical mapping sketched below)
• Adds nodes for plan selection
  • Linked to Context-Rules
• Rule-Application node: probability of being true when the student knows the CR & all preconditions = Noisy-AND
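The slides only state the qualitative relation (longer reading time, higher probability the student knows the Prop). As a purely hypothetical placeholder, one monotone mapping could look like the sketch below; the threshold and probabilities are not taken from SE-Coach.

    def p_knows_prop_given_read(read_seconds: float,
                                p_low: float = 0.3,
                                p_high: float = 0.9,
                                enough: float = 15.0) -> float:
        """Hypothetical monotone mapping from reading time to P(knows Prop | read)."""
        frac = min(read_seconds / enough, 1.0)
        return p_low + (p_high - p_low) * frac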

Page 21: Bayesian Nets in Student Modeling ITS- Sept 30, 2004

Evaluation

• Simulated students: 65% correct on rule mastery
• 95% if no "invisible inferences" and the student has to "show all work"
• Post-test shows significant learning
• Voluntary acceptance?
• Accuracy of plan recognition