
Page 1:

Unbounded Knowledge Acquisition Based Upon Mutual Information in Dependent Questions

Tony C. Smith & Chris van de Molen

Department of Computer Science, Waikato University

[email protected]

AI 2010

Page 2:

Outline

Motivation/background

Representations of knowledge

Page 3:

Page 4:

Truth table (attribute vs. entity):

              Fruitbat   Eagle   Tiger   Rock   …
Is alive?         T        T       T      F
Flies?            T        T       F      F
Lays eggs?        F        T       F      F
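A minimal sketch of how such a table might be held in code (the dict layout and the helper function below are illustrative assumptions, not from the paper): each question maps each entity to a truth value, and filtering on an answer narrows the candidate entities.

# Illustrative sketch (not from the paper): the boolean truth table above
# held as a nested dict, keyed by question (attribute) and then by entity.
TRUTH_TABLE = {
    "Is alive?":  {"Fruitbat": True,  "Eagle": True, "Tiger": True,  "Rock": False},
    "Flies?":     {"Fruitbat": True,  "Eagle": True, "Tiger": False, "Rock": False},
    "Lays eggs?": {"Fruitbat": False, "Eagle": True, "Tiger": False, "Rock": False},
}

def entities_matching(question, answer):
    """Return the entities whose stored value agrees with a yes/no answer."""
    return [e for e, v in TRUTH_TABLE[question].items() if v == answer]

print(entities_matching("Flies?", True))   # ['Fruitbat', 'Eagle']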

Page 5:

Multivalued truth table (attribute vs. entity):

              Fruitbat   Eagle   Tiger   Rock   …
Is alive?       0.90      0.96    0.88   0.01
Flies?          0.85      0.97    0.04   0.22
Lays eggs?      0.20      0.91    0.00   0.01

Answers: YES, NO, SOMETIMES, USUALLY, MAYBE, SELDOM, RARELY, etc.
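One way to populate such a table (an illustrative sketch: the word-to-probability mapping and the moving-average update rule below are assumptions, not the authors' method) is to convert each qualitative answer to a probability and fold it into the stored cell value.

# Illustrative sketch: map qualitative answers onto assumed probabilities,
# then blend each new response into the stored multivalued table.
ANSWER_PROB = {
    "YES": 1.0, "USUALLY": 0.9, "SOMETIMES": 0.5, "MAYBE": 0.5,
    "SELDOM": 0.2, "RARELY": 0.1, "NO": 0.0,
}

def record_answer(table, question, entity, answer, weight=0.1):
    """Nudge a cell toward the probability implied by a new answer."""
    old = table.setdefault(question, {}).get(entity, 0.5)
    table[question][entity] = (1 - weight) * old + weight * ANSWER_PROB[answer]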

Page 6:

Mutual Information

The information (i.e. uncertainty) in an event ω whose probability is p(ω) can be expressed in bits as

I(ω) = -log2 p(ω)

Entropy is the average information content:

H = -Σ_ω p(ω) log2 p(ω)

Mutual information is the amount of information two events share

MI(x, y) = I(x) + I(y) - [I(x) + I(y|x)]

where the bracketed term is the joint information I(x, y).
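These definitions translate directly into a few lines of Python (a sketch; the function names are mine, not the paper's, and the probabilities are assumed to be known pointwise):

import math

def info(p):
    # Self-information in bits: I(w) = -log2 p(w)
    return -math.log2(p)

def entropy(probs):
    # Average information content: H = -sum over w of p(w) * log2 p(w)
    return -sum(p * math.log2(p) for p in probs if p > 0)

def mutual_info(p_x, p_y, p_y_given_x):
    # MI(x, y) = I(x) + I(y) - [I(x) + I(y|x)]
    return info(p_x) + info(p_y) - (info(p_x) + info(p_y_given_x))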

Page 7:

Mutual information example

Given:
P(A) = 1/32
P(B) = 1/64
P(B|A) = 1/2
P(A|B) = 1/4

Then:
I(A) = 5
I(B) = 6
I(B|A) = 1
MI(A,B) = 5 + 6 - (5 + 1) = 5
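Plugging the given probabilities into the sketch after the Mutual Information slide reproduces this arithmetic (reusing the info and mutual_info helpers defined there):

print(info(1/32))                     # I(A)   = 5.0
print(info(1/64))                     # I(B)   = 6.0
print(info(1/2))                      # I(B|A) = 1.0
print(mutual_info(1/32, 1/64, 1/2))   # MI(A,B) = 5 + 6 - (5 + 1) = 5.0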