
Bayesian Networks
• Using random variables to represent objects and events in the world
– Various instantiations of these variables can model the current state of the world
• Computing joint probabilities over these variables

• Estimating joint probabilities over all random variables often leads to a combinatorial explosion
– Estimating probabilities of every combination of values for the variables involved

Conditional independence
• Use the chain rule to replace joint probabilities with conditional ones:
P(A1, A2, A3, A4, A5) = P(A1 | A2, A3, A4, A5) · P(A2 | A3, A4, A5) · P(A3 | A4, A5) · P(A4 | A5) · P(A5)

• Bayesian Networks allow us to reduce the terms even further, by taking conditional independence into account:
– P(A | C1, …, Cn, U) = P(A | C1, …, Cn) for some set U of random variables
– Given {C1, …, Cn}, A is independent of U
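The chain rule is an identity that holds for any joint distribution, which can be checked numerically. Below is a small sketch over an arbitrary made-up three-variable joint (the numbers are illustrative, not from the slides):

```python
import itertools
import random

random.seed(0)

# An arbitrary joint distribution over three binary variables A1, A2, A3
# (made-up numbers; any joint distribution works).
weights = {w: random.random() for w in itertools.product([0, 1], repeat=3)}
total = sum(weights.values())
joint = {w: v / total for w, v in weights.items()}

def marginal(fixed):
    """P(assignments in `fixed`), summing the joint over the free variables."""
    return sum(p for w, p in joint.items()
               if all(w[i] == v for i, v in fixed.items()))

# Chain rule: P(A1, A2, A3) = P(A1 | A2, A3) * P(A2 | A3) * P(A3),
# where each conditional is a ratio of marginals.
for a1, a2, a3 in itertools.product([0, 1], repeat=3):
    lhs = joint[(a1, a2, a3)]
    rhs = (marginal({0: a1, 1: a2, 2: a3}) / marginal({1: a2, 2: a3})) \
        * (marginal({1: a2, 2: a3}) / marginal({2: a3})) \
        * marginal({2: a3})
    assert abs(lhs - rhs) < 1e-12
print("chain rule verified on all 8 worlds")
```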

Conditional independence (cont’d)
• The structure of a Bayesian Network reflects a number of independence assumptions (d-separation criterion)
• Directed arcs between random variables represent conditional dependencies
• Combined with the chain rule, conditional independencies allow us to manipulate smaller conditional probability tables to estimate the joint probabilities

Ind( Xi ; {X1, …, Xi-1} \ Pai | Pai ), where Pai are the parents of Xi
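The savings from smaller tables can be made concrete with a quick parameter count. The 6-node binary network below is an assumption for illustration, modeled on the rain/sprinklers example that appears later in these slides:

```python
# Sketch: parameter counts for a full joint vs. a factored Bayesian Network
# over binary variables. The structure (which node has which parents) is an
# assumption chosen to match the rain/sprinklers example in these slides.
parents = {
    "rain": [],
    "sprinklers": [],
    "street": ["rain", "sprinklers"],
    "lawn": ["rain", "sprinklers"],
    "soil": ["lawn"],
    "roses": ["soil"],
}

n = len(parents)
# Full joint: one free parameter per world, minus one (probabilities sum to 1).
full_joint_params = 2 ** n - 1
# Factored: one free parameter per row of each node's conditional table,
# with 2^(number of parents) rows per binary node.
factored_params = sum(2 ** len(pa) for pa in parents.values())

print(full_joint_params)  # 63
print(factored_params)    # 14
```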

Formal notation
• Each random variable A is conditionally independent of all other random variables that are not descendants of A, given A’s parents

P(x1, …, xn) = ∏ i=1..n P(xi | pai)    (network factorization)

P(x1, …, xn) = ∏ i=1..n P(xi | x1, …, xi−1)    (general chain rule)

Example: modeling the real world
• “Mary walks outside and finds that the street and lawn are wet. She concludes that it has just rained recently. Furthermore, she decides that she doesn’t need to water her roses.”
• Mary’s logic:
– rain or sprinklers → street = wet
– rain or sprinklers → lawn = wet
– lawn = wet → soil = moist
– soil = moist → roses = OK

Mary’s world (cont’d)
• Let’s compute the probability of the following (intuitively unlikely) world event:
– The roses are OK
– The soil is dry
– The lawn is wet
– The street is wet
– The sprinklers are off
– It’s raining

• Where are the missing probabilities, e.g., P(sprinklers=F) or P(street=dry | rain=T, sprinklers=T)? They are implied by normalization: each conditional distribution sums to 1, so e.g. P(sprinklers=F) = 1 − P(sprinklers=T)

Computing the probability of a complex event
• P(sprinklers=F, rain=T, street=wet, lawn=wet, soil=dry, roses=OK) =
P(roses=OK | soil=dry) ·
P(soil=dry | lawn=wet) ·
P(lawn=wet | rain=T, sprinklers=F) ·
P(street=wet | rain=T, sprinklers=F) ·
P(sprinklers=F) · P(rain=T) =
0.2 · 0.1 · 1.0 · 1.0 · 0.6 · 0.7 = 0.0084

The event is quite unlikely!
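The product above can be sketched directly in code, using exactly the factor values quoted on this slide:

```python
# Factors from the slide's conditional probability tables.
p_roses_ok_given_soil_dry = 0.2
p_soil_dry_given_lawn_wet = 0.1
p_lawn_wet_given_rain_no_spr = 1.0
p_street_wet_given_rain_no_spr = 1.0
p_sprinklers_off = 0.6
p_rain = 0.7

# Joint probability of the whole world, via the network factorization:
# each variable is conditioned only on its parents.
p_world = (p_roses_ok_given_soil_dry
           * p_soil_dry_given_lawn_wet
           * p_lawn_wet_given_rain_no_spr
           * p_street_wet_given_rain_no_spr
           * p_sprinklers_off
           * p_rain)

print(round(p_world, 4))  # 0.0084
```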

Usage scenarios
• There are two types of computations performed with Bayesian Networks
– Belief updating
• Computation of probabilities over random variables
– Belief revision
• Finding the maximally probable global assignment
• Given some evidence or observation, our task is to come up with a set of hypotheses (variable assignments) that together constitute the most satisfactory explanation/interpretation of the evidence at hand

Belief revision
• Let W be the set of all random variables in the network
• Let e be the evidence (e is a subset of W)
• Any instantiation of all the variables in W that is consistent with e is called an explanation or interpretation of e
• The problem is to find an explanation w* such that P(w* | e) = max_w P(w | e)
– w* is called the most probable explanation

Belief updating
• Computing marginal probabilities of a subset of random variables given the evidence
• Typically, the task is to determine the best instantiation of a single random variable given the evidence
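Belief updating can be sketched by brute-force enumeration over the joint distribution. The toy three-variable joint below is made up for illustration (real systems use smarter algorithms, such as variable elimination):

```python
import itertools

# Toy joint distribution over three binary variables (indices 0, 1, 2);
# the numbers are made up for illustration and are not from the slides.
probs = [0.10, 0.05, 0.15, 0.10, 0.20, 0.05, 0.25, 0.10]
joint = dict(zip(itertools.product([0, 1], repeat=3), probs))

def posterior(var_index, value, evidence):
    """P(X[var_index] = value | evidence), by summing consistent worlds.

    `evidence` maps variable indices to observed values."""
    consistent = {w: p for w, p in joint.items()
                  if all(w[i] == v for i, v in evidence.items())}
    numer = sum(p for w, p in consistent.items() if w[var_index] == value)
    denom = sum(consistent.values())  # P(evidence), the normalizer
    return numer / denom

# Example query: P(X0 = 1 | X2 = 0)
print(posterior(0, 1, {2: 0}))
```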

Belief revision - example
• Evidence: e = {roses = OK}
• Goal: determine the assignment to all the random variables that maximizes P(W | e)
– P(e) is constant, and e is a subset of W, so it is sufficient to maximize P(W)
– Intuitively, the non-evidence random variables in W can be viewed as possible hypotheses for e
• Solution: P(sprinklers=F, rain=T, street=wet, lawn=wet, soil=wet, roses=OK) = 0.2646
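The MPE search can be sketched by enumerating all worlds consistent with the evidence. The transcript does not show the full CPTs, so several entries below are assumptions (marked in comments), chosen to be consistent with the numbers the slides do quote; under these assumptions the enumeration reproduces the slide's solution:

```python
import itertools

# Hedged sketch: CPT entries marked "assumed" are guesses, chosen to be
# consistent with the values the slides quote (0.7, 0.6, 0.1, 0.2, 0.2646).
P_RAIN = 0.7            # from the slides
P_SPRINKLERS = 0.4      # assumed (slides give P(sprinklers=F) = 0.6)
P_SOIL_WET = {True: 0.9, False: 0.1}   # P(soil=wet | lawn): 0.9 follows from
                                       # the slides' 0.1; 0.1 is assumed
P_ROSES_OK = {True: 0.7, False: 0.2}   # P(roses=OK | soil): 0.2 from the
                                       # slides; 0.7 is assumed

def world_prob(rain, spr, street, lawn, soil, roses):
    """Joint probability via the network factorization. Street and lawn are
    assumed deterministically wet iff it rains or the sprinklers are on."""
    p = (P_RAIN if rain else 1 - P_RAIN)
    p *= (P_SPRINKLERS if spr else 1 - P_SPRINKLERS)
    wet = rain or spr
    if street != wet or lawn != wet:   # assumed deterministic CPT rows
        return 0.0
    p *= P_SOIL_WET[lawn] if soil else 1 - P_SOIL_WET[lawn]
    p *= P_ROSES_OK[soil] if roses else 1 - P_ROSES_OK[soil]
    return p

# MPE for evidence {roses = OK}: since P(e) is constant, maximize P(W) over
# all worlds (rain, sprinklers, street, lawn, soil, roses) with roses = True.
best = max((w for w in itertools.product([True, False], repeat=6) if w[5]),
           key=lambda w: world_prob(*w))
print(best, round(world_prob(*best), 4))
```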

Belief updating - example
• Evidence: e = {roses = OK}
• Goal: determine the probability that the lawn is either wet or dry given this observation
– P(lawn=dry | roses=OK) = 0.1190
– P(lawn=wet | roses=OK) = 0.8810
