Markov Logic
Stanley Kok
Dept. of Computer Science & Eng.
University of Washington
Joint work with Pedro Domingos, Daniel Lowd, Hoifung Poon, Matt Richardson,
Parag Singla and Jue Wang
Overview
Motivation
Background
Markov logic
Inference
Learning
Software
Applications
Motivation

Most learners assume i.i.d. data (independent and identically distributed):
  One type of object
  Objects have no relation to each other
Real applications: dependent, variously distributed data:
  Multiple types of objects
  Relations between objects
Examples
Web search
Medical diagnosis
Computational biology
Social networks
Information extraction
Natural language processing
Perception
Ubiquitous computing
Etc.
Costs/Benefits of Markov Logic
Benefits:
  Better predictive accuracy
  Better understanding of domains
  Growth path for machine learning
Costs:
  Learning is much harder
  Inference becomes a crucial issue
  Greater complexity for user
Overview
Motivation
Background
Markov logic
Inference
Learning
Software
Applications
Markov Networks

Undirected graphical models

[Graph: Smoking, Cancer, Asthma, Cough]
Potential functions defined over cliques
Smoking Cancer Ф(S,C)
False False 4.5
False True 4.5
True False 2.7
True True 4.5
$P(x) = \frac{1}{Z} \prod_c \Phi_c(x_c)$

$Z = \sum_x \prod_c \Phi_c(x_c)$
Markov Networks

Undirected graphical models
Log-linear model:

$P(x) = \frac{1}{Z} \exp\left(\sum_i w_i f_i(x)\right)$

where $w_i$ is the weight of feature $i$.

E.g.: $f_1(\mathrm{Smoking}, \mathrm{Cancer}) = \begin{cases} 1 & \text{if } \neg\mathrm{Smoking} \vee \mathrm{Cancer} \\ 0 & \text{otherwise} \end{cases}$, with weight $w_1 = 1.5$.

[Graph: Smoking, Cancer, Asthma, Cough]
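The log-linear model on this slide can be checked numerically by brute force. A minimal sketch (the feature and weight mirror the slide's example; everything else is illustrative):

```python
import math
from itertools import product

# Minimal sketch of P(x) = (1/Z) exp(sum_i w_i f_i(x)) for a
# two-variable world (Smoking, Cancer) with the slide's single
# feature f1 = 1 if (not Smoking) or Cancer, weight w1 = 1.5.
w1 = 1.5

def f1(smoking, cancer):
    return 1.0 if (not smoking) or cancer else 0.0

states = list(product([False, True], repeat=2))
unnorm = {x: math.exp(w1 * f1(*x)) for x in states}  # exp(w1 * f1(x))
Z = sum(unnorm.values())                             # partition function

def P(smoking, cancer):
    return unnorm[(smoking, cancer)] / Z

# The distribution normalizes, and the one state that violates f1
# (Smoking=True, Cancer=False) is the least probable.
assert abs(sum(P(*x) for x in states) - 1.0) < 1e-9
assert all(P(True, False) <= P(*x) for x in states)
```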
Hammersley-Clifford Theorem

If the distribution is strictly positive (P(x) > 0)
and the graph encodes its conditional independences,
then the distribution is a product of potentials over cliques of the graph.

The converse is also true.

("Markov network = Gibbs distribution")
Markov Nets vs. Bayes Nets
Property         Markov Nets        Bayes Nets
Form             Prod. potentials   Prod. potentials
Potentials       Arbitrary          Cond. probabilities
Cycles           Allowed            Forbidden
Partition func.  Z = ?              Z = 1
Indep. check     Graph separation   D-separation
Indep. props.    Some               Some
Inference        MCMC, BP, etc.     Convert to Markov
First-Order Logic
Constants, variables, functions, predicates
  E.g.: Anna, x, MotherOf(x), Friends(x, y)
Literal: Predicate or its negation
Clause: Disjunction of literals
Grounding: Replace all variables by constants
  E.g.: Friends(Anna, Bob)
World (model, interpretation): Assignment of truth values to all ground predicates
Overview
Motivation
Background
Markov logic
Inference
Learning
Software
Applications
Markov Logic: Intuition
A logical KB is a set of hard constraints on the set of possible worlds
Let's make them soft constraints: when a world violates a formula, it becomes less probable, not impossible
Give each formula a weight (higher weight ⇒ stronger constraint)

$P(\mathrm{world}) \propto \exp\left(\sum \text{weights of formulas it satisfies}\right)$
Markov Logic: Definition
A Markov Logic Network (MLN) is a set of pairs (F, w) where
  F is a formula in first-order logic
  w is a real number
Together with a set of constants, it defines a Markov network with
  One node for each grounding of each predicate in the MLN
  One feature for each grounding of each formula F in the MLN, with the corresponding weight w
Example: Friends & Smokers
Smoking causes cancer.
Friends have similar smoking habits.
Example: Friends & Smokers
$\forall x\; \mathrm{Smokes}(x) \Rightarrow \mathrm{Cancer}(x)$

$\forall x,y\; \mathrm{Friends}(x,y) \Rightarrow (\mathrm{Smokes}(x) \Leftrightarrow \mathrm{Smokes}(y))$
Example: Friends & Smokers
1.5   $\forall x\; \mathrm{Smokes}(x) \Rightarrow \mathrm{Cancer}(x)$

1.1   $\forall x,y\; \mathrm{Friends}(x,y) \Rightarrow (\mathrm{Smokes}(x) \Leftrightarrow \mathrm{Smokes}(y))$
Example: Friends & Smokers
1.5   $\forall x\; \mathrm{Smokes}(x) \Rightarrow \mathrm{Cancer}(x)$

1.1   $\forall x,y\; \mathrm{Friends}(x,y) \Rightarrow (\mathrm{Smokes}(x) \Leftrightarrow \mathrm{Smokes}(y))$
Two constants: Anna (A) and Bob (B)
Example: Friends & Smokers
1.5   $\forall x\; \mathrm{Smokes}(x) \Rightarrow \mathrm{Cancer}(x)$

1.1   $\forall x,y\; \mathrm{Friends}(x,y) \Rightarrow (\mathrm{Smokes}(x) \Leftrightarrow \mathrm{Smokes}(y))$
[Ground network: nodes Smokes(A), Smokes(B), Cancer(A), Cancer(B)]
Two constants: Anna (A) and Bob (B)
Example: Friends & Smokers
1.5   $\forall x\; \mathrm{Smokes}(x) \Rightarrow \mathrm{Cancer}(x)$

1.1   $\forall x,y\; \mathrm{Friends}(x,y) \Rightarrow (\mathrm{Smokes}(x) \Leftrightarrow \mathrm{Smokes}(y))$

[Ground network: nodes Smokes(A), Smokes(B), Cancer(A), Cancer(B), Friends(A,A), Friends(A,B), Friends(B,A), Friends(B,B)]
Two constants: Anna (A) and Bob (B)
Markov Logic Networks

MLN is a template for ground Markov nets
Probability of a world x:

$P(x) = \frac{1}{Z} \exp\left(\sum_i w_i n_i(x)\right)$

where $w_i$ is the weight of formula $i$ and $n_i(x)$ is the number of true groundings of formula $i$ in $x$.

Typed variables and constants greatly reduce size of ground Markov net
Functions, existential quantifiers, etc.
Infinite and continuous domains
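This semantics can be made concrete by brute-force enumeration over a tiny domain. A sketch (weights 1.5 and 1.1 as on the preceding slides; helper names are illustrative) that grounds the Friends & Smokers MLN over constants {A, B} and verifies that $\exp(\sum_i w_i n_i(x))$ normalizes to a distribution:

```python
import math
from itertools import product

# Sketch of MLN semantics on Friends & Smokers with constants {A, B}.
# Weights follow the slides (1.5, 1.1); helper names are illustrative.
people = ["A", "B"]
w_cancer, w_friends = 1.5, 1.1

def n_cancer(smokes, cancer):
    """True groundings of Smokes(x) => Cancer(x)."""
    return sum((not smokes[x]) or cancer[x] for x in people)

def n_friends(friends, smokes):
    """True groundings of Friends(x,y) => (Smokes(x) <=> Smokes(y))."""
    return sum((not friends[(x, y)]) or (smokes[x] == smokes[y])
               for x in people for y in people)

def all_worlds():
    """Every truth assignment to the 8 ground atoms."""
    fpairs = [(x, y) for x in people for y in people]
    for bits in product([False, True], repeat=4 + len(fpairs)):
        smokes = dict(zip(people, bits[0:2]))
        cancer = dict(zip(people, bits[2:4]))
        friends = dict(zip(fpairs, bits[4:]))
        yield smokes, cancer, friends

def weight(world):
    smokes, cancer, friends = world
    return math.exp(w_cancer * n_cancer(smokes, cancer)
                    + w_friends * n_friends(friends, smokes))

Z = sum(weight(w) for w in all_worlds())
probs = [weight(w) / Z for w in all_worlds()]
assert len(probs) == 256             # 2^8 possible worlds
assert abs(sum(probs) - 1.0) < 1e-9  # normalizes to a distribution
```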
Relation to Statistical Models
Special cases:
  Markov networks
  Markov random fields
  Bayesian networks
  Log-linear models
  Exponential models
  Max. entropy models
  Gibbs distributions
  Boltzmann machines
  Logistic regression
  Hidden Markov models
  Conditional random fields
Obtained by making all predicates zero-arity
Markov logic allows objects to be interdependent (non-i.i.d.)
Relation to First-Order Logic
Infinite weights ⇒ First-order logic
Satisfiable KB, positive weights ⇒ Satisfying assignments = Modes of distribution
Markov logic allows contradictions between formulas
Overview
Motivation
Background
Markov logic
Inference
Learning
Software
Applications
MAP/MPE Inference
Problem: Find most likely state of world given evidence

$\arg\max_y P(y \mid x)$

where $y$ is the query and $x$ is the evidence.
MAP/MPE Inference
Problem: Find most likely state of world given evidence
$\arg\max_y \frac{1}{Z_x} \exp\left(\sum_i w_i n_i(x, y)\right)$
MAP/MPE Inference
Problem: Find most likely state of world given evidence
$\arg\max_y \sum_i w_i n_i(x, y)$
MAP/MPE Inference
Problem: Find most likely state of world given evidence
This is just the weighted MaxSAT problem:

$\arg\max_y \sum_i w_i n_i(x, y)$

Use weighted SAT solver (e.g., MaxWalkSAT [Kautz et al., 1997])
Potentially faster than logical inference (!)
The WalkSAT Algorithm
for i ← 1 to max-tries do
    solution ← random truth assignment
    for j ← 1 to max-flips do
        if all clauses satisfied then
            return solution
        c ← random unsatisfied clause
        with probability p
            flip a random variable in c
        else
            flip variable in c that maximizes number of satisfied clauses
return failure
The MaxWalkSAT Algorithm
for i ← 1 to max-tries do
    solution ← random truth assignment
    for j ← 1 to max-flips do
        if ∑ weights(sat. clauses) > threshold then
            return solution
        c ← random unsatisfied clause
        with probability p
            flip a random variable in c
        else
            flip variable in c that maximizes ∑ weights(sat. clauses)
return failure, best solution found
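The pseudocode above can be rendered as a runnable sketch. This is an assumed, minimal realization (the clause representation and parameter defaults are mine, not from the talk):

```python
import random

def maxwalksat(clauses, n_vars, max_tries=10, max_flips=1000, p=0.5,
               threshold=None):
    """Weighted MaxSAT by stochastic local search, per the slide.
    `clauses` is a list of (weight, literals); a literal is a pair
    (var_index, value_that_satisfies_it)."""
    if threshold is None:
        threshold = sum(w for w, _ in clauses)  # demand full satisfaction

    def sat_weight(a):
        return sum(w for w, lits in clauses
                   if any(a[v] == val for v, val in lits))

    best, best_score = None, -1.0
    for _ in range(max_tries):
        assign = [random.random() < 0.5 for _ in range(n_vars)]
        for _ in range(max_flips):
            score = sat_weight(assign)
            if score > best_score:
                best, best_score = list(assign), score
            if score >= threshold:
                return assign
            unsat = [lits for w, lits in clauses
                     if not any(assign[v] == val for v, val in lits)]
            if not unsat:
                return assign
            lits = random.choice(unsat)
            if random.random() < p:              # random walk step
                v = random.choice(lits)[0]
            else:                                # greedy step
                def gain(u):
                    assign[u] = not assign[u]
                    s = sat_weight(assign)
                    assign[u] = not assign[u]
                    return s
                v = max((lv for lv, _ in lits), key=gain)
            assign[v] = not assign[v]
    return best  # failure: best solution found

random.seed(0)
# x0 should be True (weight 2), x1 should be False (weight 1): satisfiable.
sol = maxwalksat([(2.0, [(0, True)]), (1.0, [(1, False)])], 2)
assert sol == [True, False]
```

Note that this sketch grounds and scans every clause up front; the memory problem that motivates LazySAT on the next slide comes precisely from materializing all ground clauses this way.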
But … Memory Explosion
Problem: If there are n constants and the highest clause arity is c, the ground network requires O(n^c) memory

Solution: Exploit sparseness; ground clauses lazily
→ LazySAT algorithm [Singla & Domingos, 2006]
Computing Probabilities
P(Formula | MLN, C) = ?
  MCMC: Sample worlds, check formula holds
P(Formula1 | Formula2, MLN, C) = ?
  If Formula2 = conjunction of ground atoms:
    First construct min subset of network necessary to answer query (generalization of KBMC)
    Then apply MCMC (or other)
Can also do lifted inference [Braz et al., 2005]
Ground Network Construction
network ← Ø
queue ← query nodes
repeat
    node ← front(queue)
    remove node from queue
    add node to network
    if node not in evidence then
        add neighbors(node) to queue
until queue = Ø
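A runnable sketch of this construction (the graph representation and the example below are assumed for illustration):

```python
from collections import deque

def build_network(query_nodes, neighbors, evidence):
    """Breadth-first construction per the slide: expand outward from
    the query nodes, but do not expand past evidence nodes."""
    network, queue = set(), deque(query_nodes)
    while queue:
        node = queue.popleft()
        if node in network:
            continue
        network.add(node)
        if node not in evidence:
            queue.extend(neighbors.get(node, []))
    return network

# Chain A - B - C - D with evidence at C: the evidence node is
# included, but D behind it is irrelevant to the query and excluded.
nbrs = {"A": ["B"], "B": ["A", "C"], "C": ["B", "D"], "D": ["C"]}
assert build_network({"A"}, nbrs, evidence={"C"}) == {"A", "B", "C"}
```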
MCMC: Gibbs Sampling
state ← random truth assignment
for i ← 1 to num-samples do
    for each variable x
        sample x according to P(x | neighbors(x))
        state ← state with new value of x
P(F) ← fraction of states in which F is true
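A minimal runnable Gibbs sampler on the two-variable model from the earlier Markov-network slide (feature and weight from slide 8; the rest is illustrative). For this tiny model the exact marginal $P(\mathrm{Cancer}) = 2e^{1.5} / (3e^{1.5} + 1) \approx 0.62$ can be computed in closed form and checked against the estimate:

```python
import math
import random

# Gibbs sampling on the (Smoking, Cancer) model with the single
# feature f1 = 1 if (not Smoking) or Cancer, weight w1 = 1.5.
w1 = 1.5

def f1(s, c):
    return 1.0 if (not s) or c else 0.0

def gibbs_marginal(num_samples=20000, seed=1):
    """Estimate P(Cancer = True) by Gibbs sampling."""
    random.seed(seed)
    s = random.random() < 0.5
    c = random.random() < 0.5
    hits = 0
    for _ in range(num_samples):
        # Resample each variable from its conditional distribution.
        pt, pf = math.exp(w1 * f1(True, c)), math.exp(w1 * f1(False, c))
        s = random.random() < pt / (pt + pf)
        pt, pf = math.exp(w1 * f1(s, True)), math.exp(w1 * f1(s, False))
        c = random.random() < pt / (pt + pf)
        hits += c
    return hits / num_samples

# Exact marginal for comparison: 2*e^1.5 / (3*e^1.5 + 1) ~ 0.62
est = gibbs_marginal()
assert abs(est - 2 * math.e**1.5 / (3 * math.e**1.5 + 1)) < 0.05
```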
But … Insufficient for Logic
Problem:
  Deterministic dependencies break MCMC
  Near-deterministic ones make it very slow
Solution:
  Combine MCMC and WalkSAT
→ MC-SAT algorithm [Poon & Domingos, 2006]
Overview
Motivation
Background
Markov logic
Inference
Learning
Software
Applications
Learning
Data is a relational database
Closed world assumption (if not: EM)
Learning parameters (weights)
Learning structure (formulas)
Generative Weight Learning
Maximize likelihood
Numerical optimization (gradient or 2nd order)
No local maxima

$\frac{\partial}{\partial w_i} \log P_w(x) = n_i(x) - E_w[n_i(x)]$

where $n_i(x)$ is the number of times clause $i$ is true in the data, and $E_w[n_i(x)]$ is the expected number of times clause $i$ is true according to the MLN.

Requires inference at each step (slow!)
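The gradient identity can be verified numerically on a toy one-clause model. A sketch (the feature below is an assumed example, not from the talk):

```python
import math
from itertools import product

# Check d/dw log P_w(x) = n(x) - E_w[n(x)] on a one-feature model
# P_w(x) ~ exp(w * n(x)) over four (S, C) states, where the feature
# n(x) = 1 if (not S) or C is an assumed illustrative clause count.
def n(s, c):
    return 1.0 if (not s) or c else 0.0

states = list(product([False, True], repeat=2))

def log_p(w, x):
    Z = sum(math.exp(w * n(*y)) for y in states)
    return w * n(*x) - math.log(Z)

def expected_n(w):
    Z = sum(math.exp(w * n(*y)) for y in states)
    return sum(n(*y) * math.exp(w * n(*y)) / Z for y in states)

w, x, eps = 1.5, (True, False), 1e-6
numeric = (log_p(w + eps, x) - log_p(w - eps, x)) / (2 * eps)
analytic = n(*x) - expected_n(w)   # n_i(x) - E_w[n_i(x)]
assert abs(numeric - analytic) < 1e-6
```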
Pseudo-Likelihood
Likelihood of each variable given its neighbors in the data:

$PL(x) = \prod_i P(x_i \mid \mathrm{neighbors}(x_i))$

Does not require inference at each step
Widely used in vision, spatial statistics, etc.
But PL parameters may not work well for long inference chains
Discriminative Weight Learning
Maximize conditional likelihood of query (y) given evidence (x):

$\frac{\partial}{\partial w_i} \log P_w(y \mid x) = n_i(x, y) - E_w[n_i(x, y)]$

where $n_i(x, y)$ is the number of true groundings of clause $i$ in the data, and $E_w[n_i(x, y)]$ is the expected number of true groundings according to the MLN.

Approximate expected counts with:
  counts in MAP state of y given x (with MaxWalkSAT)
  with MC-SAT
Structure Learning
Generalizes feature induction in Markov nets
Any inductive logic programming approach can be used, but . . .
  Goal is to induce any clauses, not just Horn
  Evaluation function should be likelihood
  Requires learning weights for each candidate
    Turns out not to be bottleneck
    Bottleneck is counting clause groundings
    Solution: Subsampling
Structure Learning
Initial state: Unit clauses or hand-coded KB
Operators: Add/remove literal, flip sign
Evaluation function: Pseudo-likelihood + structure prior
Search: Beam, shortest-first, bottom-up
[Kok & Domingos, 2005; Mihalkova & Mooney, 2007]
Overview
Motivation
Background
Markov logic
Inference
Learning
Software
Applications
Alchemy
Open-source software including:
  Full first-order logic syntax
  Generative & discriminative weight learning
  Structure learning
  Weighted satisfiability and MCMC
  Programming language features
alchemy.cs.washington.edu
Overview
Motivation
Background
Markov logic
Inference
Learning
Software
Applications
Applications
Basics
Logistic regression
Hypertext classification
Information retrieval
Entity resolution
Bayesian networks
Etc.
Running Alchemy
Programs: Infer, Learnwts, Learnstruct (plus Options)
MLN file: Types (optional), Predicates, Formulas
Database files
Uniform Distribn.: Empty MLN
Example: Unbiased coin flips
Type: flip = { 1, … , 20 }
Predicate: Heads(flip)
$P(\mathrm{Heads}(f)) = \frac{\tfrac{1}{Z} e^0}{\tfrac{1}{Z} e^0 + \tfrac{1}{Z} e^0} = \frac{1}{2}$
Binomial Distribn.: Unit Clause
Example: Biased coin flips
Type: flip = { 1, … , 20 }
Predicate: Heads(flip)
Formula: Heads(f)
Weight: Log odds of heads:
By default, MLN includes unit clauses for all predicates
(captures marginal distributions, etc.)
$P(\mathrm{Heads}(f)) = \frac{\tfrac{1}{Z} e^w}{\tfrac{1}{Z} e^0 + \tfrac{1}{Z} e^w} = \frac{e^w}{1 + e^w} = p$

$w = \log\frac{p}{1-p}$
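A quick numeric check of this claim, under the stated unit-clause model:

```python
import math

# With a single unit clause Heads(f) of weight w,
# P(Heads(f)) = e^w / (1 + e^w); so choosing the log-odds weight
# w = log(p / (1 - p)) recovers any desired bias p.
def heads_prob(w):
    return math.exp(w) / (1.0 + math.exp(w))

for p in (0.1, 0.5, 0.9):
    w = math.log(p / (1.0 - p))
    assert abs(heads_prob(w) - p) < 1e-12
```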
Multinomial Distribution
Example: Throwing die
Types: throw = { 1, … , 20 }
face = { 1, … , 6 }
Predicate: Outcome(throw,face)
Formulas: Outcome(t,f) ^ f != f’ => !Outcome(t,f’).
Exist f Outcome(t,f).
Too cumbersome!
Multinomial Distrib.: ! Notation
Example: Throwing die
Types: throw = { 1, … , 20 }
face = { 1, … , 6 }
Predicate: Outcome(throw,face!)
Formulas:
Semantics: Arguments without “!” determine arguments with “!”.
Also makes inference more efficient (triggers blocking).
Multinomial Distrib.: + Notation
Example: Throwing biased die
Types: throw = { 1, … , 20 }
face = { 1, … , 6 }
Predicate: Outcome(throw,face!)
Formulas: Outcome(t,+f)
Semantics: Learn weight for each grounding of args with “+”.
Logistic Regression

Logistic regression: $\log\frac{P(C=1 \mid F=f)}{P(C=0 \mid F=f)} = a + \sum_i b_i f_i$

Type: obj = { 1, ... , n }
Query predicate: C(obj)
Evidence predicates: Fi(obj)
Formulas:
  a   C(x)
  bi  Fi(x) ^ C(x)

Resulting distribution:

$P(C=c, F=f) = \frac{1}{Z} \exp\left(ac + \sum_i b_i f_i c\right)$

Therefore:

$\log\frac{P(C=1 \mid F=f)}{P(C=0 \mid F=f)} = \log\frac{\exp\left(a + \sum_i b_i f_i\right)}{\exp(0)} = a + \sum_i b_i f_i$

Alternative form: Fi(x) => C(x)
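The equivalence can also be verified numerically. A sketch with arbitrary illustrative weights a and b_i (the values below are assumptions, chosen only for the check):

```python
import math
from itertools import product

# With MLN formulas a*C(x) and b_i*(F_i(x) ^ C(x)), the log odds of C
# given F should be exactly a + sum_i b_i f_i. Weights are arbitrary
# illustration values.
a, b = -0.3, [0.7, -1.2]

def unnorm(c, f):
    """exp(a*c + sum_i b_i * f_i * c): unnormalized probability."""
    return math.exp(a * c + sum(bi * fi * c for bi, fi in zip(b, f)))

checks = []
for f in product([0, 1], repeat=len(b)):
    log_odds = math.log(unnorm(1, f) / unnorm(0, f))
    linear = a + sum(bi * fi for bi, fi in zip(b, f))
    checks.append(abs(log_odds - linear))

assert max(checks) < 1e-12  # holds for every evidence vector f
```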
Text Classification
page = { 1, … , n }
word = { … }
topic = { … }

Topic(page,topic!)
HasWord(page,word)

!Topic(p,t)
HasWord(p,+w) => Topic(p,+t)
Text Classification
Topic(page,topic!)
HasWord(page,word)

HasWord(p,+w) => Topic(p,+t)
Hypertext Classification
Topic(page,topic!)
HasWord(page,word)
Links(page,page)

HasWord(p,+w) => Topic(p,+t)
Topic(p,t) ^ Links(p,p') => Topic(p',t)

Cf. S. Chakrabarti, B. Dom & P. Indyk, “Hypertext Classification Using Hyperlinks,” in Proc. SIGMOD-1998.
Information Retrieval
InQuery(word)
HasWord(page,word)
Relevant(page)

InQuery(+w) ^ HasWord(p,+w) => Relevant(p)
Relevant(p) ^ Links(p,p’) => Relevant(p’)

Cf. L. Page, S. Brin, R. Motwani & T. Winograd, “The PageRank Citation Ranking: Bringing Order to the Web,” Tech. Rept., Stanford University, 1998.
Entity Resolution

Problem: Given database, find duplicate records

HasToken(token,field,record)
SameField(field,record,record)
SameRecord(record,record)

HasToken(+t,+f,r) ^ HasToken(+t,+f,r’) => SameField(+f,r,r’)
SameField(f,r,r’) => SameRecord(r,r’)
SameRecord(r,r’) ^ SameRecord(r’,r”) => SameRecord(r,r”)

Cf. A. McCallum & B. Wellner, “Conditional Models of Identity Uncertainty with Application to Noun Coreference,” in Adv. NIPS 17, 2005.
Entity Resolution

Can also resolve fields:

HasToken(token,field,record)
SameField(field,record,record)
SameRecord(record,record)

HasToken(+t,+f,r) ^ HasToken(+t,+f,r’) => SameField(f,r,r’)
SameField(f,r,r’) <=> SameRecord(r,r’)
SameRecord(r,r’) ^ SameRecord(r’,r”) => SameRecord(r,r”)
SameField(f,r,r’) ^ SameField(f,r’,r”) => SameField(f,r,r”)

More: P. Singla & P. Domingos, “Entity Resolution with Markov Logic”, in Proc. ICDM-2006.
Bayesian Networks
Use all binary predicates with same first argument (the object x)
One predicate for each variable A: A(x,v!)
One conjunction for each line in the CPT:
  A literal of state of child and each parent
  Weight = log P(Child | Parents)
Context-specific independence: one conjunction for each path in the decision tree
Logistic regression: as before
Practical Tips
Add all unit clauses (the default)
Implications vs. conjunctions
Open/closed world assumptions
Controlling complexity:
  Low clause arities
  Low numbers of constants
  Short inference chains
Use the simplest MLN that works
Cycle: Add/delete formulas, learn and test
Summary
Most domains are non-i.i.d.
Markov logic combines first-order logic and probabilistic graphical models
  Syntax: First-order logic + Weights
  Semantics: Templates for Markov networks
Inference: LazySAT + MC-SAT
Learning: LazySAT + MC-SAT + ILP + PL
Software: Alchemy
  http://alchemy.cs.washington.edu