Unifying Logical and Statistical AI
Pedro Domingos
Dept. of Computer Science & Eng.
University of Washington
Joint work with Jesse Davis, Stanley Kok, Daniel Lowd, Aniruddh Nath, Hoifung Poon, Matt Richardson,
Parag Singla, Marc Sumner, and Jue Wang
Overview
- Motivation
- Background
- Markov logic
- Inference
- Learning
- Software
- Applications
- Discussion
AI: The First 100 Years

[Chart, built up over three slides: IQ over time from 1956 to 2056, comparing the trajectories of human intelligence and artificial intelligence]
The Great AI Schism

| Field | Logical approach | Statistical approach |
|---|---|---|
| Knowledge representation | First-order logic | Graphical models |
| Automated reasoning | Satisfiability testing | Markov chain Monte Carlo |
| Machine learning | Inductive logic programming | Neural networks |
| Planning | Classical planning | Markov decision processes |
| Natural language processing | Definite clause grammars | Probabilistic context-free grammars |
We Need to Unify the Two
- The real world is complex and uncertain
- Logic handles complexity
- Probability handles uncertainty
Progress to Date
- Probabilistic logic [Nilsson, 1986]
- Statistics and beliefs [Halpern, 1990]
- Knowledge-based model construction [Wellman et al., 1992]
- Stochastic logic programs [Muggleton, 1996]
- Probabilistic relational models [Friedman et al., 1999]
- Relational Markov networks [Taskar et al., 2002]
- Etc.
- This talk: Markov logic [Richardson & Domingos, 2004]
Markov Logic
- Syntax: weighted first-order formulas
- Semantics: templates for Markov networks
- Inference: lifted belief propagation, etc.
- Learning: voted perceptron, pseudo-likelihood, inductive logic programming
- Software: Alchemy
- Applications: information extraction, NLP, social networks, computational biology, etc.
Markov Networks
- Undirected graphical models

[Graph: nodes Smoking, Cancer, Cough, Asthma]

- Potential functions defined over cliques:

| Smoking | Cancer | Φ(S,C) |
|---|---|---|
| False | False | 4.5 |
| False | True | 4.5 |
| True | False | 2.7 |
| True | True | 4.5 |

$P(x) = \frac{1}{Z} \prod_c \Phi_c(x_c), \qquad Z = \sum_x \prod_c \Phi_c(x_c)$
Markov Networks
- Undirected graphical models
- Log-linear model:

$P(x) = \frac{1}{Z} \exp\left(\sum_i w_i f_i(x)\right)$

where $w_i$ is the weight of feature $i$. For the graph above, e.g.:

$f_1(\mathit{Smoking}, \mathit{Cancer}) = 1$ if $\neg\mathit{Smoking} \lor \mathit{Cancer}$, 0 otherwise; $w_1 = 1.5$
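The log-linear form can be evaluated directly on a toy network. A minimal sketch in Python, assuming the single feature and weight from the slide (Smoking ⇒ Cancer, w₁ = 1.5); real Markov networks have many features and intractably many worlds:

```python
import itertools
import math

# Toy log-linear Markov network over two binary variables (a sketch, not
# Alchemy's implementation). The single feature fires when Smoking => Cancer.
def f1(smoking, cancer):
    return 1.0 if (not smoking) or cancer else 0.0

w1 = 1.5  # weight of feature 1, assumed from the slide

def unnormalized(smoking, cancer):
    return math.exp(w1 * f1(smoking, cancer))

# Partition function Z sums over all four possible worlds.
Z = sum(unnormalized(s, c)
        for s, c in itertools.product([False, True], repeat=2))

def P(smoking, cancer):
    return unnormalized(smoking, cancer) / Z
```

The only world that violates the feature, (Smoking, ¬Cancer), gets probability 1/Z; the other three each get e^1.5/Z, so smoking without cancer is the least probable world rather than an impossible one.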
First-Order Logic
- Constants, variables, functions, predicates. E.g.: Anna, x, MotherOf(x), Friends(x,y)
- Grounding: replace all variables by constants. E.g.: Friends(Anna, Bob)
- World (model, interpretation): assignment of truth values to all ground predicates
Markov Logic
- A logical KB is a set of hard constraints on the set of possible worlds
- Let's make them soft constraints: when a world violates a formula, it becomes less probable, not impossible
- Give each formula a weight (higher weight ⇒ stronger constraint):

$P(\text{world}) \propto \exp\left(\sum \text{weights of formulas it satisfies}\right)$
Definition
- A Markov Logic Network (MLN) is a set of pairs (F, w), where:
  - F is a formula in first-order logic
  - w is a real number
- Together with a set of constants, it defines a Markov network with:
  - One node for each grounding of each predicate in the MLN
  - One feature for each grounding of each formula F in the MLN, with the corresponding weight w
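The node and feature counts in this definition can be enumerated mechanically. A sketch, using the predicates, formulas, weights, and constants of the Friends & Smokers example that follows:

```python
import itertools

constants = ["Anna", "Bob"]

# One node per grounding of each predicate.
smokes = [f"Smokes({x})" for x in constants]
cancer = [f"Cancer({x})" for x in constants]
friends = [f"Friends({x},{y})"
           for x, y in itertools.product(constants, repeat=2)]
nodes = smokes + cancer + friends

# One weighted feature per grounding of each formula.
f1 = [(f"Smokes({x}) => Cancer({x})", 1.5) for x in constants]
f2 = [(f"Friends({x},{y}) => (Smokes({x}) <=> Smokes({y}))", 1.1)
      for x, y in itertools.product(constants, repeat=2)]
features = f1 + f2
# 2 + 2 + 4 = 8 nodes and 2 + 4 = 6 weighted features.
```

Adding a third constant would grow Friends to 9 groundings, which is why typed variables and other restrictions (mentioned below) matter for keeping ground networks manageable.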
Example: Friends & Smokers
- Smoking causes cancer.
- Friends have similar smoking habits.
The same statements as weighted first-order formulas:

$1.5 \quad \forall x\; \mathit{Smokes}(x) \Rightarrow \mathit{Cancer}(x)$

$1.1 \quad \forall x, y\; \mathit{Friends}(x,y) \Rightarrow (\mathit{Smokes}(x) \Leftrightarrow \mathit{Smokes}(y))$

Two constants: Anna (A) and Bob (B)

[Graph, built up over several slides: the ground network over the atoms Smokes(A), Smokes(B), Cancer(A), Cancer(B), Friends(A,A), Friends(A,B), Friends(B,A), Friends(B,B), with edges linking atoms that appear together in a ground formula]
Markov Logic Networks
- An MLN is a template for ground Markov networks
- Probability of a world x:

$P(x) = \frac{1}{Z} \exp\left(\sum_i w_i n_i(x)\right)$

where $w_i$ is the weight of formula $i$ and $n_i(x)$ is the number of true groundings of formula $i$ in $x$.

- Typed variables and constants greatly reduce the size of the ground Markov net
- Functions, existential quantifiers, etc.
- Infinite and continuous domains
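For the two-constant Friends & Smokers network, P(x) can be computed exactly by brute force. A sketch, assuming the weights 1.5 and 1.1 from the example (real domains are far too large for this enumeration, which is what the inference algorithms below address):

```python
import itertools
import math

consts = ["A", "B"]
W1, W2 = 1.5, 1.1  # weights of the two formulas

def n1(smokes, cancer):
    # Number of true groundings of Smokes(x) => Cancer(x).
    return sum(1 for x in consts if (not smokes[x]) or cancer[x])

def n2(smokes, friends):
    # Number of true groundings of Friends(x,y) => (Smokes(x) <=> Smokes(y)).
    return sum(1 for x, y in itertools.product(consts, repeat=2)
               if (not friends[(x, y)]) or (smokes[x] == smokes[y]))

def score(smokes, cancer, friends):
    # exp(sum_i w_i n_i(x)) for world x.
    return math.exp(W1 * n1(smokes, cancer) + W2 * n2(smokes, friends))

def worlds():
    # All 2^8 truth assignments to the 8 ground atoms.
    for bits in itertools.product([False, True], repeat=8):
        smokes = dict(zip(consts, bits[0:2]))
        cancer = dict(zip(consts, bits[2:4]))
        friends = dict(zip(itertools.product(consts, repeat=2), bits[4:8]))
        yield smokes, cancer, friends

Z = sum(score(*w) for w in worlds())

def P(smokes, cancer, friends):
    return score(smokes, cancer, friends) / Z
```

Worlds that satisfy more ground formulas score exponentially higher; none has probability zero, which is the "soft constraint" semantics in action.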
Relation to Statistical Models
- Special cases (obtained by making all predicates zero-arity): Markov networks, Markov random fields, Bayesian networks, log-linear models, exponential models, max. entropy models, Gibbs distributions, Boltzmann machines, logistic regression, hidden Markov models, conditional random fields
- Markov logic allows objects to be interdependent (non-i.i.d.)
Relation to First-Order Logic
- Infinite weights ⇒ first-order logic
- Satisfiable KB with positive weights: satisfying assignments = modes of the distribution
- Markov logic allows contradictions between formulas
Inference
- MAP/MPE state: MaxWalkSAT, LazySAT
- Marginal and conditional probabilities: MCMC (Gibbs, MC-SAT, etc.), knowledge-based model construction, lifted belief propagation
Lifted Inference
- We can do inference in first-order logic without grounding the KB (e.g., resolution)
- Let's do the same for inference in MLNs
- Group atoms and clauses into "indistinguishable" sets, and do inference over those
- First approach: lifted variable elimination (not practical)
- Here: lifted belief propagation
Belief Propagation

Messages pass between nodes (x) and features (f):

$\mu_{x \to f}(x) = \prod_{h \in n(x) \setminus \{f\}} \mu_{h \to x}(x)$

$\mu_{f \to x}(x) = \sum_{\sim \{x\}} e^{w f(\mathbf{x})} \prod_{y \in n(f) \setminus \{x\}} \mu_{y \to f}(y)$

Lifted Belief Propagation

The same message equations apply, but are computed once per group of interchangeable nodes and features rather than once per ground node; the products and sums become functions of edge counts.
Lifted Belief Propagation
- Form a lifted network composed of supernodes and superfeatures
  - Supernode: set of ground atoms that all send and receive the same messages throughout BP
  - Superfeature: set of ground clauses that all send and receive the same messages throughout BP
- Run belief propagation on the lifted network
- Guaranteed to produce the same results as ground BP
- Time and memory savings can be huge
Forming the Lifted Network
1. Form initial supernodes: one per predicate and truth value (true, false, unknown)
2. Form superfeatures by doing joins of their supernodes
3. Form supernodes by projecting superfeatures down to their predicates. Supernode = groundings of a predicate with the same number of projections from each superfeature
4. Repeat until convergence
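Supernode formation can be sketched as iterative refinement of a partition of the ground atoms by their message signatures. This is a simplification of the join/project algorithm above, not Alchemy's implementation, and the atoms, clauses, and (empty) evidence are a hypothetical fragment of the Friends & Smokers network:

```python
# Ground atoms and the ground clauses (features) in which they appear.
atoms = ["Smokes(A)", "Smokes(B)", "Cancer(A)", "Cancer(B)"]
clauses = {
    "c1": ["Smokes(A)", "Cancer(A)"],  # grounding of Smokes(x) => Cancer(x)
    "c2": ["Smokes(B)", "Cancer(B)"],
}
evidence = {}  # no evidence, so A and B are interchangeable

def initial_color(atom):
    # Step 1: one initial group per predicate and truth value.
    predicate = atom.split("(")[0]
    return (predicate, evidence.get(atom, "unknown"))

color = {a: initial_color(a) for a in atoms}
for _ in range(len(atoms)):  # refinement stabilizes within |atoms| rounds
    # Signature: own color plus the sorted colors of co-occurring atoms.
    color = {a: (color[a],
                 tuple(sorted(str(color[b])
                              for members in clauses.values() if a in members
                              for b in members if b != a)))
             for a in atoms}

supernodes = {}
for a in atoms:
    supernodes.setdefault(str(color[a]), []).append(a)
```

With no evidence, Smokes(A) and Smokes(B) receive identical messages throughout BP and land in one supernode, likewise the two Cancer atoms; evidence about Anna alone would split both groups.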
Theorem
- There exists a unique minimal lifted network
- The lifted network construction algorithm finds it
- BP on the lifted network gives the same result as on the ground network
Representing Supernodes and Superfeatures
- List of tuples: simple but inefficient
- Resolution-like: use equality and inequality
- Form clusters (in progress)
Open Questions
- Can we do approximate KBMC/lazy/lifting?
- Can KBMC, lazy and lifted inference be combined?
- Can we have lifted inference over both probabilistic and deterministic dependencies? (Lifted MC-SAT?)
- Can we unify resolution and lifted BP?
- Can other inference algorithms be lifted?
Learning
- Data is a relational database
- Closed-world assumption (if not: EM)
- Learning parameters (weights): generatively or discriminatively
- Learning structure (formulas)
Generative Weight Learning
- Maximize likelihood
- Use gradient ascent or L-BFGS
- No local maxima:

$\frac{\partial}{\partial w_i} \log P_w(x) = n_i(x) - E_w[n_i(x)]$

where $n_i(x)$ is the number of true groundings of clause $i$ in the data and $E_w[n_i(x)]$ is the expected number of true groundings according to the model.

- Requires inference at each step (slow!)
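The gradient above can be followed exactly on a domain small enough to enumerate. A sketch with a single formula, Smokes(x) ⇒ Cancer(x), over constants A and B; the observed world is hypothetical, and exhaustive enumeration stands in for the inference step that makes each gradient evaluation slow in practice:

```python
import itertools
import math

def n(world):
    # Number of true groundings of Smokes(x) => Cancer(x) for x in {A, B}.
    # world = (Smokes(A), Smokes(B), Cancer(A), Cancer(B))
    sm, ca = world[:2], world[2:]
    return sum(1 for i in range(2) if (not sm[i]) or ca[i])

worlds = list(itertools.product([False, True], repeat=4))
data_world = (True, True, True, False)  # hypothetical database: n = 1

w, lr = 0.0, 0.5
for _ in range(500):
    Z = sum(math.exp(w * n(x)) for x in worlds)
    expected_n = sum(n(x) * math.exp(w * n(x)) / Z for x in worlds)
    w += lr * (n(data_world) - expected_n)  # d/dw log P_w(x) = n(x) - E_w[n(x)]
```

Gradient ascent drives the model's expected count toward the observed count; for this data it converges to w = -ln 3, the weight at which the model expects exactly one true grounding.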
Pseudo-Likelihood

$PL(x) = \prod_i P(x_i \mid \mathit{neighbors}(x_i))$

- Likelihood of each variable given its neighbors in the data [Besag, 1975]
- Does not require inference at each step
- Consistent estimator
- Widely used in vision, spatial statistics, etc.
- But PL parameters may not work well for long inference chains
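Each factor of PL(x) needs only a local renormalization over one variable's two values, which is why no global inference is required. A sketch for a single grounding of Smokes(x) ⇒ Cancer(x) with an assumed weight of 1.5 and a hypothetical observed world:

```python
import math

w = 1.5
world = {"Smokes(A)": True, "Cancer(A)": False}  # hypothetical database

def n(x):
    # Number of true groundings of Smokes(A) => Cancer(A) (0 or 1 here).
    return 1.0 if (not x["Smokes(A)"]) or x["Cancer(A)"] else 0.0

def conditional(var, x):
    # P(x_var | its neighbors): flip only var and renormalize over both values.
    flipped = dict(x)
    flipped[var] = not x[var]
    num = math.exp(w * n(x))
    return num / (num + math.exp(w * n(flipped)))

PL = 1.0
for var in world:
    PL *= conditional(var, world)
```

Both conditionals equal 1/(1 + e^1.5), since flipping either atom satisfies the formula; a full MLN multiplies one such factor per ground atom, each touching only that atom's Markov blanket.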
Discriminative Weight Learning
- Maximize conditional likelihood of query (y) given evidence (x):

$\frac{\partial}{\partial w_i} \log P_w(y \mid x) = n_i(x, y) - E_w[n_i(x, y)]$

where $n_i(x, y)$ is the number of true groundings of clause $i$ in the data and $E_w[n_i(x, y)]$ is the expected number according to the model.

- Approximate the expected counts by the counts in the MAP state of y given x
Voted Perceptron
- Originally proposed for discriminatively training HMMs [Collins, 2002]
- Assumes the network is a linear chain

```
wi ← 0
for t ← 1 to T do
    yMAP ← Viterbi(x)
    wi ← wi + η [counti(yData) – counti(yMAP)]
return ∑t wi / T
```

Voted Perceptron for MLNs
- HMMs are a special case of MLNs
- Replace Viterbi by MaxWalkSAT
- The network can now be an arbitrary graph

```
wi ← 0
for t ← 1 to T do
    yMAP ← MaxWalkSAT(x)
    wi ← wi + η [counti(yData) – counti(yMAP)]
return ∑t wi / T
```
Structure Learning
- Generalizes feature induction in Markov networks
- Any inductive logic programming approach can be used, but:
  - Goal is to induce any clauses, not just Horn
  - Evaluation function should be likelihood
- Requires learning weights for each candidate
  - Turns out not to be the bottleneck
  - Bottleneck is counting clause groundings
  - Solution: subsampling
Structure Learning
- Initial state: unit clauses or hand-coded KB
- Operators: add/remove literal, flip sign
- Evaluation function: pseudo-likelihood + structure prior
- Search: beam [Kok & Domingos, 2005], shortest-first [Kok & Domingos, 2005], bottom-up [Mihalkova & Mooney, 2007]
Alchemy

Open-source software including:
- Full first-order logic syntax
- MAP and marginal/conditional inference
- Generative & discriminative weight learning
- Structure learning
- Programming language features

alchemy.cs.washington.edu
| | Alchemy | Prolog | BUGS |
|---|---|---|---|
| Representation | F.O. logic + Markov nets | Horn clauses | Bayes nets |
| Inference | Lifted BP, etc. | Theorem proving | Gibbs sampling |
| Learning | Parameters & structure | No | Parameters |
| Uncertainty | Yes | No | Yes |
| Relational | Yes | Yes | No |
Applications
- Information extraction
- Entity resolution
- Link prediction
- Collective classification
- Web mining
- Natural language processing
- Computational biology
- Social network analysis
- Robot mapping
- Activity recognition
- Probabilistic Cyc
- CALO
- Etc.
Information Extraction

The running example: four noisy references to two papers, with the misspellings and formatting differences of real bibliographies left intact:

Parag Singla and Pedro Domingos, "Memory-Efficient Inference in Relational Domains" (AAAI-06).

Singla, P., & Domingos, P. (2006). Memory-efficent inference in relatonal domains. In Proceedings of the Twenty-First National Conference on Artificial Intelligence (pp. 500-505). Boston, MA: AAAI Press.

H. Poon & P. Domingos, "Sound and Efficient Inference with Probabilistic and Deterministic Dependencies", in Proc. AAAI-06, Boston, MA, 2006.

P. Hoifung (2006). Efficent inference. In Proceedings of the Twenty-First National Conference on Artificial Intelligence.

Segmentation: label each position in each citation with its field (Author, Title, or Venue).

Entity resolution: decide which citations (and which fields) refer to the same paper.
State of the Art
- Segmentation: HMM (or CRF) to assign each token to a field
- Entity resolution: logistic regression to predict same field/citation; transitive closure
- Alchemy implementation: seven formulas
Types and Predicates

```
token    = {Parag, Singla, and, Pedro, ...}
field    = {Author, Title, Venue, ...}   // fields beyond these three are optional
citation = {C1, C2, ...}
position = {0, 1, 2, ...}

Token(token, position, citation)         // evidence
InField(position, field, citation)       // query
SameField(field, citation, citation)     // query
SameCit(citation, citation)              // query
```
Formulas

```
Token(+t,i,c) => InField(i,+f,c)
InField(i,+f,c) ^ !Token(".",i,c) <=> InField(i+1,+f,c)
f != f' => (!InField(i,+f,c) v !InField(i,+f',c))

Token(+t,i,c) ^ InField(i,+f,c) ^ Token(+t,i',c') ^ InField(i',+f,c')
    => SameField(+f,c,c')
SameField(+f,c,c') <=> SameCit(c,c')
SameField(f,c,c') ^ SameField(f,c',c'') => SameField(f,c,c'')
SameCit(c,c') ^ SameCit(c',c'') => SameCit(c,c'')
```

The first three formulas segment each citation: a token's identity predicts its field, fields persist across adjacent positions, and each position has at most one field. The last four resolve entities: shared tokens in the same field suggest a field match, field matches and citation matches reinforce each other, and both relations are transitive. The !Token(".",i,c) condition, added on the final slide, stops field persistence at periods.
Results: Segmentation on Cora

[Figure: precision vs. recall curves for four models: Tokens; Tokens + Sequence; Tok. + Seq. + Period; Tok. + Seq. + P. + Comma]
Results: Matching Venues on Cora

[Figure: precision vs. recall curves for four models: Similarity; Sim. + Relations; Sim. + Transitivity; Sim. + Rel. + Trans.]
The Interface Layer

[Diagram: Applications sit above the Interface Layer, which sits above the Infrastructure]
Networking

[Diagram: interface layer = the Internet; infrastructure = routers, protocols; applications = WWW, email]

Databases

[Diagram: interface layer = the relational model; infrastructure = query optimization, transaction management; applications = ERP, OLTP, CRM]

Programming Systems

[Diagram: interface layer = high-level languages; infrastructure = compilers, code optimizers; applications = programming]

Artificial Intelligence

[Diagram: infrastructure = representation, learning, inference; applications = NLP, planning, multi-agent systems, vision, robotics. Candidate interface layers, considered in turn: first-order logic? graphical models? Markov logic, realized concretely as Alchemy: alchemy.cs.washington.edu]