Concurrent Reasoning with Inference Graphs
Daniel R. Schlegel and Stuart C. Shapiro
Department of Computer Science and Engineering
Problem Summary
• Rise of multi-core computers, BUT:
• Lack of concurrent natural deduction systems.
This work has been supported by a Multidisciplinary University Research Initiative (MURI) grant (Number W911NF-09-1-0392) for Unified Research on Network-based Hard/Soft Information Fusion, issued by the US Army Research Office (ARO) under the program management of Dr. John Lavery.
Inference Capabilities
• Forward, backward, bi-directional, and focused inference.
• Retains all derived formulas for later re-use.
• Propagates disbelief.
The only concurrent inference system with these capabilities.
Propositional Graphs
• Directed acyclic graph.
• Every well-formed expression is a node:
  • Individual constants
  • Functional terms
  • Atomic formulas
  • Non-atomic formulas (“rules”)
• Each node has an identifier, either:
  • a symbol, or
  • wfti[!]
• No two nodes with the same identifier.
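The node store described above can be sketched in Python. This is a minimal illustration under assumptions: the class and attribute names (`Node`, `PropositionalGraph`, `add`) are hypothetical, not the authors' API; only the stated invariants are modeled (a DAG of expression nodes, each with a unique identifier such as a symbol or "wfti" with an optional "!" marking assertion).

```python
from dataclasses import dataclass

# Hypothetical sketch of a propositional-graph node store.
# Every well-formed expression gets exactly one node.

@dataclass
class Node:
    identifier: str          # e.g. "a" or "wft1!" ("!" = asserted)
    children: tuple = ()     # outgoing DAG edges to argument nodes

class PropositionalGraph:
    def __init__(self):
        self._nodes = {}     # identifier -> Node; enforces uniqueness

    def add(self, identifier, children=()):
        # No two nodes may share an identifier, so re-adding an
        # identifier returns the existing node, never a duplicate.
        if identifier not in self._nodes:
            self._nodes[identifier] = Node(identifier, tuple(children))
        return self._nodes[identifier]

g = PropositionalGraph()
a = g.add("a")
rule = g.add("wft1!", children=(a,))
assert g.add("a") is a       # same identifier -> same node
```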
Inference Graphs
• Extend propositional graphs.
• Add channels for information flow:
  • i-channels report truth of an antecedent to a rule node.
  • u-channels report truth of a consequent from a rule node.
• Channels contain valves.
  • Valves hold messages back, or allow them through.
• Channels relay messages:
  • I-INFER (“I’ve been inferred”)
  • U-INFER (“You’ve been inferred”)
  • BACKWARD-INFER (“Open valves so messages that might infer me can arrive”)
  • CANCEL-INFER (“Stop inferring me (close valves)”)
  • UNASSERT (“I’m no longer believed”)
• Different message types have different relative priorities (important for scheduling).
Channels represented by dashed lines are i-channels and are drawn from antecedents to rule nodes. Channels represented by dotted lines are u-channels and are drawn from rule nodes to consequents.
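The channel machinery above can be sketched in Python. This is an illustration only: the `MsgType` and `Valve` names are hypothetical, and the numeric priority ordering is an assumption — the poster states only that message types have different *relative* priorities, not what the ordering is.

```python
from enum import IntEnum

# The five message types; the ordering below is illustrative, not
# the authors' actual priority assignment.
class MsgType(IntEnum):          # higher value = higher priority (assumed)
    BACKWARD_INFER = 1
    I_INFER = 2
    U_INFER = 3
    CANCEL_INFER = 4
    UNASSERT = 5

class Valve:
    """A valve in a channel: holds messages back, or lets them through."""
    def __init__(self):
        self.open = False
        self.waiting = []        # messages held while the valve is closed

    def send(self, msg):
        if self.open:
            return [msg]         # open valve: message passes straight through
        self.waiting.append(msg) # closed valve: message is held back
        return []

    def open_valve(self):
        # BACKWARD-INFER opens valves so held messages can flow.
        self.open = True
        released, self.waiting = self.waiting, []
        return released

v = Valve()
assert v.send((MsgType.I_INFER, "a is true")) == []            # held
assert v.open_valve() == [(MsgType.I_INFER, "a is true")]      # released
```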
Rule Node Inference
Concurrency and Scheduling
The area between two valves is called an inference segment.
When a message passes through a valve:
• A task is created with the same priority as the message, and is the application of the inference segment’s function to the message.
• The task is added to a queue which puts higher-priority tasks towards its head.
A task only operates within a single inference segment.
1. Tasks for relaying newly derived information using segments to the right are executed before those to the left, and
2. once a node is known to be true or false, all tasks attempting to derive it (left of it in the graph) are canceled, as long as their results are not needed elsewhere.
There is minimal shared state between tasks, allowing many tasks to operate concurrently.
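The scheduling scheme above can be sketched with a standard priority queue. This is a minimal illustration, not the authors' implementation: `TaskQueue` and the task strings are hypothetical, and the priority values are arbitrary stand-ins for the message priorities. Python's `heapq` is a min-heap, so priorities are negated to put higher-priority tasks at the head; a counter breaks ties by arrival order.

```python
import heapq
import itertools

# Sketch of the scheduler's queue: when a message passes a valve, a
# task with the message's priority is queued; higher-priority tasks
# (e.g. relaying newly derived information rightward) run first.
class TaskQueue:
    def __init__(self):
        self._heap = []
        self._counter = itertools.count()   # tie-break: arrival order

    def add(self, priority, task):
        heapq.heappush(self._heap, (-priority, next(self._counter), task))

    def pop(self):
        return heapq.heappop(self._heap)[2]

q = TaskQueue()
q.add(1, "relay backward-infer")
q.add(3, "relay u-infer")     # newly derived information: runs first
q.add(2, "relay i-infer")
assert q.pop() == "relay u-infer"
assert q.pop() == "relay i-infer"
```

Because each task touches only its own inference segment, many such tasks can be drawn from the queue and run on separate cores with minimal shared state.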
Evaluation
References
• Daniel R. Schlegel and Stuart C. Shapiro, Concurrent Reasoning with Inference Graphs. In Proceedings of the Third International IJCAI Workshop on Graph Structures for Knowledge Representation and Reasoning (GKR 2013), 2013, in press.
Example: Propositional graph for the assertions that if a, b, and c are true, then d is true, and if d or e is true, then f is true.
1. Message arrives at node.
2. Message translated to a RUI (rule use information), containing positive and negative instances of antecedents contained in the message.
3. New RUI combined with existing ones.
4. Output is a set of new RUIs which are used to decide if the rule can fire.
5. When a rule fires, new messages are sent out.
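The RUI combination step can be sketched in Python. This is a simplified illustration under assumptions: the dict-of-sets representation and the function names are hypothetical, and only two ideas from the steps above are modeled — RUIs combine when their positive and negative antecedent instances do not contradict, and an and-entailment rule fires once every antecedent is known true.

```python
# Hypothetical sketch: a RUI records which antecedents are known
# true ("pos") and known false ("neg").
def combine(rui_a, rui_b):
    # Two RUIs combine only if they do not contradict each other.
    pos = rui_a["pos"] | rui_b["pos"]
    neg = rui_a["neg"] | rui_b["neg"]
    if pos & neg:
        return None                      # contradictory; cannot combine
    return {"pos": pos, "neg": neg}

def and_entailment_fires(rui, antecedents):
    # An and-entailment rule fires once *all* antecedents are true;
    # an or-entailment would instead need only one positive instance.
    return rui["pos"] == set(antecedents)

r1 = {"pos": {"a"}, "neg": set()}
r2 = {"pos": {"b", "c"}, "neg": set()}
combined = combine(r1, r2)
assert combined == {"pos": {"a", "b", "c"}, "neg": set()}
assert and_entailment_fires(combined, ["a", "b", "c"])
```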
Example: We assume backward inference has been initiated, opening all the valves in the graph. First, in (a), messages about the truth of a, b, and c flow through i-channels to wft1. Since wft1 is an and-entailment, each of its antecedents must be true for it to fire. Since they are, in (b) the message that d is true flows through wft1’s u-channel. d becomes asserted and reports its new status through its i-channel (c). In (d), wft2 receives this information, and since it is an or-entailment rule and requires only a single antecedent to be true for it to fire, it reports to its consequents that they are now true, and cancels inference in e. Finally, in (e), f is asserted, and inference is complete.
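The end result of this walkthrough can be reproduced with a toy forward-propagation sketch. This is an illustration only: the rule table and `forward_infer` are hypothetical, and firing is reduced to a threshold (`needed`) on known-true antecedents — 3 of 3 for the and-entailment wft1, 1 of 2 for the or-entailment wft2 — ignoring channels, valves, and message scheduling.

```python
# Toy rule table for the example graph (representation hypothetical):
# wft1: {a, b, c} => d (and-entailment: all antecedents needed)
# wft2: {d, e}    => f (or-entailment: one antecedent suffices)
rules = {
    "wft1": {"antecedents": {"a", "b", "c"}, "needed": 3, "consequents": {"d"}},
    "wft2": {"antecedents": {"d", "e"}, "needed": 1, "consequents": {"f"}},
}

def forward_infer(asserted):
    asserted = set(asserted)
    changed = True
    while changed:                       # propagate until nothing new fires
        changed = False
        for rule in rules.values():
            known = rule["antecedents"] & asserted
            if len(known) >= rule["needed"] and not rule["consequents"] <= asserted:
                asserted |= rule["consequents"]   # rule fires
                changed = True
    return asserted

# a, b, c true -> wft1 fires (d) -> wft2 fires (f); e is never derived.
result = forward_infer({"a", "b", "c"})
assert result == {"a", "b", "c", "d", "f"}
```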
Concurrency:
• Near-linear performance improvement with the number of processors.
• Performance resilient to graph depth and branching factor changes.
Scheduling Heuristics:
• Backward inference with or-entailment shows 10x improvement over LIFO queues, and 20-40x over FIFO queues.
See GKR paper (below) for more details.