
Page 1: Synthesis and Inductive Learning – Part 2

Synthesis and Inductive Learning – Part 2

Sanjit A. Seshia

EECS Department, UC Berkeley

NSF ExCAPE Summer School, June 23-25, 2015

Acknowledgments to several Ph.D. students, postdoctoral researchers, and collaborators, and to the students of EECS 219C, Spring 2015, UC Berkeley

Page 2:

Questions of Interest for this Tutorial

How can inductive synthesis be used to solve other (non-synthesis) problems?
– Reducing a Problem to Synthesis

How does inductive synthesis compare with machine learning? What are the common themes amongst various inductive synthesis efforts?
– Oracle-Guided Inductive Synthesis (OGIS) Framework

Is there a complexity/computability theory for inductive synthesis?
– Yes! A first step: theoretical analysis of counterexample-guided inductive synthesis (CEGIS)

Page 3:

Outline for this Lecture Sequence

Examples of Reduction to Synthesis
– Specification
– Verification

Differences between Inductive Synthesis and Machine Learning

Oracle-Guided Inductive Synthesis
– Examples, CEGIS

Theoretical Analysis of CEGIS
– Properties of Learner
– Properties of Verifier

Demo: Requirement Mining for Cyber-Physical Systems

Page 4:

Comparison with Machine Learning

Page 5:

Formal Inductive Synthesis

Given:
– Class of Artifacts C
– Formal Specification Φ
– Set of (labeled) examples E (or source of E)

Find, using only E, an f ∈ C that satisfies Φ

Page 6:

Counterexample-Guided Inductive Synthesis (CEGIS)

INITIALIZE → SYNTHESIZE ⇄ VERIFY
– SYNTHESIZE sends a Candidate Artifact to VERIFY
– VERIFY returns a Counterexample to SYNTHESIZE
– The loop exits when Verification Succeeds or Synthesis Fails
– Inputs to INITIALIZE: Structure Hypothesis (“Syntax-Guidance”), Initial Examples
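The loop above can be made concrete with a minimal, self-contained sketch (illustrative only, not from the tutorial): the candidate space is the set of constants k in f(x) = x + k over a small finite domain, and brute-force search stands in for the SAT/SMT engines a real synthesizer would use.

```python
# Minimal CEGIS sketch: synthesize a constant k such that f(x) = x + k
# agrees with a hidden specification on a finite domain.  Brute-force
# search stands in for the SAT/SMT engines a real tool would use.

DOMAIN = range(-10, 11)
spec = lambda x: x + 7              # hidden correct behavior (toy assumption)

def synthesize(examples):
    """Return some candidate k consistent with all examples, or None."""
    for k in range(-10, 11):
        if all(x + k == y for x, y in examples):
            return k
    return None

def verify(k):
    """Return a counterexample input, or None if the candidate is correct."""
    for x in DOMAIN:
        if x + k != spec(x):
            return x
    return None

def cegis():
    examples = []                           # INITIALIZE (no initial examples)
    while True:
        k = synthesize(examples)            # SYNTHESIZE
        if k is None:
            return None                     # synthesis fails
        cex = verify(k)                     # VERIFY
        if cex is None:
            return k                        # verification succeeds
        examples.append((cex, spec(cex)))   # learn from the counterexample
```

Each iteration either terminates or adds a counterexample that rules out the current candidate, so the loop makes progress through the finite candidate space.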

Page 7:

CEGIS = Learning from Examples & Counterexamples

INITIALIZE → LEARNING ALGORITHM ⇄ VERIFICATION ORACLE
– The learner sends a Candidate Concept to the oracle
– The oracle returns a Counterexample to the learner
– The loop exits when Learning Succeeds or Learning Fails
– Inputs to INITIALIZE: “Concept Class”, Initial Examples

Page 8:

CEGIS is an instance of Active Learning

An ACTIVE LEARNING ALGORITHM involves two key design choices:

1. Search Strategy: How to search the space of candidate concepts?

2. Example Selection: Which examples to learn from?

Page 9:

Some Instances of CEGIS you’ve seen (or will see soon)

Instance | Concept Class | Learner | Verifier
SKETCH | Programs in SKETCH | SAT/SMT solver | SAT/SMT solver
Enumerative SyGuS | Defined by Grammar | Enumerative Search | SMT solver
Stochastic SyGuS | Defined by Grammar | Stochastic Search | SMT solver
Constraint SyGuS | Defined by Grammar | SMT solver | SMT solver
Req. Mining for STL | Parametric STL | Parameter Search | Simulation-based falsifier

(see [Alur et al., FMCAD’13])

Page 10:

Comparison*

Feature | Formal Inductive Synthesis | Machine Learning
Concept/Program Classes | Programmable, Complex | Fixed, Simple
Learning Algorithms | General-Purpose Solvers | Specialized
Learning Criteria | Exact, w/ Formal Spec | Approximate, w/ Cost Function
Oracle-Guidance | Common (can control Oracle) | Rare (black-box oracles)

* Between a typical inductive synthesizer and a typical machine learning algorithm

Page 11:

Oracle-Guided Inductive Synthesis

Page 12:

Oracle-Guided Inductive Synthesis

Given:
– Domain of Examples D
– Concept Class C
– Formal Specification Φ ⊆ D
– Oracle O that can answer queries of type Q

Find, by only querying O, an f ∈ C that satisfies Φ

Page 13:

Common Oracle Query Types

Queries from the LEARNER to the ORACLE:
– Positive Witness: returns x ∈ Φ, if one exists, else ⊥
– Negative Witness: returns x ∉ Φ, if one exists, else ⊥
– Membership: Is x ∈ Φ? Yes / No
– Equivalence: Is f = Φ? Yes / No + counterexample x ∈ f ⊕ Φ
– Subsumption/Subset: Is f ⊆ Φ? Yes / No + counterexample x ∈ f \ Φ
– Distinguishing Input: given f and X ⊆ f, returns f′ s.t. f′ ≠ f and X ⊆ f′, if one exists; ⊥ otherwise
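These query types can be phrased as a small oracle interface. The sketch below is illustrative (the class and method names are my own, not from any tool): the specification Φ and concepts f are modeled as finite sets of examples, and ⊥ is a sentinel object.

```python
# Illustrative OGIS oracle interface.  Phi is the specification, modeled
# as a set of examples over a finite domain D; a concept f is also a set.
# BOT stands in for "no such element exists" (the ⊥ answer).

BOT = object()

class Oracle:
    def __init__(self, phi, domain):
        self.phi, self.domain = set(phi), set(domain)

    def positive_witness(self):          # some x in Phi, else BOT
        return next(iter(self.phi), BOT)

    def negative_witness(self):          # some x in D \ Phi, else BOT
        return next(iter(self.domain - self.phi), BOT)

    def membership(self, x):             # is x in Phi?
        return x in self.phi

    def equivalence(self, f):            # f == Phi?  else a cex in f XOR Phi
        diff = set(f) ^ self.phi
        return (True, None) if not diff else (False, next(iter(diff)))

    def subsumption(self, f):            # f subset of Phi?  else a cex in f \ Phi
        extra = set(f) - self.phi
        return (True, None) if not extra else (False, next(iter(extra)))
```

A learner interacts only through these methods, matching the "Find, by only querying O" requirement on the previous slide.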

Page 14:

Examples of OGIS

L* algorithm to learn DFAs:
– Membership + Equivalence queries

CEGIS used in SKETCH/SyGuS solvers:
– (positive) Witness + Equivalence/Subsumption queries

CEGIS used in Reactive Model Predictive Control:
– covered in Vasu Raman’s lecture

Two different examples:
– Learning Programs from Distinguishing Inputs [Jha et al., ICSE 2010]
– Learning LTL Properties for Synthesis from Counterstrategies [Li et al., MEMOCODE 2011]

Page 15:

Reverse Engineering Malware by Program Synthesis

Obfuscated code (from the Conficker worm). Input: y; Output: modified value of y

{ a=1; b=0; z=1; c=0;
  while(1) {
    if (a == 0) {
      if (b == 0) { y=z+y; a=~a; b=~b; c=~c; if (~c) break; }
      else        { z=z+y; a=~a; b=~b; c=~c; if (~c) break; }
    }
    else if (b == 0) { z=y << 2; a=~a; }
    else             { z=y << 3; a=~a; b=~b; }
  }
}

What it does: y = y * 45

We solve this using program synthesis.

Paper: S. Jha et al., “Oracle-Guided Component-Based Program Synthesis”, ICSE 2010.
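The claim y = y * 45 can be checked by simulating the snippet. One assumption is needed to make it terminate: ~ on the flag variables a, b, c is read as a one-bit complement (a toggle), so `if (~c) break` fires when c has just been toggled to 0. A Python sketch under that assumption:

```python
# Simulation of the obfuscated Conficker snippet.  Assumption: ~ on the
# flag variables a, b, c acts as a one-bit complement (x ^ 1); without
# that reading the C loop would not terminate.

def obfuscated(y):
    a, b, z, c = 1, 0, 1, 0
    while True:
        if a == 0:
            if b == 0:
                y = z + y                       # first pass: y -> 4y + y = 5y
                a, b, c = a ^ 1, b ^ 1, c ^ 1   # second pass: y -> 40y + 5y = 45y
                if c == 0:                      # "if (~c) break"
                    break
            else:
                z = z + y
                a, b, c = a ^ 1, b ^ 1, c ^ 1
                if c == 0:
                    break
        elif b == 0:
            z = y << 2                          # z = 4*y
            a ^= 1
        else:
            z = y << 3                          # z = 8*y
            a, b = a ^ 1, b ^ 1
    return y
```

Running it on a few inputs confirms the deobfuscated behavior stated on the slide.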

Page 16:

Class of Programs: “Loop-Free”

Programs implementing functions: I → O

P(I):
  O1 = f1 (V1)
  O2 = f2 (V2)
  …
  On = fn (Vn)

where f1, f2, …, fn are functions from a given component library.

The functions could be if-then-else definitions, and hence the above represents any loop-free code.
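As an illustration (a toy representation, not the synthesizer’s SMT encoding), a loop-free program can be written as straight-line assignments O_k = f_k(V_k) over a component library; the Conficker computation y*45 then becomes four lines, since 45y = 8·(4y + y) + (4y + y).

```python
# Toy encoding of a loop-free program as straight-line component
# applications.  The component library below is assumed for illustration.

COMPONENTS = {
    "add":  lambda a, b: a + b,
    "shl2": lambda a: a << 2,
    "shl3": lambda a: a << 3,
}

def run(program, inputs):
    """Evaluate lines (O_k, f_k, V_k); V_k may name inputs or earlier O_j."""
    env = dict(inputs)
    for out, comp, args in program:
        env[out] = COMPONENTS[comp](*(env[a] for a in args))
    return env

# y * 45 as loop-free code over the library:
P = [
    ("o1", "shl2", ["y"]),          # o1 = 4y
    ("o2", "add",  ["o1", "y"]),    # o2 = 5y
    ("o3", "shl3", ["o2"]),         # o3 = 40y
    ("o4", "add",  ["o3", "o2"]),   # o4 = 45y
]
```

Synthesis in this setting amounts to choosing the components and the wiring (the V_k) so that the straight-line program meets the I/O examples.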

Page 17:

Program Learning as Set Cover

Space of all possible programs; each dot represents a semantically unique program.

[Goldman & Kearns, 1995]

Page 18:

Program Learning as Set Cover

Space of all possible programs; each dot represents a semantically unique program.

Page 19:

Program Learning as Set Cover

Space of all possible programs; each dot represents a semantically unique program.

(i1, o1): an example, and the set of programs ruled out by that example

Page 20:

Program Learning as Set Cover

Space of all possible programs; each dot represents a semantically unique program.

(i1, o1) – E1
(i2, o2) – E2
…
(in, on) – En

Theorem [Goldman & Kearns, ’95]: The smallest set of I/O examples to learn the correct program IS the minimum-size subset of {E1, E2, …, En} that covers all the incorrect programs.
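Computing a minimum set cover exactly is NP-hard, but the standard greedy approximation illustrates the idea on toy data (in practice, as the next slides note, the sets Ej are never available explicitly).

```python
# Greedy approximation to the set-cover view of example selection.
# The data is a toy assumption: rule_out[j] is the set of incorrect
# programs that example j eliminates.

def greedy_cover(incorrect, rule_out):
    """Pick examples until every incorrect program is ruled out."""
    remaining, chosen = set(incorrect), []
    while remaining:
        # pick the example eliminating the most surviving incorrect programs
        best = max(rule_out, key=lambda j: len(rule_out[j] & remaining))
        if not rule_out[best] & remaining:
            raise ValueError("examples cannot distinguish remaining programs")
        chosen.append(best)
        remaining -= rule_out[best]
    return chosen
```

For instance, with E1 ruling out programs {1, 2, 3}, E2 ruling out {3, 4}, and E3 ruling out {4, 5}, the greedy strategy picks E1 and then E3, covering all five incorrect programs with two examples.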

Page 21:

Program Learning as Set Cover

Space of all possible programs; each dot represents a semantically unique program.

(i1, o1) – E1
(i2, o2) – E2
…
(in, on) – En

The smallest set of I/O examples to learn the correct design IS the minimum-size subset of {E1, E2, …, En} that covers all the incorrect programs.

Practical challenge: we can’t enumerate all inputs and find the set Ej for each.

Page 22:

Program Learning as Set Cover

Space of all possible programs; each dot represents a semantically unique program.

(i1, o1) – E1
(i2, o2) – E2
…
(in, on) – En

ONLINE set cover: in each step,
• choose some (ij, oj) pair
• the set Ej of incorrect programs it eliminates is then disclosed

Page 23:

Program Learning as Set Cover

Space of all possible programs; each dot represents a semantically unique program.

(i1, o1) – E1
(i2, o2) – E2
…
(in, on) – En

ONLINE set cover: in each step,
• choose some (ij, oj) pair
• the set Ej of incorrect programs it eliminates is then disclosed

Our heuristic: ensure |Ej| ≥ 1, i.e., at least one incorrect program is identified and eliminated in each step.

Page 24:

Approach: Learning Based on Distinguishing Inputs

Space of all possible programs; each dot represents a semantically unique program.

[Jha et al., ICSE’10]

Page 25:

Learning from Distinguishing Inputs

Space of all possible programs.

Make a (positive) Witness query; Example I/O set E := {(i1, o1)}

Page 26:

Learning from Distinguishing Inputs

Space of all possible programs. Example I/O set E := {(i1, o1)}

The Learner synthesizes a candidate P1 using an SMT solver.

Page 27:

Learning from Distinguishing Inputs

Space of all possible programs. Example I/O set E := {(i1, o1)}

Via SMT queries, the Learner synthesizes two candidates P1 and P2, both consistent with E.

Page 28:

Learning from Distinguishing Inputs

Space of all possible programs. Example I/O set E := {(i1, o1)}

An input i2 on which P1 and P2 disagree is found via an SMT query.

Page 29:

Learning from Distinguishing Inputs

Space of all possible programs. Example I/O set E := {(i1, o1)}

This is implemented as a single Distinguishing Input query:
• Given P1,
• find P2 s.t. P1 ≠ P2 and P2 is consistent with E,
• where (i2, o2) is the distinguishing input.
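On a tiny explicit program space, the whole loop can be sketched with brute-force search standing in for the two SMT queries of [Jha et al., ICSE’10]. The program library and domain below are assumed purely for illustration.

```python
# Distinguishing-input loop over a tiny explicit program space
# (brute-force stands in for the SMT queries; library is assumed).

DOMAIN = range(8)
PROGRAMS = {                        # toy candidate space
    "x+1": lambda x: x + 1,
    "2x":  lambda x: 2 * x,
    "x*x": lambda x: x * x,
}

def consistent(name, E):
    return all(PROGRAMS[name](i) == o for i, o in E)

def distinguishing_input(p1, E):
    """A P2 consistent with E but differing from P1 on some input, if any."""
    for p2 in PROGRAMS:
        if p2 != p1 and consistent(p2, E):
            for i in DOMAIN:
                if PROGRAMS[p2](i) != PROGRAMS[p1](i):
                    return p2, i
    return None

def learn(io_oracle):
    E = [(1, io_oracle(1))]                 # initial witness query
    while True:
        p1 = next(p for p in PROGRAMS if consistent(p, E))
        d = distinguishing_input(p1, E)
        if d is None:
            return p1                       # semantically unique w.r.t. E
        _, i = d
        E.append((i, io_oracle(i)))         # query the I/O oracle on i
```

For example, learning the doubling function: the initial example (1, 2) is consistent with both x+1 and 2x, the input 0 distinguishes them, and the oracle’s answer (0, 0) eliminates x+1.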

Page 30:

Learning from Distinguishing Inputs

Space of all possible programs. Example I/O set E := E ∪ {(i2, o2)}

Page 31:

Learning from Distinguishing Inputs

Space of all possible programs. Example I/O set E := E ∪ {(ij, oj)}

Page 32:

Learning from Distinguishing Inputs

Space of all possible programs. Example I/O set E := E ∪ {(ik, ok)}

Page 33:

Learning from Distinguishing Inputs

Space of all possible programs. Example I/O set E := E ∪ {(in, on)}

Eventually a single semantically unique program remains. Is it the correct program?

Page 34:

Soundness

Is the library of components sufficient?
– YES: a correct design is produced.
– NO: do the I/O pairs show infeasibility?
  – YES: infeasibility is reported.
  – NO: an incorrect design may be produced.

Soundness is thus conditional on the validity of the structure hypothesis. One can put this learning approach within an outer CEGIS loop.

Page 35:

Example 2: Assumption Synthesis

Page 36:

Reactive Synthesis from LTL

System Requirements φs and Environment Assumptions φe feed into a Synthesis Tool, which returns either Realizable (with an FSM implementation) or Unrealizable.

Unrealizability is often due to incomplete environment assumptions! The synthesized system must satisfy φe → φs.

Page 37:

Example

Inputs: request r and cancel c. Output: grant g.

System specification φs:
– G (r → X F g)
– G ((c ∨ g) → X ¬g)

Environment assumption φe:
– True

Is it realizable? No: the environment can force c to be high all the time, which forces g to stay low forever, so a request can never be granted.

Page 38:

Counter-strategy and counter-trace

A counter-strategy is a strategy for the environment to force violation of the specification.

A counter-trace is a fixed input sequence such that the specification is violated regardless of the outputs generated by the system.

System φs:
– G (r → X F g)
– G ((c ∨ g) → X ¬g)

A counter-trace (the parenthesized value repeats forever):
r: 1 1 (1)^ω
c: 1 1 (1)^ω

Page 39:

CounterStrategy-Guided Environment Assumption Synthesis [Li et al., MEMOCODE 2011]

The loop: Start → run the Synthesis Tool on the Formal Specification.
– Realizable → Done.
– Unrealizable → Compute Counterstrategy → Mine Assumptions (using Specification Templates and User Scenarios) → add the mined assumption to the specification and repeat.

Page 40:

Instance of CEGIS

Setting: the environment Env produces inputs I to the system Sys, which produces outputs O (observed by Env).

Concept Class: all LTL formulas over I and O “constraining I”

Domain of Examples: all finite-state transducers with input O and output I (environments)

Formal Spec Φ: the set of environment transducers for which there exists an implementation of Sys satisfying φs

Oracle supports a subsumption query: does the current assumption rule out environments outside Φ? If not, give one such counterexample.

Page 41:

Assumption Learning Algorithm

Inputs: Specification Templates and User Scenarios.
1. Generate a candidate template φ (e.g., G F ?), checking consistency with existing assumptions.
2. Instantiate the template (e.g., G F p) and test it against the counter-strategy.
3. Eliminate or Accept the candidate; on Accept φ, done.

Page 42:

Assumption Synthesis Algorithm

1. Generate a candidate φ (e.g., G F p) from the Specification Templates and User Scenarios, checking consistency with existing assumptions.
2. Invoke a model checker: does the counter-strategy E satisfy φ?
– If E satisfies φ, the candidate fails to exclude E: eliminate φ.
– If E violates φ, accept φ; done.

VERSION SPACES:
• Retain candidates consistent with examples
• Traverse from weakest to strongest

Page 43:

Version Space Learning

Candidate assumptions form a lattice ordered from WEAKEST (most general), true, down to STRONGEST (most specific), false. In between lie the template instantiations, e.g.:
– G F p1, G F p2, …, G F pn
– G (p1 ∨ p2), …, G (pn-1 ∨ pn)
– G (p1 → X p2), …
– G X p1, G X p2, …, G X pn
– G p1, G p2, …, G pn

Originally due to [Mitchell, 1978]
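A toy rendering of the traversal (the representations below are assumed for illustration): counter-traces are ultimately periodic, candidates are checked weakest-first, and a candidate is accepted as soon as the counter-trace violates it, i.e., as soon as the assumption rules the bad environment out.

```python
# Version-space sketch: walk candidate assumptions from weakest to
# strongest, eliminating any candidate the counter-trace satisfies.
# Traces are ultimately periodic: (prefix, loop), each state a dict.

def holds_GF(p, prefix, loop):      # "infinitely often p"
    return any(p(s) for s in loop)  # only the loop affects "infinitely often"

def holds_G(p, prefix, loop):       # "always p"
    return all(p(s) for s in prefix + loop)

# Candidates ordered weakest (most general) first, per the lattice.
CANDIDATES = [
    ("G F !c", lambda pre, lp: holds_GF(lambda s: not s["c"], pre, lp)),
    ("G !c",   lambda pre, lp: holds_G(lambda s: not s["c"], pre, lp)),
]

def mine(counter_trace):
    pre, lp = counter_trace
    for name, sat in CANDIDATES:
        if not sat(pre, lp):        # counter-trace violates the candidate,
            return name             # so the candidate rules the trace out
    return None

# Counter-trace from the arbiter example: c is high forever.
trace = ([{"c": 1}, {"c": 1}], [{"c": 1}])
```

On the arbiter counter-trace, the weakest candidate G F ¬c is already violated (c is always 1), so it is the mined assumption.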

Page 44:

Example

System φs:
– G (r → X F g)
– G ((c ∨ g) → X ¬g)

A counter-trace:
r: 1 1 (1)^ω
c: 1 1 (1)^ω

Test assumption candidates by checking their negations against the counter-trace:
– G F ¬c: the counter-trace violates it (c is always 1), so the candidate is accepted.

Resulting specification:
– System φs: G (r → X F g), G ((c ∨ g) → X ¬g)
– Environment φe: G F ¬c

Page 45:

Theoretical Results

Theorem 1 [Completeness]: If there exist environment assumptions under our structure hypothesis that make the spec realizable, then the procedure finds them (terminates successfully); this is a “conditional completeness” guarantee.

Theorem 2: [Soundness] The procedure never adds inconsistent environment assumptions.

Page 46:

Summary of Lecture 2

Differences between Formal Inductive Synthesis and Machine Learning

Common features across various inductive synthesis methods

Oracle-Guided Inductive Synthesis framework

Two Instances:
– Learning Programs from Distinguishing Inputs
  http://www.eecs.berkeley.edu/~sseshia/pubs/b2hd-jha-icse10.html
– Learning Assumptions from Counterstrategies
  http://www.eecs.berkeley.edu/~sseshia/pubs/b2hd-li-memocode11.html

Brief Introduction to the Theory (more tomorrow)