Hypernetwork Models of Memory
Byoung-Tak Zhang
Biointelligence Laboratory, School of Computer Science and Engineering
Brain Science, Cognitive Science, Bioinformatics Programs
Seoul National University
Seoul 151-742, Korea
[email protected], http://bi.snu.ac.kr/
© 2007, SNU Biointelligence Lab, http://bi.snu.ac.kr/
Protein-Protein Interaction Networks
[Jeong et al., Nature 2001]
Regulatory Networks
[Katia Basso et al., Nature Genetics 37, 382–390, 2005]
Transcription Factor
Molecular Networks

[Figure: a molecular interaction network over variables x1–x15.]
The "Hyperinteraction" Model of Biomolecules

[Figure: biomolecules x1–x15 linked by hyperedges.]

• Vertices: genes, proteins, neurotransmitters, neuromodulators, hormones
• Edges: interactions (genetic, signaling, metabolic, synaptic)

[Zhang, DNA-2006] [Zhang, FOCI-2007]
Hypergraphs

A hypergraph is an (undirected) graph G whose edges each connect a non-empty set of vertices, i.e. G = (V, E), where V = {v1, v2, ..., vn}, E = {E1, E2, ..., Em}, and Ei = {vi1, vi2, ..., vim}.

An m-hypergraph consists of a set V of vertices and a subset E of V[m], i.e. G = (V, V[m]), where V[m] is a set of subsets of V whose elements have precisely m members.

A hypergraph G is said to be k-uniform if every edge Ei in E has cardinality k. A hypergraph G is k-regular if every vertex has degree k.

Remark: An ordinary graph is a 2-uniform hypergraph.
An Example Hypergraph

[Figure: vertices v1–v7 connected by hyperedges E1–E5.]

G = (V, E)
V = {v1, v2, v3, ..., v7}
E = {E1, E2, E3, E4, E5}
E1 = {v1, v3, v4}
E2 = {v1, v4}
E3 = {v2, v3, v6}
E4 = {v3, v4, v6, v7}
E5 = {v4, v5, v7}
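The example above is small enough to check by hand; the following sketch (illustrative code, not from the slides) encodes G directly and verifies the uniformity and degree definitions:

```python
# The example hypergraph G = (V, E), with hyperedges as vertex sets.
V = {"v1", "v2", "v3", "v4", "v5", "v6", "v7"}
E = {
    "E1": {"v1", "v3", "v4"},
    "E2": {"v1", "v4"},
    "E3": {"v2", "v3", "v6"},
    "E4": {"v3", "v4", "v6", "v7"},
    "E5": {"v4", "v5", "v7"},
}

def is_k_uniform(edges, k):
    """True if every hyperedge has cardinality k."""
    return all(len(e) == k for e in edges.values())

def degree(edges, v):
    """Number of hyperedges incident to vertex v."""
    return sum(v in e for e in edges.values())

print(is_k_uniform(E, 3))   # False: edge sizes here range from 2 to 4
print(degree(E, "v4"))      # v4 appears in E1, E2, E4, E5 -> degree 4
```

An ordinary graph is recovered as the special case where `is_k_uniform(E, 2)` holds.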
Hypernetworks

A hypernetwork is a hypergraph with weighted edges. It is defined as a triple H = (V, E, W), where V = {v1, v2, ..., vn}, E = {E1, E2, ..., Em}, and W = {w1, w2, ..., wm}.

An m-hypernetwork consists of a set V of vertices and a subset E of V[m], i.e. H = (V, V[m], W), where V[m] is a set of subsets of V whose elements have precisely m members and W is the set of weights associated with the hyperedges.

A hypernetwork H is said to be k-uniform if every edge Ei in E has cardinality k. A hypernetwork H is k-regular if every vertex has degree k.

Remark: An ordinary graph is a 2-uniform hypernetwork with wi = 1.

[Zhang, DNA-2006]
[Figure: a hypernetwork memory over variables x1–x15.]

The Hypernetwork Memory

The hypernetwork is defined as

$$H = (X, S, W)$$
$$X = (x_1, x_2, \ldots, x_n)$$
$$S = S^{(2)} \cup S^{(3)} \cup \cdots \cup S^{(K)}, \quad |S_i| = k \ \text{for} \ S_i \in S^{(k)}$$
$$W = W^{(2)} \cup W^{(3)} \cup \cdots \cup W^{(K)}$$

Training set: $D = \{\mathbf{x}^{(n)}\}_{n=1}^{N}$

The energy of the hypernetwork:

$$E(\mathbf{x}; W) = -\left( \sum_{i_1} w^{(1)}_{i_1} x_{i_1} + \frac{1}{2} \sum_{i_1, i_2} w^{(2)}_{i_1 i_2} x_{i_1} x_{i_2} + \frac{1}{6} \sum_{i_1, i_2, i_3} w^{(3)}_{i_1 i_2 i_3} x_{i_1} x_{i_2} x_{i_3} + \cdots \right)$$

The probability distribution:

$$P(\mathbf{x} \mid W) = \frac{1}{Z(W)} \exp[-E(\mathbf{x}; W)] = \frac{1}{Z(W)} \exp\left[ \sum_{k=1}^{K} \frac{1}{c(k)} \sum_{i_1, i_2, \ldots, i_k} w^{(k)}_{i_1 i_2 \ldots i_k} x_{i_1} x_{i_2} \cdots x_{i_k} \right]$$

with $c(k) = k!$, where the partition function is

$$Z(W) = \sum_{\mathbf{x}} \exp\left[ \sum_{k=1}^{K} \frac{1}{c(k)} \sum_{i_1, i_2, \ldots, i_k} w^{(k)}_{i_1 i_2 \ldots i_k} x_{i_1} x_{i_2} \cdots x_{i_k} \right]$$
[Zhang, DNA12-2006]
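The energy and probability above can be made concrete with a toy numeric sketch. The weights below are illustrative (not from the slides), and the partition function is computed by enumerating all 2^n states, which is feasible only for tiny n:

```python
from itertools import product
from math import exp, factorial

# Toy hypernetwork over n = 3 binary variables (illustrative weights).
# Keys are 0-based index tuples for w^(k); c(k) = k! as in the formulas.
weights = {
    (0, 1): 1.5,      # w^(2)_{12}
    (1, 2): -0.5,     # w^(2)_{23}
    (0, 1, 2): 2.0,   # w^(3)_{123}
}
n = 3

def energy(x, w):
    """E(x; W) = -sum_k (1/c(k)) sum_{i1..ik} w^(k) x_i1 ... x_ik."""
    total = 0.0
    for idx, wk in w.items():
        term = wk / factorial(len(idx))
        for i in idx:
            term *= x[i]
        total += term
    return -total

def prob(x, w):
    """P(x|W) = exp(-E(x;W)) / Z(W), with Z by brute-force enumeration."""
    Z = sum(exp(-energy(s, w)) for s in product([0, 1], repeat=n))
    return exp(-energy(x, w)) / Z

# With these weights, the all-ones state is the most probable one.
print(round(prob((1, 1, 1), weights), 4))
```

Higher-weight hyperedges lower the energy of states that activate all of their vertices, which is what makes stored patterns attractors of the memory.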
An Example: Hypernetwork Model of Linguistic Memory

[Figure: a hypernetwork over variables x1–x15 built from four binary training patterns.]

4 sentences (with labels), encoded as 15-dimensional binary patterns:

Example 1: x1 = x4 = x10 = x12 = 1 (all others 0), y = 1
Example 2: x2 = x3 = x9 = x14 = 1 (all others 0), y = 0
Example 3: x3 = x6 = x8 = x13 = 1 (all others 0), y = 1
Example 4: x8 = x11 = x15 = 1 (all others 0), y = 0

Order-3 hyperedges sampled over three rounds (Round 1, Round 2, Round 3):

From example 1: {x1, x4, x10} y=1, {x1, x4, x12} y=1, {x4, x10, x12} y=1
From example 2: {x2, x3, x9} y=0, {x2, x3, x14} y=0, {x3, x9, x14} y=0
From example 3: {x3, x6, x8} y=1, {x3, x6, x13} y=1, {x6, x8, x13} y=1
From example 4: {x8, x11, x15} y=0
The Language Game Platform
A Language Game

? still ? believe ? did this. → I still can't believe you did this.
We ? ? a lot ? gifts. → We don't have a lot of gifts.
Text Corpus: TV Drama Series

Friends, 24, House, Grey's Anatomy, Gilmore Girls, Sex and the City

Training data: 289,468 sentences, e.g. "I don't know what happened." "Take a look at this." ...
Test data: 700 sentences with blanks, e.g. "What ? ? ? here." "? have ? visit the ? room." ...
Step 1: Learning a Linguistic Memory

Example corpus: "Computer network is rapidly increased. The price of computer network installing is very cheap. The price of monitor display is on decreasing. Nowadays, color monitor display is so common, the price is not so high. This is a system adopting text mode color display. This is an animation news networks. ..."

Hyperedges sampled at increasing order are stored in the hypernetwork memory:
k = 2: (Computer, Network), (Computer, Price), (Computer, Display), (News, Network), (Text, News), (Monitor, Display), (Monitor, Price), (Color, Monitor), (Color, Display), (Text, Display), ...
k = 3: (Computer, Monitor, Display), (Computer, Network, Price), ...
k = 4: ...
Step 2: Recalling from the Memory

Storage: training sentences such as "He is my best friend", "He is a strong boy", and "Strong friend likes pretty girl" are stored as word hyperedges ("He is", "is my", "my best", "best friend", ...).

Recall: given the cue "He is a ? friend", the matching stored hyperedges ("He is", "a strong", "best friend", "Strong friend", ...) self-assemble into the completion "He is a strong friend".
The Language Game: Results

Why ? you ? come ? down ? → Why are you go come on down here
? think ? I ? met ? somewhere before → I think but I am met him somewhere before
? appreciate it if ? call her by ? ? → I appreciate it if you call her by the way
I'm standing ? the ? ? ? cafeteria → I'm standing in the one of the cafeteria
Would you ? to meet ? ? Tuesday ? → Would you nice to meet you in Tuesday and
? gonna ? upstairs ? ? a shower → I'm gonna go upstairs and take a shower
? have ? visit the ? room → I have to visit the ladies' room
We ? ? a lot ? gifts → We don't have a lot of gifts
? ? don't need your ? → If I don't need your help
? ? ? decision → to make a decision
? still ? believe ? did this → I still can't believe you did this
What ? ? ? here → What are you doing here
? you ? first ? of medical school → Are you go first day of medical school
? ? a dream about ? In ? → I had a dream about you in Copenhagen
Experimental Setup

• The order (k) of a hyperedge: range 2–4; fixed order for each experiment.
• The method of creating hyperedges from training data: sliding-window method; sequential sampling from the first word.
• The number of blanks (question marks) in test data: range 1–4; maximum k - 1.
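The sliding-window sampling named above can be sketched in a few lines (illustrative code; the slides do not give an implementation):

```python
def sliding_window_hyperedges(sentence, k):
    """Create order-k hyperedges from a sentence by sliding a width-k
    window one word at a time from the first word."""
    words = sentence.split()
    return [tuple(words[i:i + k]) for i in range(len(words) - k + 1)]

print(sliding_window_hyperedges("He is my best friend", 2))
# [('He', 'is'), ('is', 'my'), ('my', 'best'), ('best', 'friend')]
```

Repeating this for every training sentence, at each order k in the chosen range, yields the hyperedge library that forms the linguistic memory.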
Learning Behavior Analysis (1/3)

The performance increases monotonically as the learning corpus grows. The low-order (k = 2) memory performs best for the one-missing-word problem.
Learning Behavior Analysis (2/3)
The medium-order (k=3) memory performs best for the two-missing-words problem.
Learning Behavior Analysis (3/3)
The high-order (k=4) memory performs best for the three-missing-words problem.
[Figure: recap slide. The same four 15-dimensional binary training patterns as in the linguistic-memory example (4 examples) and the order-3 hyperedges sampled from them over Rounds 1–3.]
Collectively Autocatalytic Sets

"A contemporary cell is a collectively autocatalytic whole in which DNA, RNA, the code, proteins, and metabolism linking the synthesis of some molecular species to the breakdown of other 'high-energy' molecular species all weave together and conspire to catalyze the entire set of reactions required for the whole cell to reproduce." (Kauffman)
The Hypernetwork Model of Memory

The hypernetwork is defined as

$$H = (X, S, W)$$
$$X = (x_1, x_2, \ldots, x_n)$$
$$S = S^{(2)} \cup S^{(3)} \cup \cdots \cup S^{(K)}, \quad |S_i| = k \ \text{for} \ S_i \in S^{(k)}$$
$$W = W^{(2)} \cup W^{(3)} \cup \cdots \cup W^{(K)}$$

Training set: $D = \{\mathbf{x}^{(n)}\}_{n=1}^{N}$

The energy of the hypernetwork:

$$E(\mathbf{x}; W) = -\left( \sum_{i_1} w^{(1)}_{i_1} x_{i_1} + \frac{1}{2} \sum_{i_1, i_2} w^{(2)}_{i_1 i_2} x_{i_1} x_{i_2} + \frac{1}{6} \sum_{i_1, i_2, i_3} w^{(3)}_{i_1 i_2 i_3} x_{i_1} x_{i_2} x_{i_3} + \cdots \right)$$

The probability distribution:

$$P(\mathbf{x} \mid W) = \frac{1}{Z(W)} \exp[-E(\mathbf{x}; W)] = \frac{1}{Z(W)} \exp\left[ \sum_{k=1}^{K} \frac{1}{c(k)} \sum_{i_1, i_2, \ldots, i_k} w^{(k)}_{i_1 i_2 \ldots i_k} x_{i_1} x_{i_2} \cdots x_{i_k} \right]$$

with $c(k) = k!$, where the partition function is

$$Z(W) = \sum_{\mathbf{x}} \exp\left[ \sum_{k=1}^{K} \frac{1}{c(k)} \sum_{i_1, i_2, \ldots, i_k} w^{(k)}_{i_1 i_2 \ldots i_k} x_{i_1} x_{i_2} \cdots x_{i_k} \right]$$
[Zhang, 2006]
Deriving the Learning Rule

The log-likelihood of the training set:

$$\ln P(\{\mathbf{x}^{(n)}\}_{n=1}^{N} \mid W) = \ln \prod_{n=1}^{N} P(\mathbf{x}^{(n)} \mid W^{(2)}, W^{(3)}, \ldots, W^{(K)})$$

$$= \sum_{n=1}^{N} \left[ \sum_{k=1}^{K} \frac{1}{c(k)} \sum_{i_1, i_2, \ldots, i_k} w^{(k)}_{i_1 i_2 \ldots i_k} x^{(n)}_{i_1} x^{(n)}_{i_2} \cdots x^{(n)}_{i_k} - \ln Z(W) \right]$$

The learning rule follows from gradient ascent on this log-likelihood with respect to each weight $w^{(k)}_{i_1 i_2 \ldots i_k}$.
Derivation of the Learning Rule

$$\frac{\partial}{\partial w^{(k)}_{i_1 i_2 \ldots i_k}} \ln P(\{\mathbf{x}^{(n)}\} \mid W)
= \sum_{n=1}^{N} \frac{\partial}{\partial w^{(k)}_{i_1 i_2 \ldots i_k}} \left[ \sum_{k'=1}^{K} \frac{1}{c(k')} \sum_{i_1, \ldots, i_{k'}} w^{(k')}_{i_1 \ldots i_{k'}} x^{(n)}_{i_1} \cdots x^{(n)}_{i_{k'}} - \ln Z(W) \right]$$

$$= \frac{1}{c(k)} \sum_{n=1}^{N} \left[ x^{(n)}_{i_1} x^{(n)}_{i_2} \cdots x^{(n)}_{i_k} - \sum_{\mathbf{x}} x_{i_1} x_{i_2} \cdots x_{i_k}\, P(\mathbf{x} \mid W) \right]$$

$$= \frac{N}{c(k)} \left[ \langle x_{i_1} x_{i_2} \cdots x_{i_k} \rangle_{\text{Data}} - \langle x_{i_1} x_{i_2} \cdots x_{i_k} \rangle_{P(\mathbf{x}|W)} \right]$$

where

$$\langle x_{i_1} \cdots x_{i_k} \rangle_{\text{Data}} = \frac{1}{N} \sum_{n=1}^{N} x^{(n)}_{i_1} \cdots x^{(n)}_{i_k}, \qquad \langle x_{i_1} \cdots x_{i_k} \rangle_{P(\mathbf{x}|W)} = \sum_{\mathbf{x}} x_{i_1} \cdots x_{i_k}\, P(\mathbf{x} \mid W)$$
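The resulting rule — increase a hyperedge weight by the difference between its data average and its model average — can be sketched for a tiny model where the model average is computed by exhaustive enumeration (illustrative code; real hypernetworks estimate this average by sampling):

```python
from itertools import product
from math import exp, factorial

def unnorm(x, w):
    """Unnormalized probability: exp[sum_k (1/c(k)) w^(k) x_i1 ... x_ik]."""
    s = 0.0
    for idx, wk in w.items():
        term = wk / factorial(len(idx))
        for i in idx:
            term *= x[i]
        s += term
    return exp(s)

def model_avg(idx, w, n):
    """<x_i1 ... x_ik> under P(x|W), by enumerating all 2^n states."""
    states = list(product([0, 1], repeat=n))
    Z = sum(unnorm(x, w) for x in states)
    return sum(unnorm(x, w) * all(x[i] for i in idx) for x in states) / Z

def learning_step(data, w, n, lr=0.1):
    """One gradient-ascent step: dw^(k) is proportional to
    <x_i1...x_ik>_Data - <x_i1...x_ik>_P(x|W)."""
    for idx in w:
        data_avg = sum(all(x[i] for i in idx) for x in data) / len(data)
        w[idx] += lr * (data_avg - model_avg(idx, w, n))
    return w
```

For example, starting from zero weights with data = [(1, 1, 0), (1, 1, 1)], the weight of the hyperedge {x1, x2} grows, since its data average (1.0) exceeds its model average under the uniform initial model (0.25).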
Probabilistic Graphical Models (PGMs)

PGMs represent the joint probability distribution of a set of random variables in graphical form. They come in undirected and directed varieties.

Generative: the probability distribution of some variables given the values of other variables can be obtained by probabilistic inference.

[Figure: a directed graphical model over A, B, C, D, E with edges A→C, B→C, B→D, C→E.]

$$P(A, B, C, D, E) = P(A)\,P(B \mid A)\,P(C \mid A, B)\,P(D \mid A, B, C)\,P(E \mid A, B, C, D)$$
$$= P(A)\,P(B)\,P(C \mid A, B)\,P(D \mid B)\,P(E \mid C)$$

• C and D are independent given B.
• C asserts dependency between A and B.
• B and E are independent given C.
Kinds of Graphical Models

Graphical Models
- Undirected: Boltzmann Machines, Markov Random Fields
- Directed: Bayesian Networks, Latent Variable Models, Hidden Markov Models, Generative Topographic Mapping, Non-negative Matrix Factorization
Bayesian Networks

BN = (S, P) consists of a network structure S and a set of local probability distributions P:

$$p(\mathbf{x}) = \prod_{i=1}^{n} p(x_i \mid \mathbf{pa}_i)$$

• Structure can be found by relying on prior knowledge of causal relationships.

[Figure: a BN for detecting credit card fraud.]
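The factored joint above can be illustrated with a minimal two-variable network (a hypothetical Rain → WetGrass example, not the fraud network from the slide; the probability values are made up):

```python
# Minimal Bayesian-network joint: p(x) = prod_i p(x_i | pa_i).
# Hypothetical structure: Rain -> WetGrass.
p_rain = {True: 0.2, False: 0.8}
p_wet_given_rain = {(True, True): 0.9, (False, True): 0.1,
                    (True, False): 0.2, (False, False): 0.8}

def joint(rain, wet):
    """p(rain, wet) = p(rain) * p(wet | rain)."""
    return p_rain[rain] * p_wet_given_rain[(wet, rain)]

# Summing the factored joint over all states recovers total probability 1.
total = sum(joint(r, w) for r in (True, False) for w in (True, False))
print(total)  # 1.0
```

The same product-of-locals structure scales to any DAG: each variable contributes one conditional table indexed by its parents.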
From Bayes Nets to High-Order PGMs

(1) Naïve Bayes:

$$P(F \mid J, G, S, A) \propto P(J, G, S, A \mid F)\,P(F)$$
$$P(J, G, S, A \mid F) = P(J \mid F)\,P(G \mid F)\,P(S \mid F)\,P(A \mid F) = \prod_{x \in \{J, G, S, A\}} P(x \mid F)$$

(2) Bayesian Net:

$$P(F, J, G, S, A) = \prod_{x \in \{F, J, G, S, A\}} P(x \mid \mathbf{pa}(x))$$

(3) High-Order PGM, with pairwise hyperedges $he(x, y)$ for $(x, y) \in \{(x, y) \mid x, y \in \{J, G, S, A\} \ \text{and} \ x \neq y\}$:

$$P(F \mid J, G, S, A) \propto P(J, G \mid F)\,P(J, S \mid F)\,P(J, A \mid F)\,P(G, S \mid F)\,P(G, A \mid F)\,P(S, A \mid F) = \prod_{(x, y)} P(he(x, y) \mid F)$$

[Figure: the three model structures over F, J, G, S, A.]
Visual Memories

Digit Recognition | Face Classification | Text Classification | Movie Title Prediction
Digit Recognition: Dataset

Original data: handwritten digits (0–9). Training data: 2,630 (263 examples per class). Test data: 1,130 (113 examples per class).

Preprocessing: each example is an 8×8 binary matrix; each pixel is 0 or 1.
Pattern Classification: The "Layered" Hypernetwork

[Figure: a layered hypernetwork for pattern classification. The input layer holds the variables x1, x2, x3, ..., xn; the hidden layer is a probabilistic library (DNA representation) of weighted hyperedges such as (x1, Class), (x1 x2, Class), (x1 x3, Class), (x1 x2 x4, Class), (x2 x3 x4, Class), ...; the output layer gives the Class. Each hyperedge carries a weight wi, represented by its number of copies in the library: W = {wi | 1 ≤ i ≤ m, wi = # copies of hyperedge i}.]
Simulation Results – without Error Correction

|Train set| = 3760, |Test set| = 1797.
Performance Comparison
Methods Accuracy
MLP with 37 hidden nodes 0.941
MLP with no hidden nodes 0.901
SVM with polynomial kernel 0.926
SVM with RBF kernel 0.934
Decision Tree 0.859
Naïve Bayes 0.885
kNN (k=1) 0.936
kNN (k=3) 0.951
Hypernet with learning (k = 10) 0.923
Hypernet with sampling (k = 33) 0.949
Error Correction Algorithm

1. Initialize the library as before.
2. maxChangeCnt := librarySize.
3. For i := 0 to iteration_limit:
   1. trainCorrectCnt := 0.
   2. Run classification for all training patterns; for each correctly classified pattern, increment trainCorrectCnt.
   3. For each library element:
      1. Initialize its fitness value to 0.
      2. For each misclassified training pattern, if the library element matches that example:
         1. If the element classified the example correctly, its fitness gains 2 points;
         2. else it loses 1 point.
   4. changeCnt := max{ librarySize * (1.5 * (trainSetSize - trainCorrectCnt) / trainSetSize + 0.01), maxChangeCnt * 0.9 }.
   5. maxChangeCnt := changeCnt.
   6. Delete the changeCnt library elements of lowest fitness and resample library elements whose classes are those of the deleted ones.
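The bookkeeping of this loop can be sketched as follows. The element representation, matching rule, voting classifier, and resampling policy below are simple stand-ins (assumptions, not the paper's exact routines); only the fitness-based replacement mirrors the algorithm above:

```python
import random

# A library element is (hyperedge_indices, label); it matches a pattern x
# when every one of its indices is active (1) in x.
def matches(elem, x):
    idx, _ = elem
    return all(x[i] for i in idx)

def classify(library, x):
    """Vote of all matching elements (a simple stand-in classifier)."""
    votes = {}
    for elem in library:
        if matches(elem, x):
            votes[elem[1]] = votes.get(elem[1], 0) + 1
    return max(votes, key=votes.get) if votes else None

def error_correct(library, train, iteration_limit=3):
    """Fitness-based error correction: on misclassified patterns, matching
    elements with the right label gain 2 points and wrong ones lose 1;
    the lowest-fitness elements are then replaced, class-preservingly."""
    max_change = len(library)
    for _ in range(iteration_limit):
        correct = [classify(library, x) == y for x, y in train]
        frac_wrong = (len(train) - sum(correct)) / len(train)
        fitness = [0] * len(library)
        for (x, y), ok in zip(train, correct):
            if ok:
                continue  # fitness is adjusted only on misclassified patterns
            for j, elem in enumerate(library):
                if matches(elem, x):
                    fitness[j] += 2 if elem[1] == y else -1
        change = int(max(len(library) * (1.5 * frac_wrong + 0.01),
                         max_change * 0.9))
        change = min(change, len(library))
        max_change = change
        order = sorted(range(len(library)), key=lambda j: fitness[j])
        deleted, kept = order[:change], order[change:]
        # Resample replacements from surviving elements of the same class.
        new = []
        for j in deleted:
            same = [library[m] for m in kept if library[m][1] == library[j][1]]
            new.append(random.choice(same) if same else library[j])
        library = [library[m] for m in kept] + new
    return library
```

Note how the max{...} update makes the replacement count shrink by at most 10% per iteration, so the library keeps being refreshed even once training accuracy is high.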
Simulation Results – with Error Correction

iterationLimit = 37, librarySize = 382,300.

[Figure: classification ratio vs. iteration (0–35) on the training set (left, y-axis 0.90–1.00) and test set (right, y-axis 0.87–0.93), with one curve per hyperedge order (6, 10, 14, 18, 22, 26).]
Performance Comparison

Algorithms / Correct classification rate
Random Forest (f=10, t=50)  94.10 %
kNN (k=4)  93.49 %
Hypernetwork (Order=26)  92.99 %
AdaBoost (Weak Learner: J48)  91.93 %
SVM (Gaussian Kernel, SMO)  91.37 %
MLP  90.53 %
Naïve Bayes  87.26 %
J48  84.86 %
Face Data Set

Yale dataset: 15 people, 11 images per person, 165 images in total.
Training Images of a Person

10 images per person are used for training; the remaining 1 for test.
Bitmaps for Training Data (Dimensionality = 480)
Classification Rate by Leave-One-Out
Classification Rate (Dimensionality = 64 by PCA)
Learning Hypernets from Movie Captions

Order: sequential sampling, range 2–3.
Corpus: Friends, Prison Break, 24.
Learning Hypernets from Movie Captions

Classification:
• Query generation: "I intend to marry her" → "I ? to marry her", "I intend ? marry her", "I intend to ? her", "I intend to marry ?"
• Matching: for "I ? to marry her" — order 2: "I intend", "I am", "intend to", ...; order 3: "I intend to", "intend to marry", ...
• Count the number of max-perfect-matching hyperedges.
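The matching step can be sketched by counting stored n-grams that fit a query, treating each blank as a wildcard. The mini-library below is hypothetical (the real library holds hyperedges sampled from whole caption corpora):

```python
def ngrams(sentence, k):
    """Order-k word hyperedges of a sentence (sliding window)."""
    w = sentence.split()
    return [tuple(w[i:i + k]) for i in range(len(w) - k + 1)]

def match_score(query, library, k):
    """Count stored order-k hyperedges matching the query's k-grams,
    with '?' acting as a wildcard position."""
    score = 0
    for q in ngrams(query, k):
        for h in library:
            if all(a == '?' or a == b for a, b in zip(q, h)):
                score += 1
    return score

# Hypothetical mini-library of order-2 hyperedges from two sentences:
lib = ngrams("i intend to marry her", 2) + ngrams("i am to blame", 2)
print(match_score("i ? to marry her", lib, 2))  # 6
```

Comparing such scores across per-corpus libraries gives the classification decision, and the hyperedges behind the best matches supply the completed words.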
Completion & Classification Examples

Query: "who are you" (Corpus: Friends, 24, Prison Break)
? are you → what are you (Friends)
who ? you → who are you (Friends)
who are ? → who are you (Friends)

Query: "you need to wear it" (Corpus: 24, Prison Break, House)
? need to wear it → i need to wear it (24)
you ? to wear it → you want to wear it (24)
you need ? wear it → you need to wear it (24)
you need to ? it → you need to do it (House)
you need to wear ? → you need to wear a (24)