Slide 1
Exploiting Syntactic Patterns as Clues in Zero-Anaphora Resolution
Ryu Iida, Kentaro Inui and Yuji Matsumoto
Nara Institute of Science and Technology
{ryu-i,inui,matsu}@is.naist.jp
June 20th, 2006
Slide 2
Zero-anaphora resolution
- Zero-anaphor = a gap with an anaphoric function
- Zero-anaphora resolution is becoming important in many applications
- In Japanese, even obligatory arguments of a predicate are often omitted when they are inferable from the context
  - 45.5% of the nominative arguments of verbs are omitted in newspaper articles
Slide 3
Zero-anaphora resolution (cont’d)
Three sub-tasks:
- Zero-pronoun detection: detect a zero-pronoun
- Antecedent identification: identify the antecedent for a given zero-pronoun
- Anaphoricity determination

Example (anaphoric zero-pronoun; antecedent = John):
Mary-wa John-ni (φ-ga) tabako-o yameru-youni it-ta
Mary-TOP John-DAT (φ-NOM) smoking-OBJ quit-COMP say-PAST
[Mary asked John to quit smoking.]
Slide 4
Zero-anaphora resolution (cont’d)
Three sub-tasks:
- Zero-pronoun detection: detect a zero-pronoun
- Antecedent identification: identify the antecedent from the set of candidate antecedents for a given zero-pronoun
- Anaphoricity determination: classify whether a given zero-pronoun is anaphoric or non-anaphoric

Anaphoric zero-pronoun (antecedent = John):
Mary-wa John-ni (φ-ga) tabako-o yameru-youni it-ta
Mary-TOP John-DAT (φ-NOM) smoking-OBJ quit-COMP say-PAST
[Mary asked John to quit smoking.]

Non-anaphoric zero-pronoun:
(φ-ga) ie-ni kaeri-tai
(φ-NOM) home-DAT want to go back
[(φ = I) want to go home.]
Slide 5
Previous work on anaphora resolution
- The research trend has been shifting from rule-based approaches (Baldwin, 95; Lappin and Leass, 94; Mitkov, 97, etc.) to empirical, or learning-based, approaches (Soon et al., 2001; Ng, 04; Yang et al., 05, etc.)
  - A cost-efficient solution for achieving performance comparable to the best-performing rule-based systems
- Learning-based approaches represent a problem (anaphoricity determination and antecedent identification) as a set of feature vectors and apply machine learning algorithms to them
Slide 6
Useful clues for both anaphoricity determination and antecedent identification
Syntactic pattern features
[Figure: dependency tree of the example sentence — the zero-pronoun φ-ga (φ-NOM) and its predicate yameru-youni (quit-COMP); the antecedent John-ni (John-DAT) and the matrix predicate it-ta (say-PAST); plus Mary-wa (Mary-TOP) and tabako-o (smoking-OBJ)]
Slide 7
Useful clues for both anaphoricity determination and antecedent identification
Questions:
- How to encode syntactic patterns as features
- How to avoid the data sparseness problem
Syntactic pattern features
[Figure: dependency tree of the example sentence, as on the previous slide]
Slide 8
Talk outline
1. Zero-anaphora resolution: Background
2. Selection-then-classification model (Iida et al., 05)
3. Proposed model
   - Represents syntactic patterns based on dependency trees
   - Uses a tree mining technique to seek useful sub-trees, to solve the data sparseness problem
   - Incorporates syntactic pattern features in the selection-then-classification model
4. Experiments on Japanese zero-anaphora
5. Conclusion and future work
Slide 9
Selection-then-Classification Model (SCM) (Iida et al., 05)

“A federal judge in Pittsburgh issued a temporary restraining order preventing Trans World Airlines from buying additional shares of USAir Group Inc. The order, requested in a suit filed by USAir, …”

[Figure: the candidate anaphor “USAir” and its candidate antecedents (federal judge, order, USAir Group Inc, suit, …) are fed to the tournament model]
Slide 10
Selection-then-Classification Model (SCM) (Iida et al., 05)
[Figure: the tournament model (Iida et al., 03) runs a series of pairwise matches between the candidate antecedents (federal judge, order, USAir Group Inc, suit, …) of the candidate anaphor “USAir”]
Slide 11
Selection-then-Classification Model (SCM) (Iida et al., 05)
[Figure: the tournament model selects “USAir Group Inc” as the most likely candidate antecedent of the candidate anaphor “USAir”]
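The tournament selection above can be sketched as follows. The pairwise comparison function `beats` stands in for the learned classifier (here replaced by a toy rule), so this is an illustrative sketch, not the paper’s implementation.

```python
def select_antecedent(candidates, anaphor, beats):
    """Tournament model (Iida et al., 03): the current winner plays each
    later candidate in turn; the last winner is returned as the most
    likely candidate antecedent. `beats(left, right, anaphor)` stands in
    for the learned pairwise classifier and returns True when the right
    candidate wins the match."""
    winner = candidates[0]
    for challenger in candidates[1:]:
        if beats(winner, challenger, anaphor):
            winner = challenger
    return winner

# Toy pairwise rule (illustration only): prefer the candidate sharing
# more leading characters with the anaphor.
def toy_beats(left, right, anaphor):
    overlap = lambda c: sum(a == b for a, b in zip(c, anaphor))
    return overlap(right) > overlap(left)

cands = ["federal judge", "order", "USAir Group Inc", "suit"]
print(select_antecedent(cands, "USAir", toy_beats))   # USAir Group Inc
```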
Slide 12
Selection-then-Classification Model (SCM) (Iida et al., 05)
[Figure: the most likely candidate antecedent “USAir Group Inc” and the candidate anaphor “USAir” are passed to the anaphoricity determination model: if score ≥ θ_ana, “USAir” is anaphoric and “USAir Group Inc” is its antecedent; if score < θ_ana, “USAir” is non-anaphoric]
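The two SCM stages can be sketched together; `beats` and `anaphoricity_score` are toy stand-ins for the learned models, so this is only an illustration of the control flow.

```python
def scm_resolve(candidates, anaphor, beats, anaphoricity_score, theta_ana):
    """Selection-then-classification, as a sketch: first select the most
    likely candidate antecedent with the tournament model, then let the
    anaphoricity determination model accept or reject the pair against
    the threshold theta_ana."""
    winner = candidates[0]
    for challenger in candidates[1:]:
        if beats(winner, challenger, anaphor):
            winner = challenger
    if anaphoricity_score(winner, anaphor) >= theta_ana:
        return winner   # anaphoric: winner is the antecedent
    return None         # non-anaphoric

# Toy stand-ins for the learned models (illustration only).
toy_beats = lambda left, right, anaphor: len(right) > len(left)
toy_score = lambda cand, anaphor: 1.0 if "USAir" in cand else 0.0

print(scm_resolve(["order", "USAir Group Inc"], "USAir",
                  toy_beats, toy_score, theta_ana=0.5))   # USAir Group Inc
```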
Slide 14
Training the anaphoricity determination model

- Anaphoric instances: for an anaphoric noun phrase (ANP), the tournament model is applied to its set of candidate antecedents NP1–NP5; the true antecedent paired with the ANP (e.g. NP4, ANP) yields an anaphoric training instance.
- Non-anaphoric instances: for a non-anaphoric noun phrase (NANP), the candidate antecedent selected by the tournament model (e.g. NP3) paired with the NANP yields a non-anaphoric training instance.

(NPi: candidate antecedent)
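The instance-generation scheme above can be sketched as a small function; the NP names and the tournament winner are passed in as stand-ins, since the real model would compute them.

```python
def training_instance(np, tournament_winner, antecedent=None):
    """Sketch of training-instance generation for the anaphoricity
    determination model. For an anaphoric NP, the true antecedent is
    paired with the NP as an anaphoric instance; for a non-anaphoric NP,
    the candidate selected by the tournament model is paired with it as
    a non-anaphoric instance."""
    if antecedent is not None:                       # anaphoric NP (ANP)
        return (antecedent, np), "anaphoric"
    return (tournament_winner, np), "non-anaphoric"  # non-anaphoric NP (NANP)

# ANP whose true antecedent is NP4 -> anaphoric instance (NP4, ANP).
print(training_instance("ANP", tournament_winner="NP4", antecedent="NP4"))
# NANP: the tournament model still picks some candidate, e.g. NP3 ->
# non-anaphoric instance (NP3, NANP).
print(training_instance("NANP", tournament_winner="NP3"))
```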
Slide 15
Talk outline
1. Zero-anaphora resolution: Background
2. Selection-then-classification model (Iida et al., 05)
3. Proposed model
   - Represents syntactic patterns based on dependency trees
   - Uses a tree mining technique to seek useful sub-trees, to solve the data sparseness problem
   - Incorporates syntactic pattern features in the selection-then-classification model
4. Experiments on Japanese zero-anaphora
5. Conclusion and future work
Slide 16
New model

[Figure: the SCM pipeline over the candidate antecedents of the candidate anaphor “USAir” (federal judge, order, USAir Group Inc, suit, …) — tournament model, then anaphoricity determination with threshold θ_ana — now augmented with syntactic pattern features: at each step three sub-trees are extracted, TL (LeftCand, predicate, zero-pronoun, predicate), TR (RightCand, predicate, zero-pronoun, predicate) and TI (LeftCand, predicate, RightCand)]
Slide 17
Use of syntactic pattern features
- Encoding parse tree features
- Learning useful sub-trees
Slide 18
Encoding parse tree features
[Figure: dependency tree of the example sentence — Mary-wa (Mary-TOP), the antecedent John-ni (John-DAT), tabako-o (smoking-OBJ), the zero-pronoun φ-ga (φ-NOM), its predicate yameru-youni (quit-COMP) and the matrix predicate it-ta (say-PAST)]
Slide 19
Encoding parse tree features
[Figure: the same dependency tree as on the previous slide]
Slide 20
Encoding parse tree features
Target sub-tree: Antecedent — predicate — zero-pronoun — predicate
[Figure: the tree is pruned to the antecedent John-ni (John-DAT), the zero-pronoun φ-ga (φ-NOM) and their predicates yameru-youni (quit-COMP) and it-ta (say-PAST)]
Slide 21
Encoding parse tree features
Target sub-tree: Antecedent — predicate — zero-pronoun — predicate
[Figure: each kept node is labelled with its function word: ni-DAT (John-ni), ga-NOM (φ-ga), youni-CONJ (yameru-youni) and ta-PAST (it-ta)]
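The pruning-and-labelling step can be sketched as below. The tree layout and node names follow the example sentence, but they are assumptions made for illustration, not the paper’s exact data structures.

```python
def prune(node, keep):
    """Keep a node iff its label is in `keep` or it dominates a kept node."""
    label, children = node
    kept_children = [c for c in (prune(ch, keep) for ch in children) if c]
    if label in keep or kept_children:
        return (label, kept_children)
    return None

def encode(node):
    """Bracketed-string encoding of a (label, children) tree."""
    label, children = node
    if not children:
        return label
    return "(%s %s)" % (label, " ".join(encode(c) for c in children))

# Dependency tree of the running example, with function-word labels
# (this layout is an assumption based on the example sentence).
tree = ("ta-PAST",                       # it-ta (say-PAST)
        [("ni-DAT", []),                 # John-ni (antecedent)
         ("wa-TOP", []),                 # Mary-wa
         ("youni-CONJ",                  # yameru-youni (quit-COMP)
          [("ga-NOM", []),               # φ-ga (zero-pronoun)
           ("o-OBJ", [])])])             # tabako-o

kept = prune(tree, {"ni-DAT", "ga-NOM"})  # antecedent + zero-pronoun
print(encode(kept))   # (ta-PAST ni-DAT (youni-CONJ ga-NOM))
```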
Slide 22
Encoding parse trees
[Figure: in the dependency tree of the example sentence, Mary-wa (Mary-TOP) is marked as LeftCand and John-ni (John-DAT) as RightCand; the three sub-trees TL, TR and TI are extracted around them, the zero-pronoun φ-ga (φ-NOM) and the predicates yameru-youni (quit-COMP) and it-ta (say-PAST)]
Slide 23
Encoding parse trees
Antecedent identification
[Figure: the three sub-trees — TL (LeftCand, predicate, zero-pronoun, predicate), TR (RightCand, predicate, zero-pronoun, predicate) and TI (LeftCand, predicate, RightCand) — are attached under a common root node]
Slide 24
Encoding parse trees
Antecedent identification
[Figure: under the root, in addition to the three sub-trees TL, TR and TI, the lexical, grammatical, semantic, positional and heuristic binary features f1, f2, …, fn are attached as nodes]
Slide 25
Encoding parse trees
Antecedent identification
[Figure: the same encoded tree, with a “left” or “right” label attached indicating which candidate wins the match]
Slide 26
Learning useful sub-trees
- Kernel methods: Tree kernel (Collins and Duffy, 01), Hierarchical DAG kernel (Suzuki et al., 03), Convolution tree kernel (Moschitti, 04)
- Boosting-based algorithm: the BACT system (Kudo and Matsumoto, 04) learns a list of weighted decision stumps with the Boosting algorithm
Slide 27
Learning useful sub-trees

Boosting-based algorithm: BACT
- Learns a list of weighted decision stumps with Boosting; each stump pairs a sub-tree with a label and a weight (e.g. sub-tree t, label “positive”, weight 0.4)
- Classifies a given input tree by weighted voting of the matching stumps (e.g. score +0.34 → positive)

[Figure: labelled training instances → learned decision stumps → applied to a new input tree]
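The weighted-voting step can be sketched as follows; sub-trees are plain strings and the stump list is made up, so this illustrates only the voting scheme, not BACT’s sub-tree learning.

```python
def classify(input_subtrees, stumps):
    """Weighted voting over matched sub-trees, in the spirit of BACT.
    Each learned decision stump (sub_tree, label, weight) whose sub-tree
    occurs in the input votes for its label with its weight."""
    score = 0.0
    for sub_tree, label, weight in stumps:
        if sub_tree in input_subtrees:
            score += weight if label == "positive" else -weight
    return score, ("positive" if score >= 0 else "negative")

# Hypothetical stumps, as if learned from training instances.
stumps = [("(youni ga)", "positive", 0.4),
          ("(ta wa)", "negative", 0.3),
          ("(ta ni)", "positive", 0.2)]
score, label = classify({"(youni ga)", "(ta wa)"}, stumps)
print(round(score, 2), label)   # 0.1 positive
```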
Slide 28
Overall process

Input: a zero-pronoun φ in the sentence S
1. Apply the intra-sentential model, which uses the syntactic pattern features. If score_intra ≥ θ_intra, output the most-likely candidate antecedent appearing in S.
2. Otherwise (score_intra < θ_intra), apply the inter-sentential model. If score_inter ≥ θ_inter, output the most-likely candidate appearing outside of S; if score_inter < θ_inter, return “non-anaphoric”.
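The cascade above can be sketched as a small function; `intra` and `inter` are hypothetical stand-ins for the two models, each returning a (candidate, score) pair.

```python
def resolve(zero_pronoun, sentence, discourse,
            intra, inter, theta_intra, theta_inter):
    """Sketch of the overall process: try the intra-sentential model
    first, fall back to the inter-sentential model, and declare the
    zero-pronoun non-anaphoric if both scores fall below threshold."""
    cand, score = intra(zero_pronoun, sentence)
    if score >= theta_intra:
        return cand                  # antecedent found within S
    cand, score = inter(zero_pronoun, discourse)
    if score >= theta_inter:
        return cand                  # antecedent found outside S
    return "non-anaphoric"

# Toy models for illustration.
intra = lambda z, s: ("John", 0.3)   # weak intra-sentential evidence
inter = lambda z, d: ("Mary", 0.8)   # strong inter-sentential evidence
print(resolve("φ", "S", "D", intra, inter, 0.5, 0.5))   # Mary
```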
Slide 29
Table of contents
1. Zero-anaphora resolution
2. Selection-then-classification model (Iida et al., 05)
3. Proposed model
   - Parse encoding
   - Tree mining
4. Experiments
5. Conclusion and future work
Slide 30
Experiments

Japanese newspaper article corpus annotated with zero-anaphoric relations:
- 197 texts (1,803 sentences)
- 995 intra-sentential anaphoric zero-pronouns
- 754 inter-sentential anaphoric zero-pronouns
- 603 non-anaphoric zero-pronouns

Recall = (# of correctly resolved zero-anaphoric relations) / (# of anaphoric zero-pronouns)
Precision = (# of correctly resolved zero-anaphoric relations) / (# of anaphoric zero-pronouns the model detected)
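The two metrics above can be computed directly from the counts; the numbers below are illustrative only, not results from the paper.

```python
def recall_precision(n_correct, n_anaphoric, n_detected):
    """Recall and precision as defined on the slide:
    recall    = correctly resolved / anaphoric zero-pronouns,
    precision = correctly resolved / anaphoric zero-pronouns detected."""
    return n_correct / n_anaphoric, n_correct / n_detected

# Illustrative numbers only (not from the paper).
r, p = recall_precision(n_correct=600, n_anaphoric=995, n_detected=800)
print(round(r, 3), round(p, 3))   # 0.603 0.75
```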
Slide 31
Experimental settings
- Five-fold cross validation
- Comparison among four models:
  - BM: Ng and Cardie (02)’s model: identifies an antecedent with candidate-wise classification; determines the anaphoricity of a given anaphor as a by-product of the search for its antecedent
  - BM_STR: BM + syntactic pattern features
  - SCM: selection-then-classification model (Iida et al., 05)
  - SCM_STR: SCM + syntactic pattern features
Slide 32
Results of intra-sentential ZAR

Antecedent identification (accuracy):
- BM (Ng 02): 48.0% (478/995)
- BM_STR: 63.5% (632/995)
- SCM (Iida 05): 65.1% (648/995)
- SCM_STR: 70.5% (701/995)

The performance of antecedent identification improved by using the syntactic pattern features.
Slide 33
Results of intra-sentential ZAR
Antecedent identification + anaphoricity determination
Slide 34
Impact on overall ZAR
- Evaluate the overall performance on both intra-sentential and inter-sentential ZAR
- Baseline model: SCM resolving intra-sentential and inter-sentential zero-anaphora simultaneously, with no syntactic pattern features
Slide 35
Results of overall ZAR
Slide 36
AUC curve
- AUC (Area Under the recall-precision Curve), plotted by altering θ_intra
- The curve is not peaky, so optimizing the parameter θ_intra is not difficult
Slide 37
Conclusion
- We have addressed the issue of how to use syntactic patterns for zero-anaphora resolution:
  - how to encode syntactic pattern features
  - how to seek useful sub-trees
- Incorporating syntactic pattern features into our selection-then-classification model improves the accuracy of intra-sentential zero-anaphora resolution, which in turn improves the overall performance of zero-anaphora resolution.
Slide 38
Future work
- How to find zero-pronouns? Designing a broader framework that interacts with the analysis of predicate-argument structure
- How to find a globally optimal solution to the set of zero-anaphora resolution problems in a given discourse? Exploring methods such as those discussed by McCallum and Wellner (03)