
Page 1:

Meaning Representation in Natural Language Tasks

Gabriel Stanovsky

Page 2:

My Research

Develop text-processing models which exhibit facets of human intelligence, with benefits for users in real-life applications

Page 3:

Grand Challenges in Natural Language Processing (NLP)

● Automated assistants: "I got one of those terrible headaches from lack of sleep. Can you give me something for it?" (Moon)

● Machine translation: "the universal translator, invented in 2151, is used for deciphering unknown languages" (Star Trek)

● Information retrieval: "What's the second largest star in this galaxy?" (Star Wars)

Page 4:

Grand Challenges in Natural Language Processing (NLP)

NLP models need to capture the meaning behind our words and interact accordingly

Pages 5-9:

Meaning

● The information conveyed in a natural language utterance

● Abstracts over a myriad of possible surface forms

Cows mainly eat grass, and can enjoy up to 75 pounds of it a day

Grass is the major ingredient in bovine nutrition, reaching a maximum of 75 pounds consumed daily

Pages 10-11:

Outline: Research Questions

● Do NLP models capture meaning?
ACL 2019 🎉 Nominated for Best Paper · MRQA 2019 🎉 Best Paper award · EMNLP 2018
○ Models miss crucial meaning aspects: gender bias in machine translation

● Can we integrate meaning into NLP?
ACL 2015, EACL 2017, SemEval 2017, NAACL 2017, SemEval 2019
○ Real-world application: adverse drug reactions on social media

● How can we build parsers for meaning?
EMNLP 2016a, EMNLP 2016b, ACL 2016a, ACL 2016b, ACL 2017, NAACL 2018, EMNLP 2018a, EMNLP 2018b, CoNLL 2019 🎉 Honorable mention
○ Data collection: QA is an intuitive annotation format
○ Model design: robust performance across domains

Pages 12-13:

Background: How should we represent text?

Explicitly! We should define a formal representation of meaning

Pages 14-15:

Explicit Representations

● Many formalisms developed in the linguistic literature
○ Dictating how meaning should be represented

[Figure: many surface forms ("Cow", "Bovine", ...) map to a single representation]

[1] Banarescu et al., 2013 [2] Oepen et al., 2014 [3] Abend and Rappoport, 2017

Page 16:

Explicit Representations - Propositions

● Statements with one predicate (event) and an arbitrary number of arguments
○ Bob called Mary → called:(Bob, Mary)

○ Bob gave a note to Mary → gave:(Bob, a note, Mary)

● Minimal units of information

● Form the backbone of explicit meaning representations
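To make the notation concrete, here is a minimal sketch (not from the talk) of a proposition as a predicate with an ordered argument list; the class name is hypothetical:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Proposition:
    """A minimal unit of information: one predicate (event) and its arguments."""
    predicate: str
    arguments: List[str]

    def __str__(self) -> str:
        return f"{self.predicate}:({', '.join(self.arguments)})"

# The two slide examples in this form:
print(Proposition("called", ["Bob", "Mary"]))          # called:(Bob, Mary)
print(Proposition("gave", ["Bob", "a note", "Mary"]))  # gave:(Bob, a note, Mary)
```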

Pages 17-18:

Explicit Representations

● Pros
○ Interpretable models
○ Independent progress on meaning representation

● Cons
○ Requires expensive expert annotations
○ Arbitrary - unclear that one representation is necessarily “correct”

Page 19:

Background: How should we represent text?

Implicitly! Models should learn a latent useful representation for an end-task

Pages 20-22:

Implicit Representations

● Models find correlations between word representations and the task label

● Useful text representations are found implicitly during the training process
○ Monolithic models with ~100M parameters trained over ~1B words

[1] Peters et al., 2018 [2] Devlin et al., 2019

Pages 23-24:

Implicit Representations

● Cons
○ Opaque models
○ No control over the patterns they find useful in the data

● Pros
○ No need to commit to an explicit representation
○ Impressive gains on many NLP datasets
○ Revolutionized the field

Pages 25-26:

Natural Language Processing in 2019

[Spectrum: implicit representation ←→ explicit representation]

Do implicit NLP models capture meaning?
ACL 2019 🎉 Nominated for Best Paper · MRQA 2019 🎉 Best Paper award · EMNLP 2018

Pages 27-28:

Many Facets to Text Understanding

● Factuality [1]: identify if an event happened
"John forgot that he locked the door"

● Restrictiveness [2]: detect if modifiers are required or elaborating
"The boy who stopped the flood." / "Barack Obama, the former U.S. president."

● Word sense disambiguation [3]: distinguishing bat from bat

● Coreference resolution: implications on gender bias in machine translation

[1] Stanovsky et al., 2017 [2] Stanovsky et al., 2016 [3] Stanovsky and Hopkins, 2018

Pages 29-33:

Case study: Coreference in machine translation (ACL 2019 🎉 Nominated for Best Paper)

The doctor asked the nurse to help her in the procedure.

○ ask for help: (the doctor, the nurse, in the procedure)
○ is female: (the doctor)

La doctora le pidió a la enfermera que la ayudara con el procedimiento.

● Can models capture the meaning conveyed through coreference?

Pages 34-36:

Case study: Coreference in machine translation

● Growing concern that models use bias to bypass meaning interpretation

● E.g., translating all doctors as men regardless of context

● Will work well for most cases seen during training

Pages 37-39:

Is machine translation gender biased?

Pages 40-42:

Evaluating Coreference Translation: Challenges

● Gender bias in machine translation was noticed anecdotally
○ Open question: how to quantitatively measure gender translation?

● Requires reference translations in various languages and models
○ To reach more general conclusions

● Gender can be unspecified
○ The doctor had very good news

Pages 43-45:

Evaluating Coreference in Machine Translation

Challenge: How to evaluate gender translation across different models & languages?

● Input:
○ Machine translation model: M
○ Target language with grammatical gender: L

● Output:
○ Accuracy score ∈ [0, 100]

How well does M translate gender information from English to L?

Pages 46-48:

English Source Texts

● Winogender [1] & WinoBias [2] - bias in coreference resolution

● Equally split between stereotypical and non-stereotypical role assignments
○ Based on U.S. labor statistics

● Gender-role assignments are specified (+90% human agreement)

The doctor asked the nurse to help her in the procedure.
The doctor asked the nurse to help him in the procedure.

[1] Rudinger et al., 2018 [2] Zhao et al., 2018

Pages 49-55:

Methodology: Automatic evaluation of gender accuracy

Input: MT model + target language → Output: gender accuracy

1. Translate the coreference bias datasets
2. Align between source and target
3. Identify gender in target language

The doctor asked the nurse to help her in the procedure.
→ La doctora le pidió a la enfermera que le ayudara con el procedimiento.
→ El doctor le pidió a la enfermera que le ayudara con el procedimiento.

Quality estimated at > 90%
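A rough sketch of this three-step evaluation as code (the helper functions are hypothetical and passed in as parameters; the authors' released implementation differs in its details):

```python
from typing import Callable, List, Tuple

# One challenge-set instance: the English sentence, the entity whose gender is
# tested ("the doctor"), and the gold gender implied by the pronoun.
Instance = Tuple[str, str, str]  # (sentence, entity, gold_gender)

def gender_accuracy(instances: List[Instance],
                    translate: Callable[[str], str],               # step 1: the MT model M
                    align_entity: Callable[[str, str, str], str],  # step 2: source-target alignment
                    detect_gender: Callable[[str], str]) -> float: # step 3: morphological gender in L
    """Percentage of instances whose entity keeps its gold gender after translation."""
    correct = 0
    for sentence, entity, gold_gender in instances:
        target_sentence = translate(sentence)
        translated_entity = align_entity(sentence, target_sentence, entity)
        if detect_gender(translated_entity) == gold_gender:
            correct += 1
    return 100.0 * correct / len(instances)
```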

Pages 56-58:

Results - Google Translate

The doctor asked the nurse to help him in the procedure.
The doctor asked the nurse to help her in the procedure.

[Charts: accuracy (%) per target language for the two sentence variants, with human performance and random baselines; the gap between the variants is labeled "Gender bias"]

Pages 59-60:

Results

● Translation models struggle with non-stereotypical roles
[Charts for Google Translate, Microsoft Translator, Amazon Translate, and Systran]

Our metric can evaluate future progress on gender bias in machine translation

Pages 61-63:

Do NLP models capture meaning? (ACL 2019 🎉 Nominated for Best Paper · MRQA 2019 🎉 Best Paper award · EMNLP 2018)

● NLP models do not capture important facets of meaning

● Instead, they find spurious patterns in the data
○ Leading to the biased performance we’ve seen
○ Biased performance in question answering, inference, and more

Pages 64-66:

Open Questions

● Do models fail at capturing meaning because of architecture or data?

● Is there a dataset that could force models to learn meaningful patterns?
○ E.g., equally distributed between genders

● Current data augmentation efforts find models are stubbornly biased [1,2,3]

[1] Wang et al., 2019 [2] Gonen & Goldberg, 2019 [3] Elazar & Goldberg, 2018

Page 67:

Meaning Representation in Neural Networks

[Spectrum: implicit ←→ explicit]

Best of both worlds: models over meaningful explicit representations leveraging strong implicit architectures

Page 68:

Research Questions

● Can we integrate meaning into NLP?
ACL 2015, EACL 2017, SemEval 2017, NAACL 2017, SemEval 2019
○ Real-world application: adverse drug reactions on social media

● How can we build parsers for meaning?
EMNLP 2016a, EMNLP 2016b, ACL 2016a, ACL 2016b, ACL 2017, NAACL 2018, EMNLP 2018a, EMNLP 2018b, CoNLL 2019 🎉 Honorable mention
○ Data collection: QA is an intuitive annotation format
○ Model design: robust performance across domains

● Weaknesses in the state of the art
ACL 2019 🎉 Nominated for Best Paper · MRQA 2019 🎉 Best Paper award · EMNLP 2018

Pages 69-70:

Open Information Extraction (Open IE) (Banko et al., 2007)

● Extracts stand-alone propositions from text

○ Barack Obama, a former U.S. president, was born in Hawaii
(Barack Obama, was born in, Hawaii)
(a former U.S. president, was born in, Hawaii)
(Barack Obama, is, a former U.S. president)

○ Obama and Bush were born in America
(Obama, born in, America)
(Bush, born in, America)

Pages 71-77:

Open Information Extraction (Open IE)

Mr. Pratt, head of marketing, thinks that lower wine prices have come about because producers don’t like it when hit wines dramatically increase in price.

1. Mr. Pratt is the head of marketing
2. lower wine prices have come about
3. hit wines dramatically increase in price
4. producers don’t like (3)
5. (2) happens because of (4)
6. Mr. Pratt thinks that (5)

Pages 78-79:

Parsers for Meaning Representation

● Goal - build Open Information Extraction parsers from raw text

● Challenges
○ Obtaining data for the task: expensive and non-trivial manual annotation
○ Designing a parser: which works well for real-world texts

Pages 80-81:

Data Collection: Challenges

● Direct annotation requires linguistic expertise
○ Formal definitions for predicates and arguments

● Existing datasets annotated only hundreds of sentences
○ Conflicting guidelines between different works
○ Do not support training

Pages 82-83:

QA is an intuitive interface for data collection (EMNLP 2016)

[Pipeline: raw text → questions & answers → meaning representation]

● QA pairs can be deterministically converted to Open IE propositions, based on the question template

Where was Obama born? Hawaii
Who was born in Hawaii? Obama

(Obama, was born in, Hawaii)
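A toy illustration of that deterministic conversion, using two hypothetical question templates (the paper's actual template inventory is richer):

```python
import re

# Hypothetical templates mapping a question pattern plus its answer to a
# (subject, relation, object) proposition.
TEMPLATES = [
    (re.compile(r"Where was (?P<subject>.+) born\?"),
     lambda m, answer: (m["subject"], "was born in", answer)),
    (re.compile(r"Who was born in (?P<place>.+)\?"),
     lambda m, answer: (answer, "was born in", m["place"])),
]

def qa_to_proposition(question: str, answer: str):
    """Deterministically convert a question-answer pair into an Open IE tuple."""
    for pattern, build in TEMPLATES:
        match = pattern.fullmatch(question)
        if match:
            return build(match, answer)
    return None

print(qa_to_proposition("Where was Obama born?", "Hawaii"))   # ('Obama', 'was born in', 'Hawaii')
print(qa_to_proposition("Who was born in Hawaii?", "Obama"))  # ('Obama', 'was born in', 'Hawaii')
```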

Pages 84-85:

Question-Answer Meaning Representation (NAACL 2018a)

“Mr. Pratt, head of marketing, thinks that lower wine prices have come about because producers don’t like it when hit wines dramatically increase in price.”
○ Who is the head of marketing? Mr. Pratt
○ What have come about? lower wine prices
○ What increased in price? hit wines
○ ...

“Pierre Vinken, 61 years old, will join the board as a nonexecutive director Nov. 29.”
○ Who will join the board? Pierre Vinken
○ What will he join the board as? Nonexecutive director
○ When will Vinken join the board? Nov. 29

An intuitive interface for non-expert annotation of meaning!

Page 86:

QA as an interface for data collection

● Yields the largest supervised dataset for Open Information Extraction
[Chart: size of our dataset compared to prior Open IE resources [1,2,3]]

Our dataset enables the development of the first supervised models for Open IE

[1] Banko et al., 2007 [2] Wu and Weld, 2010 [3] Fader et al., 2011

Page 87:

Open IE: Challenges

● Obtaining data for the task
○ Expensive and non-trivial manual annotation

● Building an Open IE parser
○ Which works well for real-world texts

Page 88:

Supervised Open IE Parser (NAACL 2018b)

● Approach: word-level tagging task (Beginning, Inside, Outside)

John jumped and Mary ran ↔ one tag sequence per extraction:
(John; jumped): John/Argument-1, jumped/Predicate, and/Outside, Mary/Outside, ran/Outside
(Mary; ran): John/Outside, jumped/Outside, and/Outside, Mary/Argument-1, ran/Predicate
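A small sketch of how one such tag sequence can be decoded back into an extraction (illustrative only; it ignores the Beginning/Inside distinction for multi-word spans):

```python
from typing import Dict, List, Tuple

def decode_extraction(words: List[str], tags: List[str]) -> Tuple[str, List[str]]:
    """Group words that share a non-Outside tag into predicate and argument spans."""
    spans: Dict[str, List[str]] = {}
    for word, tag in zip(words, tags):
        if tag != "Outside":
            spans.setdefault(tag, []).append(word)
    predicate = " ".join(spans.pop("Predicate", []))
    arguments = [" ".join(span) for span in spans.values()]
    return predicate, arguments

words = ["John", "jumped", "and", "Mary", "ran"]
print(decode_extraction(words, ["Argument-1", "Predicate", "Outside", "Outside", "Outside"]))
# ('jumped', ['John'])
print(decode_extraction(words, ["Outside", "Outside", "Outside", "Argument-1", "Predicate"]))
# ('ran', ['Mary'])
```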

Pages 89-95:

Supervised Open IE Parser (NAACL 2018b)

John jumped and Mary ran

● Predicate identification: finding verbs in the sentence

● Tagging architecture, per word: contextualized representation → forward & backward LSTM → softmax over tags (Argument1, Predicate, Outside, ...)
○ Predicate features concatenated to all words
○ Example: John/Argument1, jumped/Predicate, and/Outside, Mary/Outside, ran/Outside → (John; jumped)

● Confidence(John; jumped) = Π(word confidence)

Page 96:

Evaluation - Open IE

● On the QA data, the confidence threshold trades precision for recall:
○ High confidence threshold → accurate propositions, relatively few of them
○ Low confidence threshold → more propositions, relatively less accurate
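A minimal sketch (assumed, not taken from the paper) of the confidence score from the previous slide and of thresholding it to trade precision for recall:

```python
from math import prod
from typing import List, Tuple

def extraction_confidence(word_probabilities: List[float]) -> float:
    """Confidence of an extraction = product of the per-word tag probabilities."""
    return prod(word_probabilities)

def filter_extractions(candidates: List[Tuple[str, List[float]]],
                       threshold: float) -> List[str]:
    """Keep extractions whose confidence clears the threshold: a high threshold
    yields fewer but more accurate propositions, a low threshold yields more."""
    return [text for text, probs in candidates
            if extraction_confidence(probs) >= threshold]

candidates = [("(John; jumped)", [0.95, 0.97, 0.99, 0.98, 0.99]),
              ("(Mary; jumped)", [0.55, 0.60, 0.90, 0.50, 0.70])]
print(filter_extractions(candidates, threshold=0.8))  # ['(John; jumped)']
```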

Pages 97-99:

Evaluation - Open IE

● Our approach presents a favorable precision-recall tradeoff on our QA data

● We generalize well to datasets unseen during training

● Our method scores 4 points over the state of the art
[Precision-recall curves on the QA data and on other datasets]

Page 100:

Supervised Parser - Adaptation

● Integrated into the popular AllenNLP framework
○ Online demo (demo.allennlp.org) receives thousands of requests per month
○ Demo example: "Albert Einstein published the theory of relativity in 1915"

● Used by researchers in academia and tech (e.g., plasticity.ai, Diffbot)
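For reference, a hedged sketch of calling the AllenNLP Open IE predictor on the demo sentence; the package versions and the model archive URL are assumptions and may be outdated, so check demo.allennlp.org or the AllenNLP model listings for the current ones:

```python
# pip install allennlp allennlp-models  (APIs shown here are from the 1.x era)
from allennlp.predictors.predictor import Predictor

# Assumed archive name; replace with the current Open IE model if it has moved.
MODEL_URL = "https://storage.googleapis.com/allennlp-public-models/openie-model.2020.03.26.tar.gz"

predictor = Predictor.from_path(MODEL_URL)
output = predictor.predict_json(
    {"sentence": "Albert Einstein published the theory of relativity in 1915"}
)
print(output)  # one BIO-tagged extraction per detected predicate
```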

Page 101:

Research Questions

● Building meaning representations
EMNLP 2016a, EMNLP 2016b, ACL 2016a, ACL 2016b, ACL 2017, NAACL 2018, EMNLP 2018a, EMNLP 2018b, CoNLL 2019 🎉 Honorable mention

● Weaknesses in the state of the art
ACL 2019 🎉 Nominated for Best Paper · MRQA 2019 🎉 Best Paper award · EMNLP 2018

● Can we integrate meaning into NLP?
ACL 2015, EACL 2017, SemEval 2017, NAACL 2017, SemEval 2019
○ Real-world application: adverse drug reactions on social media

Pages 102-104:

Adverse Drug Reaction on Social Media (EACL 2017)

"I stopped taking Ambien after a week, it gave me a terrible headache!"

● Discover unknown side-effects
● Monitor drug reaction over time
● Respond to patient complaints

Page 105:

Challenges

● Context dependent
○ Ambien gave me terrible headaches
○ Ambien made my terrible headaches go away

● Colloquial
○ been having a hard time getting some Z’s

Page 106:

Approach

● Recurrent neural network over propositions extracted from text
[implicit ←→ explicit: combines both]

● Beneficial given the small amount of data
○ Train: 5,723 instances
○ Test: 1,874 instances
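A toy sketch of the idea in PyTorch (hypothetical hyper-parameters, not the EACL 2017 model): the tokens of the extracted propositions are encoded with a bidirectional LSTM and classified as reporting an adverse reaction or not.

```python
import torch
import torch.nn as nn

class PropositionADRClassifier(nn.Module):
    """Encode the tokens of extracted propositions and predict ADR vs. no ADR."""
    def __init__(self, vocab_size: int, embed_dim: int = 50, hidden_dim: int = 64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True, bidirectional=True)
        self.out = nn.Linear(2 * hidden_dim, 2)

    def forward(self, token_ids: torch.Tensor) -> torch.Tensor:
        # token_ids: (batch, seq_len) indices of the flattened proposition tokens,
        # e.g. (Ambien, gave, me a terrible headache) mapped to vocabulary ids
        embedded = self.embed(token_ids)
        _, (hidden, _) = self.lstm(embedded)
        sentence_vector = torch.cat([hidden[0], hidden[1]], dim=-1)
        return self.out(sentence_vector)  # logits over {adverse reaction, none}

model = PropositionADRClassifier(vocab_size=10_000)
logits = model(torch.randint(0, 10_000, (1, 6)))
print(logits.shape)  # torch.Size([1, 2])
```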

Pages 107-109:

Model

[Architecture diagram: Open IE propositions are extracted from the post and fed to the model, which labels each span as "Adverse reaction" or "Outside"]

Pages 110-112:

Results and Analysis

● The mention oracle marks all gold mentions, regardless of context
○ Errs on 45% of instances → context matters!

● Recurrent neural networks achieve ~80% F1

● Meaning representation provides 8% absolute improvement [implicit ←→ explicit]

Pages 113-115:

Conclusions

● Implicit representations lead to biased performance

● Middle ground: implicit models with explicit meaning representation [implicit ←→ explicit]

● Useful in real-world applications

Page 116:

Conclusion: My contributions

● Do NLP models capture meaning?
ACL 2019 🎉 Nominated for Best Paper · MRQA 2019 🎉 Best Paper award · EMNLP 2018

● How can we build parsers for meaning?
EMNLP 2016a, EMNLP 2016b, ACL 2016a, ACL 2016b, ACL 2017, NAACL 2018, EMNLP 2018a, EMNLP 2018b, CoNLL 2019 🎉 Honorable mention

● Can we integrate meaning into NLP?
ACL 2015, EACL 2017, SemEval 2017, NAACL 2017, SemEval 2019

Topics: QA reasoning · first German Open IE · paraphrase datasets · document representation · Open IE dataset · Open IE model · machine translation · QA evaluation · word polysemy · QA active learning · factuality detection · reading comprehension · math QA · adverse drug reactions

Page 117:

Future Work

Pages 118-120:

Future Work: Interactive Semantics

● Current NLP settings assume single-input, single-output

● Human interaction is often iterative

● Interactivity allows models and humans to iterate to reach a solution
○ Will benefit from an explicit meaning representation

Pages 121-123:

Future Work: Multilingual Meaning Bank

● Meaning representations are constructed almost exclusively in English
○ Linguistic theory needs to be adapted
○ Expert annotation is expensive

● A multilingual representation will facilitate:
○ Semantically coherent machine translation
○ NLP applications in low-resource languages

● QA is appealing for multilingual representation
○ Intuitive for non-expert annotation
○ Hebrew as an intuitive first language

Pages 124-125:

Future Work: NLP to inform decision making (In submission, 2019 · 🎉 Featured in [logo])

● NLP technology is ripe to extract large-scale aggregates
[Charts: number of CS authors and number of MEDLINE authors (in millions), split by male/female]

● Can aid in debates on other issues such as immigration or gun violence
○ Extract gun assault trends, and how weapons were obtained, from news articles

Pages 126-127:

Thanks for listening!

BSc, MSc - BGU, 2012 → PhD - BIU, 2018 → Post-Doc - AI2 & UW, 2020