TRANSCRIPT

Computational Linguistics and Talking Robots
ROLAND HAUSSER
Computational Linguistics, Universität Erlangen-Nürnberg
Germany
2011

Table of Contents

1. Introduction: How to Build a Talking Robot
   1.1 Universals
   1.2 Declarative Specification
   1.3 Comparison with Other Systems

Part I. Five Mysteries of Natural Language Communication

2. Mystery Number One: Using Unanalyzed External Surfaces
   2.1 Structure of Words
   2.2 Modality-Dependent Unanalyzed External Surfaces
   2.3 Modality Conversion in the Speak and Hear Modes
   2.4 Automatic Word Form Recognition
   2.5 Backbone of the Communication Cycle

3. Mystery Number Two: Natural Language Communication Cycle
   3.1 Choosing the Data Structure
   3.2 Representing Content
   3.3 Hear, Think, and Speak Modes
   3.4 Algorithm of LA-Grammar
   3.5 Relating Kinds of Proplets to Traditional Parts of Speech
   3.6 Linguistic Relativism vs. Universal Grammar

4. Mystery Number Three: Memory Structure
   4.1 Database Schema of a Word Bank
   4.2 Retrieving Answers to Questions
   4.3 Reference as a Purely Cognitive Procedure
   4.4 Coreference-by-Address
   4.5 Component Structure and Functional Flow
   4.6 Embedding the Cycle of Communication into the Agent

5. Mystery Number Four: Autonomous Control
   5.1 Pinball Machine Model of Cognition
   5.2 DBS Inferences for Maintaining Balance
   5.3 DBS Inferences for Meaning and Event Relations
   5.4 Subactivation and Intersection
   5.5 Analogical Models for Problem Solving
   5.6 Subjunctive Transfer and Managing the Data Stream

6. Mystery Number Five: Learning
   6.1 Fixed Behavior Agents
   6.2 Guided Patterns to Expand a Fixed Behavior Repertoire
   6.3 Transition from Fixed to Adaptive Behavior
   6.4 Upscaling from Coordination to Functor-Argument
   6.5 Schema Derivation and Hierarchy Inferencing
   6.6 Natural vs. Artificial Language Learning

Part II. The Coding of Content

7. Compositional Semantics
   7.1 Forms of Graphical Representation
   7.2 Absorption and Precipitation of Function Words
   7.3 Deriving DBS Graphs from Sets of Proplets
   7.4 Producing Natural Language Surfaces from Content
   7.5 Transparent vs. Opaque Functor-Argument
   7.6 Possible and Actual Semantic Relations of Structure

8. Simultaneous Amalgamation
   8.1 Intuitive Outline of LA-Content
   8.2 Formal Definition of LA-Content
   8.3 Linear Complexity of LA-Content
   8.4 Infinitive Content Constructions
   8.5 Selectional Constellations of Elementary Signatures
   8.6 Appear, Promise, and Persuade Class Infinitives

9. Graph Theory
   9.1 Content Analysis as Undirected and Directed Graphs
   9.2 Extrapropositional Coordination
   9.3 Sharing in Relative Clauses
   9.4 Unbounded Dependencies
   9.5 Subject Gapping and Verb Gapping
   9.6 Object Gapping and Noun Gapping

10. Computing Perspective in Dialogue
   10.1 Agent's STAR-0 Perspective on Current Content
   10.2 Speaker's STAR-1 Perspective on Stored Content
   10.3 Hearer's STAR-2 Perspective on Language Content
   10.4 Dialogue with a WH Question and Its Answer
   10.5 Dialogue with a Yes/No Question and Its Answer
   10.6 Dialogue with Request and Its Fulfillment

11. Computing Perspective in Text
   11.1 Coding the STAR-1 in Written Text
   11.2 Direct Speech in Statement Dialogue
   11.3 Indexical vs. Coreferential Uses of 3rd Pronouns
   11.4 Langacker-Ross Constraint for Sentential Arguments
   11.5 Coreference in Adnominal Sentential Modifiers
   11.6 Coreference in Adverbial Sentential Modifiers

Part III. Final Chapter

12. Conclusion
   12.1 Level of Abstraction
   12.2 Evolution
   12.3 Semantics
1. Introduction: How to Build a Talking Robot
1.1 Universals
1.1.1 Universals of natural language communication
1. The cycle of natural language communication is based on the hear, the think, and the speak modes of cognitive agents.
2. In communication, expressions of natural language are interpreted relative to an agent-internal context of use.
3. All natural languages have a time-linear structure, i.e., linear like time and in the direction of time.
4. All natural languages use the three kinds of sign, symbol, index, and name, each with its own mechanism of reference.
5. All natural languages use coordination and functor-argument to compose content at the elementary, the phrasal, and the clausal level.
6. All natural languages distinguish parts of speech, e.g., noun (object, argument), verb (relation, functor), and adjective (property, modifier).
7. All natural languages have the sentential moods declarative, interrogative, and imperative.
1.1.2 Requirements of a grounded artificial agent
In order to be grounded, a cognitive agent requires a body with
1. interfaces for recognition and action, based on
2. a data structure for representing content,
3. a database for storing and retrieving content,
4. an algorithm for reading content into and out of the database as well as for processing content, combined into
5. a software program which models the cycle of natural language communication as well as language and nonlanguage inferencing.
1.1.3 The Conversion Universals of DBS
1. From the agent's speak mode to its hear mode and back,
2. from a modality-free to a modality-dependent representation of the surface in the speak mode and back in the hear mode, in word form production (synthesis) and recognition,
3. from order-free content to ordered surfaces in the speak mode and back in the hear mode, and
4. from the STAR-0 to the STAR-1 perspective in the speak mode and from the STAR-1 to the STAR-2 perspective in the hear mode.
1.1.4 Internal structure of the DBS Conversion Universals
[Diagram: in the speak mode, a mod-free, order-free content with the STAR-0 perspective is converted into mod-dependent, ordered surfaces with the STAR-1 perspective; in the hear mode, the mod-dependent, ordered surfaces with the STAR-1 perspective are converted back into a mod-free, order-free content with the STAR-2 perspective.]
1.2 Declarative Specification
1.2.1 Advantages of proplets
1. Flat ordered feature structures are easier to read and computationally more efficient than recursive feature structures with unordered attributes.
2. Flat ordered feature structures provide for easy schema derivation and for easy pattern matching.
3. The combination of a proplet's core and prn value provides a natural primary key for storage in and retrieval from memory.
4. Coding the semantic relations between proplets as addresses makes proplets order-free and therefore amenable to the needs of one's database.
5. The semantic relations between proplets enable time-linear navigation along those relations, reintroducing order and serving as the selective activation of content, as needed in language production and inferencing.
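To make these advantages concrete, here is a minimal sketch of a proplet in Python. It is not from the book; the class and method names are assumptions, while the attribute names (sur, cat, sem, fnc, mdr, prn) follow the proplet examples of the following sections.

```python
from dataclasses import dataclass, field

@dataclass
class Proplet:
    """A flat (non-recursive) feature structure with ordered attributes."""
    sur: str                  # language-dependent surface; empty in content proplets
    core: tuple               # core attribute and value, e.g. ("noun", "dog")
    cat: str = ""             # category, e.g. "sn"
    sem: str = ""             # semantic features, e.g. "sg"
    continuations: dict = field(default_factory=dict)  # e.g. {"fnc": "bark", "mdr": "old"}
    prn: int = 0              # proposition number

    def primary_key(self):
        """Core value plus prn value (advantage 3 above)."""
        return (self.core[1], self.prn)

hund = Proplet(sur="Hund", core=("noun", "dog"), cat="sn", sem="sg",
               continuations={"fnc": "bark", "mdr": "old"}, prn=23)
print(hund.primary_key())     # ('dog', 23)
```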
1.2.2 Comparison of language, content, and pattern proplets
German proplet:   [sur: Hund | noun: dog | cat: m-g | sem: sg | fnc: bark | mdr: old | prn: 23]
French proplet:   [sur: chien | noun: dog | cat: sn m | sem: sg | fnc: bark | mdr: old | prn: 24]
content proplet:  [sur: | noun: dog | cat: sn | sem: sg | fnc: bark | mdr: old | prn: 25]
pattern proplet:  [sur: | noun: α | cat: sn | sem: sg | fnc: | mdr: | prn: K]
1.3 Comparison with Other Systems
1.3.1 Summary of differences between DBS and other systems
1. Derivation Order:
The parsing algorithm of DBS, i.e., LA-grammar, uses a strictly time-linear derivation order to compute possible continuations. The derivations of Phrase Structure Grammars and Categorial Grammars, in contrast, are partially ordered and compute possible substitutions. As a consequence, the application of LA-grammar to natural language has been shown to be of linear complexity, while the complexity of the other grammar formalisms is either polynomial but empirically inadequate (context-free), or computationally intractable, ranging from context-sensitive (exponential) to recursively enumerable (undecidable).
2. Ontology:
In DBS, the model of natural language communication is located inside the speaker-hearer as a software machine attached to the agent's external and internal interfaces. In Truth-Conditional Semantics, in contrast, the speaker-hearer is part of a set-theoretic model, and nonprocedural metalanguage definitions are used to connect the language expressions directly to referents in the model. As a consequence, DBS is designed for a talking robot, while Truth-Conditional Semantics is not.
3. Elementary Meaning:
In DBS, the agent's basic recognition and action procedures are reused as the elementary meanings of language. In Truth-Conditional Semantics, in contrast, elementary meanings are defined in terms of their truth-conditions relative to a set-theoretic model. As a consequence, the meanings in DBS have concrete realizations in terms of software and hardware procedures, while those of Truth-Conditional Semantics do not.
4. Database:
In DBS, the content derived in the hear mode or by inferencing is stored in a content-addressable memory, called Word Bank. Most current applications, in contrast, use a coordinate-addressable database, for example an RDBMS, if they use a database at all. The crucial property of content-addressable memories is that they are good for content which is written once and never changed. Given that a cognitive agent is constantly changing, this seems to be a paradoxical quality. It turns out, however, that it is the no-rewrite property which allows for a simple, powerful definition of inferences in DBS.
5. Data Structure:
DBS uses flat (non-recursive) feature structures with ordered attributes. Current systems of Nativism, in contrast, use recursive feature structures with unordered attributes to model "constituent structure" trees. Flat feature structures with ordered attributes are of superior computational efficiency for a wide range of operations, such as pattern matching, which is ubiquitous in Database Semantics.
6. Intention:
DBS reconstructs the phenomenon of intention as part of an autonomous control designed to maintain the agent in a state of balance. This is in contrast to other schools of linguistics and philosophy, which refer eclectically to Grice whenever the need arises, but are oblivious to the fact that Grice's elementary, atomic, presupposed notion is of little use for the computational reconstruction of intention and, by Grice's own definition, of meaning in a cognitive agent.
7. Perspective:
The speak and the hear modes in the agent-oriented approach of DBS provide the foundation for modeling the perspectives of the speaker/writer and the hearer/reader in dialogue/text. They are (i) the perspective of an agent recording a current situation as a content, (ii) a speaker's perspective on a stored content, and (iii) the hearer's perspective on a content transmitted by natural language. In DBS, the computation of these perspectives is based on (i) suitable inferences and (ii) the values of the agent's STAR parameters, for Space, Time, Agent, and Recipient.
Part I. Five Mysteries of Natural Language Communication
2. Mystery Number One: Using Unanalyzed External Surfaces
2.1 Structure of Words
2.1.1 Informal examples showing basic word structure
[Diagram: basic word structure. The English surface water and the French surface eau are each connected by convention to the same meaning.]
2.1.2 Tasks of learning the words of a foreign language
• learning to recognize and produce the foreign surfaces in the modalities of spoken and written language, and
• learning the conventional connections between the foreign surfaces and meanings familiar from one's first language.
2.2 Modality-Dependent Unanalyzed External Surfaces
2.2.1 Production and recognition of a word
[Diagram: cognitive agent A in the speak mode realizes the surface water as an unanalyzed external surface in the external world; cognitive agent B in the hear mode matches the surface, copies it, and performs lexical lookup.]
2.2.2 The First Mechanism of Communication (MoC-1)
MoC-1: Natural language communication relies on modality-dependent external surfaces which are linguistically unanalyzed in that they have neither meaning nor any grammatical property.
2.2.3 Functional model of natural language communication
A functional model of natural language communication requires
1. a set of cognitive agents, each with (i) a body, (ii) external interfaces for recognition and action, and (iii) a memory for the storage and processing of content,
2. a set of external language surfaces which can be recognized and produced by these agents by means of their external interfaces using pattern matching,
3. a set of agent-internal (cognitive) surface-meaning pairs established by convention and stored in memory, whereby the internal surfaces correspond to the external ones, and
4. an agent-internal algorithm which constructs complex meanings from elementary ones by establishing semantic relations between them.
2.2.4 Forms of communication without natural language
• endocrinic messaging by means of hormones,
• exocrinic messaging by means of pheromones, for example in ants, and
• the use of samples, for example in bees communicating a source of pollen.
2.2.5 Advantages following from MoC-1
1. The modality-free internal meanings attached to the internal surfaces are not restricted by the modality of the external surfaces.
2. The modality-dependent external surfaces are much better suited for (i) content transfer and (ii) agent-external long-term storage than the associated agent-internal modality-free meanings.
2.3 Modality Conversion in the Speak and Hear Modes
2.3.1 Inter-agent communication using speech

[Diagram: agent A in the speak mode converts a modality-free internal coding via a surface template (type) into an unanalyzed external surface (token) in the auditory modality of the external world; agent B in the hear mode matches the incoming token against its own surface template (type) and converts it back into a modality-free internal coding.]
2.3.2 Two kinds of modality conversion
[Diagram: two kinds of modality conversion. An agent reading aloud takes input in the visual modality and, via the modality-free internal coding, produces output in the auditory modality; an agent taking dictation takes input in the auditory modality and, via the modality-free internal coding, produces output in the visual modality.]
2.4 Automatic Word Form Recognition
2.4.1 Matching an unanalyzed surface onto a key
unanalyzed word form surface:  learns
       | matching
morphosyntactic analysis:      learn/s [categorization, lemmatization]
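A minimal sketch of this lookup step in Python (the function name and lexicon entries are assumptions for illustration; the cat value follows the verb proplets in 3.5.5):

```python
# Automatic word form recognition: an unanalyzed surface is matched onto a
# lexicon key whose value provides lemmatization and categorization.
LEXICON = {
    "learns": {"segmentation": "learn/s", "lemma": "learn",
               "cat": "ns3' a' v", "sem": "ind pres"},
}

def recognize(surface: str) -> dict:
    """Map an unanalyzed word form surface onto its morphosyntactic analysis."""
    try:
        return LEXICON[surface]
    except KeyError:
        raise KeyError(f"no lexical entry matching surface {surface!r}") from None

print(recognize("learns"))
# {'segmentation': 'learn/s', 'lemma': 'learn', 'cat': "ns3' a' v", 'sem': 'ind pres'}
```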
2.5 Backbone of the Communication Cycle
2.5.1 Backbone of surface-based information transfer
[Diagram: backbone of surface-based information transfer. Speak mode: an analyzed (word form) surface is reduced by lexical analysis to an internal surface, which is realized, via pattern matching against a surface template, as an unanalyzed external surface (synthesis). Hear mode: the unanalyzed external surface is recognized via pattern matching against a surface template (lookup), yielding an internal surface which lexical analysis expands into an analyzed (word form) surface.]
2.5.2 Interface Equivalence Principle
For unrestricted human-machine communication, the artificial cognitive agent with language (DBS robot) must be equipped with the same interfaces to the external environment as the human prototype.
2.5.3 Input/Output Equivalence Principle
In natural language interpretation, the artificial agent must analyze the modality-dependent unanalyzed external surface by
(i) segmenting it in the same way into parts (i.e., word forms) and
(ii) ordering the parts in the same way (i.e., in a time-linear sequence)
as the human prototype; and accordingly for language production.
3. Mystery Number Two: Natural Language Communication Cycle
3.1 Choosing the Data Structure
3.1.1 Development of the proplet format
[Diagram: development of the proplet format, combining the surface-meaning analysis of 2.1.1 with the internal surface. At the surface level, the surface eau is connected by convention to the meaning level; lexical analysis expands the analyzed word form into a lexical proplet whose attributes code the surface, the meaning, the morpho-syntactic properties, and the syntactic-semantic connections to other proplets:]

[sur: eau | noun: water | cat: sn f | sem: mass | fnc: | mdr: | nc: | pc: | prn: ]
3.2 Representing Content
3.2.1 Functor-argument of Julia knows John.

[noun: Julia | fnc: know | prn: 625]  [verb: know | arg: Julia John | prn: 625]  [noun: John | fnc: know | prn: 625]

3.2.2 Turning 3.2.1 into a schema

[noun: α | fnc: β | prn: K]  [verb: β | arg: α γ | prn: K]  [noun: γ | fnc: β | prn: K]

3.2.3 Pattern matching between schema 3.2.2 and content 3.2.1

schema level:   [noun: α | fnc: β | prn: K]  [verb: β | arg: α γ | prn: K]  [noun: γ | fnc: β | prn: K]
                       | internal matching
content level:  [noun: Julia | fnc: know | prn: 625]  [verb: know | arg: Julia John | prn: 625]  [noun: John | fnc: know | prn: 625]
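The internal matching of 3.2.3 can be sketched in Python as follows (a simplified illustration, not the book's implementation; the variable notation ?a, ?b, ?g, ?K and the helper names are assumptions). The point is that variables bind consistently across all pattern proplets of the schema:

```python
def is_var(x):
    return isinstance(x, str) and x.startswith("?")

def unify(pat, val, bindings):
    """Bind a pattern value (constant, variable, or tuple) against a content value."""
    if is_var(pat):
        if pat in bindings:
            return bindings if bindings[pat] == val else None
        return {**bindings, pat: val}
    if isinstance(pat, tuple):                        # e.g. the arg slot ('?a', '?g')
        if not isinstance(val, tuple) or len(pat) != len(val):
            return None
        for p, v in zip(pat, val):
            bindings = unify(p, v, bindings)
            if bindings is None:
                return None
        return bindings
    return bindings if pat == val else None

def match_proplet(pattern, proplet, bindings):
    for attr, pat in pattern.items():
        if attr not in proplet:
            return None
        bindings = unify(pat, proplet[attr], bindings)
        if bindings is None:
            return None
    return bindings

def match(schema, content, bindings=None):
    """Match all pattern proplets of a schema against a set of content proplets."""
    bindings = {} if bindings is None else bindings
    if not schema:
        return bindings
    for proplet in content:                           # order-free: try every proplet
        b = match_proplet(schema[0], proplet, bindings)
        if b is not None:
            result = match(schema[1:], content, b)
            if result is not None:
                return result
    return None

schema = [{"noun": "?a", "fnc": "?b", "prn": "?K"},
          {"verb": "?b", "arg": ("?a", "?g"), "prn": "?K"},
          {"noun": "?g", "fnc": "?b", "prn": "?K"}]
content = [{"noun": "Julia", "fnc": "know", "prn": 625},
           {"verb": "know", "arg": ("Julia", "John"), "prn": 625},
           {"noun": "John", "fnc": "know", "prn": 625}]
print(match(schema, content))   # {'?a': 'Julia', '?b': 'know', '?K': 625, '?g': 'John'}
```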
3.2.4 Maintaining semantic relations regardless of order

The three proplets of 3.2.1 may be stored in any of their six possible orders (abc, acb, bac, bca, cab, cba), for example

[noun: Julia | fnc: know | prn: 625]  [verb: know | arg: Julia John | prn: 625]  [noun: John | fnc: know | prn: 625]
[verb: know | arg: Julia John | prn: 625]  [noun: Julia | fnc: know | prn: 625]  [noun: John | fnc: know | prn: 625]

Because the semantic relations are coded proplet-internally by the fnc, arg, and prn values, they are maintained regardless of the order in which the proplets are stored.
3.2.5 Coordination structure of Julia sang. Sue slept. John read.

[noun: Julia | fnc: sing | prn: 10]  [verb: sing | arg: Julia | nc: (sleep 11) | pc: | prn: 10]
[noun: Sue | fnc: sleep | prn: 11]  [verb: sleep | arg: Sue | nc: (read 12) | pc: (sing 10) | prn: 11]
[noun: John | fnc: read | prn: 12]  [verb: read | arg: John | nc: | pc: (sleep 11) | prn: 12]

3.2.6 Turning 3.2.5 into a schema

[noun: α | fnc: β | prn: K]  [verb: β | arg: α | nc: (δ K+1) | pc: | prn: K]
[noun: γ | fnc: δ | prn: K+1]  [verb: δ | arg: γ | nc: (ψ K+2) | pc: (β K) | prn: K+1]
[noun: φ | fnc: ψ | prn: K+2]  [verb: ψ | arg: φ | nc: | pc: (δ K+1) | prn: K+2]
3.2.7 Functions based on the order-free nature of proplets
1. Hear mode: storage of proplets in the content-addressable database of a Word Bank.
2. Think mode: selective activation of proplets stored in the Word Bank by means of a navigation along the semantic relations between them, reintroducing a time-linear order.
3. Speak mode: production of natural language as a time-linear sequence of surfaces based on the selective activation of a navigation.
4. Query answering: retrieval of content corresponding to a schema.
3.3 Hear, Think, and Speak Modes
3.3.1 DBS hear mode derivation of Julia knows John.

[Derivation diagram: lexical lookup provides the lexical proplets [noun: Julia | cat: nm | fnc: | prn: ], [verb: know | cat: s3′ a′ v | arg: | prn: ], [noun: John | cat: nm | fnc: | prn: ], and the punctuation proplet [verb: . | cat: v′ decl | arg: | prn: ]. The time-linear syntactic-semantic parsing applies 1 Nom+FV, cross-copying Julia and know and canceling the s3′ valency; 2 FV+Nom, cross-copying know and John and canceling the a′ valency; and 3 S+IP, absorbing the period and replacing the verb's remaining cat value v with decl. The result of the syntactic-semantic parsing is the content 3.2.1, all proplets receiving the prn value 625.]
3.3.2 DBS think mode navigation

[Diagram: starting from [verb: know | cat: decl | arg: Julia John | prn: 625], the navigation steps 1-4 traverse the semantic relations to [noun: Julia | cat: nm | fnc: know | prn: 625] and back, then to [noun: John | cat: nm | fnc: know | prn: 625] and back.]

3.3.3 DBS speak mode realization

[Diagram: the same navigation, with each traversal step realizing a surface: 1 Julia, 2 knows, 3 John, 4 the period, producing Julia knows John.]
3.4 Algorithm of LA-Grammar
3.4.1 LA-hear rule application
i. rule name: Nom+FV    ii. rule package: {FV+Nom}

rule level (iii. ss-pattern, iv. nw-pattern, v. resulting ss′-pattern):
[noun: α | cat: NP | fnc: ]  [verb: β | cat: NP′ X VT | arg: ]   ⇒   [noun: α | cat: NP | fnc: β]  [verb: β | cat: X VT | arg: α]

matching and binding of variables ⇑   output ⇓

language level:
[noun: Julia | cat: nm | fnc: | prn: 1]  [verb: know | cat: s3′ a′ v | arg: | prn: ]   ⇒   [noun: Julia | cat: nm | fnc: know | prn: 1]  [verb: know | cat: a′ v | arg: Julia | prn: 1]
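In Python, the application of an LA-hear rule such as Nom+FV might be sketched as follows (a simplified illustration under assumed names; agreement between the NP value nm and the canceled valency position s3′ is not checked):

```python
def nom_fv(ss, nw):
    """LA-hear rule Nom+FV: cross-copy values and cancel the nominative valency."""
    valencies = nw["cat"].split()               # e.g. ["s3'", "a'", "v"]
    if not valencies or not valencies[0].endswith("'"):
        return None                             # no valency position to cancel
    # NOTE: agreement between ss cat (NP) and valencies[0] (NP') is omitted here.
    ss_out = dict(ss, fnc=nw["verb"])           # noun receives its functor
    nw_out = dict(nw, cat=" ".join(valencies[1:]),
                  arg=[ss["noun"]],             # verb receives its first argument
                  prn=ss["prn"])
    return [ss_out, nw_out]                     # continue with rule package {FV+Nom}

julia = {"noun": "Julia", "cat": "nm", "fnc": "", "prn": 1}
know = {"verb": "know", "cat": "s3' a' v", "arg": [], "prn": None}
print(nom_fv(julia, know))
```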
3.4.2 LA-think rule application
i. rule name: VNs    ii. rule package: {NVs}

rule level (iii. current proplet, iv. next proplet):
[verb: β | arg: X α Y | prn: K]   ⇒   [noun: α | fnc: β | prn: K]

matching and binding of variables ⇑ ⇓

Word Bank level:
[verb: know | cat: decl | arg: Julia John | prn: 625]   ⇒   [noun: Julia | cat: nm | fnc: know | prn: 625]
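A minimal sketch of this navigation step (assumed names; the Word Bank is taken to be a mapping from core values to token lines, as in the 4.1.1 sketch below). The arg value plus the shared prn value serve as the retrieval address:

```python
def vns(verb_proplet, token_lines, which=0):
    """LA-think rule VNs: navigate from a verb to one of its argument nouns."""
    alpha = verb_proplet["arg"][which]               # core value, e.g. "Julia"
    for proplet in token_lines.get(alpha, []):       # token line owned by alpha
        if (proplet.get("prn") == verb_proplet["prn"]
                and proplet.get("fnc") == verb_proplet["verb"]):
            return proplet                           # continue with package {NVs}
    return None

token_lines = {"Julia": [{"noun": "Julia", "cat": "nm", "fnc": "know", "prn": 625}]}
know = {"verb": "know", "cat": "decl", "arg": ["Julia", "John"], "prn": 625}
print(vns(know, token_lines))                        # the Julia proplet
```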
3.4.3 LA-speak rule application
i. rule name: NVs    ii. rule package: {VNs}

rule level (iii. current proplet, iv. next proplet, output):
[noun: α | cat: γ | fnc: β | prn: K]   ⇒   [verb: β | sem: δ | arg: α Y | prn: K],   output: lex(β γ δ)

matching and binding of variables ⇑

Word Bank level:
[noun: Julia | cat: nm | fnc: know | prn: 625]   ⇒   [verb: know | sem: pres | arg: Julia John | prn: 625],   output: know+s
3.4.4 The Second Mechanism of Communication (MoC-2)

The external time-linear surface order is used for coding grammatical relations proplet-internally (hear mode), while the grammatical relations coded proplet-internally are used for coding a time-linear surface order externally (speak mode).
3.5 Relating Kinds of Proplets to Traditional Parts of Speech
3.5.1 Traditional parts of speech
1. verb: Includes finite forms like sang and non-finite forms like singing or sung of main verbs, as well as auxiliaries like was or had and modals like could and should. Some traditional grammars treat non-finite verb forms as a separate class called participle.
2. noun: Includes common nouns like table and proper names like Julia. Also, count nouns like book and mass nouns like wine are distinguished.
3. adjective: Includes determiners like a(n), the, some, all, and my as well as adnominals like little, black, and beautiful. Some traditional grammars treat determiners as a separate class.
4. adverb: Includes adverbial modifiers like beautifully and intensifiers like very.
5. pronoun: Includes nouns with an indexical meaning component such as I, me, mine, you, yours, he, him, his, she, her, hers, etc.
6. preposition: Function words which combine with a noun into an adjective, such as on in [the book] on [the table].
7. conjunction: Includes coordinating conjunctions (parataxis) like and, and subordinating conjunctions (hypotaxis) like that (introducing a subject or object sentence) or when (introducing an adverbial sentence).
8. interjection: Includes exclamations like ouch!, greetings like hi!, and answers like yes.
3.5.2 Analyzing different kinds of nouns as lexical proplets

common noun:  [sur: books | noun: book | cat: pn | sem: count pl | fnc: | mdr: | prn: ]
pronoun:      [sur: they | noun: ça | cat: pnp | sem: count pl | fnc: | mdr: | prn: ]
proper name:  [sur: Julia | noun: Julia | cat: nm | sem: sg | fnc: | mdr: | prn: ]
determiner:   [sur: every | noun: n_1 | cat: snp | sem: pl exh | fnc: | mdr: | prn: ]

3.5.3 Analyzing different adjectives as lexical proplets

adnominal:            [sur: beautiful | adj: beautiful | cat: adn | sem: psv | mdd: | prn: ]
adverbial:            [sur: beautifully | adj: beautiful | cat: adv | sem: psv | mdd: | prn: ]
indexical adjective:  [sur: here | adj: idx loc | cat: adnv | sem: | mdd: | prn: ]
preposition:          [sur: on | adj: on n_2 | cat: adnv | sem: | mdd: | prn: ]
3.5.4 Relation between the adnv, adn, and adv values in English

[Diagram: adjectives with the cat value adnv, e.g. on the table, here, fast, may be used both adnominally and adverbially; the cat value adn, e.g. beautiful, marks adnominal use only; the cat value adv, e.g. beautifully, marks adverbial use only.]
3.5.5 Analyzing different verb forms as lexical proplets

finite main verb:      [sur: knows | verb: know | cat: ns3′ a′ v | sem: ind pres | mdr: | arg: | prn: ]
finite auxiliary:      [sur: is | verb: v_1 | cat: ns3′ be′ v | sem: ind pres | mdr: | arg: | prn: ]
non-finite main verb:  [sur: knowing | verb: know | cat: a′ be | sem: prog | mdr: | arg: | prn: ]

3.5.6 Parts of speech and levels of complexity

       elementary         phrasal               clausal
noun   Julia, she         the beautiful girl    that Fido barked
verb   barked             could have barked     Fido barked.
adj    here, beautiful    in the garden         When Fido barked
3.6 Linguistic Relativism vs. Universal Grammar
3.6.1 Equivalent clauses with different constructions
English: I don't care.
German: Es ist mir egal. (It is me equal.)
Italian: Mi lascia indifferente. (Me leaves-it indifferent.)
4. Mystery Number Three: Memory Structure
4.1 Database Schema of a Word Bank
4.1.1 Storing the proplets of 3.2.1 in a Word Bank
member proplets ... now front:                                                                    owner proplets:
... [noun: John | cat: nm | fnc: ... | prn: 610] ... [noun: John | cat: nm | fnc: know | prn: 625]            [core: John]
... [noun: Julia | cat: nm | fnc: ... | prn: 605] ... [noun: Julia | cat: nm | fnc: know | prn: 625]          [core: Julia]
... [verb: know | cat: decl | arg: ... | prn: 608] ... [verb: know | cat: decl | arg: Julia John | prn: 625]  [core: know]
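A minimal Word Bank sketch in Python (class and method names are assumptions, not from the book): each owner core value heads a token line, new member proplets are appended at the now front, and retrieval uses the (core, prn) primary key of 1.2.1:

```python
from collections import defaultdict

class WordBank:
    """Content-addressable storage: one token line per owner core value."""
    def __init__(self):
        self.token_lines = defaultdict(list)     # core value -> member proplets

    @staticmethod
    def core_value(proplet):
        for attr in ("noun", "verb", "adj"):     # core attribute of the proplet
            if attr in proplet:
                return proplet[attr]
        raise ValueError("proplet has no core attribute")

    def store(self, proplet):
        """Append at the now front; stored proplets are never rewritten."""
        self.token_lines[self.core_value(proplet)].append(proplet)

    def retrieve(self, core, prn):
        """Retrieve by the natural primary key: core value plus prn value."""
        return [p for p in self.token_lines[core] if p.get("prn") == prn]

wb = WordBank()
wb.store({"noun": "Julia", "cat": "nm", "fnc": "know", "prn": 625})
wb.store({"verb": "know", "cat": "decl", "arg": "Julia John", "prn": 625})
print(wb.retrieve("Julia", 625))
```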
4.2 Retrieving Answers to Questions
4.2.1 Example of a token line
member proplets ... now front:  [noun: girl | fnc: walk | mdr: young | prn: 10]  [noun: girl | fnc: sleep | mdr: blond | prn: 12]  [noun: girl | fnc: eat | mdr: small | prn: 15]  [noun: girl | fnc: read | mdr: smart | prn: 19]    owner proplet: [core: girl]
4.2.2 Applying a query pattern
query pattern: [noun: girl | fnc: walk | mdr: σ | prn: K]

matching (?): the pattern is applied to the token line of girl, i.e., [noun: girl | fnc: walk | mdr: young | prn: 10]  [noun: girl | fnc: sleep | mdr: blond | prn: 12]  [noun: girl | fnc: eat | mdr: small | prn: 15]  [noun: girl | fnc: read | mdr: smart | prn: 19]  [core: girl]; it matches the first member proplet, binding σ to young and K to 10.
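Query answering can reuse the matcher of the 3.2.3 sketch together with the WordBank sketch of 4.1.1 (assumed names): the query pattern is applied to each member proplet in the token line owned by its core value.

```python
# Answering "Which girl walked?": apply the query pattern to the token line
# of girl; the bindings of the first match constitute the answer.
query = {"noun": "girl", "fnc": "walk", "mdr": "?s", "prn": "?K"}

def answer(wb, pattern):
    for proplet in wb.token_lines[pattern["noun"]]:
        bindings = match_proplet(pattern, proplet, {})   # from the 3.2.3 sketch
        if bindings is not None:
            return bindings
    return None

# e.g. answer(wb, query) -> {'?s': 'young', '?K': 10}  (the young girl walked)
```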
4.3 Reference as a Purely Cognitive Procedure
4.3.1 Model-theoretic reconstruction of reference
[Diagram: model-theoretic reconstruction of reference. A natural language expression refers directly to a set-theoretic model of the world; the reference relation is defined in a metalanguage.]
4.3.2 Interfaces and components of an agent with language
[Diagram: interfaces and components of an agent with language. Inside external reality, the cognitive agent comprises peripheral cognition (sign recognition and synthesis, context recognition and action) and central cognition, which contains a language component and a context component. The interaction between the two components constitutes pragmatics; the language component is the subject of the theory of grammar, the agent as a whole the subject of the theory of language.]
4.3.3 Reference as language-context pattern matching
[Diagram: reference as language-context pattern matching.

language level (types):  [sur: der Hund | noun: dog | fnc: bark | prn: 23]  [sur: bellte | verb: bark | arg: dog | prn: 23]
context level (tokens):  [sur: | noun: dog | fnc: bark | prn: c16]  [sur: | verb: bark | arg: dog | prn: c16]

The vertical matching between corresponding language and context proplets constitutes reference.]
4.4 Coreference-by-Address
4.4.1 Symbolic addresses resulting from copying
[Derivation step 1, Nom+FV: applied to the lexical proplets [noun: Julia | cat: nm | fnc: | prn: ] and [verb: know | cat: s3′ a′ v | arg: | prn: ], the rule copies the value know into the fnc slot of Julia and the value Julia into the arg slot of know. Result: [noun: Julia | cat: nm | fnc: know | prn: 625]  [verb: know | cat: a′ v | arg: Julia | prn: 625], i.e., two proplets connected by symbolic addresses.]
4.4.2 Coreferential coordination in a Word Bank
token line of Julia:  ... [noun: Julia | fnc: sleep | prn: 675] ... [noun: (Julia 675) | fnc: wake | prn: 702] ...   [core: Julia]
token line of sleep:  ... [verb: sleep | arg: Julia | prn: 675] ...   [core: sleep]
token line of wake:   ... [verb: wake | arg: (Julia 675) | prn: 702] ...   [core: wake]
4.4.3 Coreferential navigation
[verb: sleep | arg: Julia | prn: 675]  ↔1  [noun: Julia | fnc: sleep | prn: 675]  ↔2  [noun: (Julia 675) | fnc: wake | prn: 702]  ↔3  [verb: wake | arg: (Julia 675) | prn: 702]
4.5 Component Structure and Functional Flow
4.5.1 Pattern matching based on the type-token relation
a. Recognition: matching between concept types and raw input
b. Action: matching between concept tokens and concept types
c. Reference: matching between language and context proplets
4.5.2 Pattern matching based on restricted variables
a. Natural Language Interpretation: matching between LA-hear rules and language proplets (3.4.1)
b. Navigation: matching between LA-think rules and content proplets (3.4.2)
c. Production from Stored Content: matching between LA-speak rules and content proplets (3.4.3)
d. Querying: matching between query patterns and content proplets (4.2.2)
e. Inferencing: matching between inference rules and content proplets (5.2.3 and 5.3.4)
4.5.3 Refined component structure of a cognitive agent
[Diagram: a cognitive agent with peripheral cognition (the I/O component) and central cognition (the rule component and the Word Bank), connected by numbered interfaces:
1 = external recognition, 2 = external action, 3 = internal recognition, 4 = internal action, 5 = input to rule component, 6 = output of Word Bank, 7 = rule-Word Bank interaction, 8 = Word Bank-rule interaction.]
4.5.4 Integrating diagram 4.3.2 into diagram 4.5.3
[Diagram: the component structure of 4.5.3 refined by the language and context components of 4.3.2. Inside central cognition, the Word Bank is divided into a language component and a context component, related vertically by reference; the rule component interacts with them via the interfaces 7i/8i and 7ii/8ii.]
4.5.5 The Third Mechanism of Communication (MoC-3)
The operations of cognition in general and of natural language communication in particular require a memory with a storage and retrieval mechanism supporting (i) extensive data coverage, (ii) functional completeness, and (iii) an efficiency which enables real-time performance.
4.6 Embedding the Cycle of Communication into the Agent
4.6.1 Mapping incoming surfaces into content (hear mode)
[Diagram: the surfaces Julia, knows, John enter via interface 1; lexical lookup provides the lexical proplets [noun: Julia | fnc: | prn: ], [verb: know | arg: | prn: ], [noun: John | fnc: | prn: ]; at the rule level (interfaces 5, 7), the pattern proplets [noun: α | fnc: | prn: ], [verb: β | arg: | prn: ], [noun: γ | fnc: | prn: ] are matched and bound, deriving the connected content at the content level (interface 7).]
4.6.2 Mapping stored content into surfaces (speak mode)
[Diagram: the stored content proplets [noun: Julia | fnc: know | prn: 625], [verb: know | arg: Julia John | prn: 625], [noun: John | fnc: know | prn: 625] are matched and bound (interface 6) by the pattern proplets [noun: α | fnc: β | prn: K], [verb: β | arg: α γ | prn: K], [noun: γ | fnc: β | prn: K] at the rule level (interfaces 7, 8); synthesis realizes the surfaces Julia knows John.]
4.6.3 Inference producing outgoing surfaces
[Diagram: a trigger situation (input) at the content level, [noun: moi | fnc: hungry | prn: 211]  [verb: hungry | arg: moi | prn: 211], is matched (interfaces 3, 5, 7) by the antecedent of an inference at the rule level: [noun: β | fnc: hungry | prn: K]  [verb: hungry | arg: β | prn: K]  cm  [noun: (β K) | fnc: eat | prn: K+M]  [verb: eat | arg: (β K) food | prn: K+M]  [noun: food | fnc: eat | prn: K+M]. The consequent derives the new content (output) [noun: moi | fnc: eat | prn: 220]  [verb: eat | arg: moi food | prn: 220]  [noun: food | fnc: eat | prn: 220] (interfaces 7, 6), which synthesis realizes as the outgoing surfaces I would like to eat some food (interface 8).]
5. Mystery Number Four: Autonomous Control
5.1 Pinball Machine Model of Cognition
5.1.1 Definition of meaning by Grice
Definiendum: U meant something by uttering x.
Definiens: For some audience A, U intends his utterance of x to produce in A some effect (response) E, by means of A's recognition of the intention.
5.1.2 The Fourth Mechanism of Communication (MoC-4)
The language as well as the nonlanguage behavior of a cognitive agent is driven by the goal of autonomous control to maintain a continuous state of balance vis-à-vis constantly changing external and internal environments. The success of autonomous control, short-, mid-, and long-term, is defined in terms of survival in the agent's ecological niche.
5.2 DBS Inferences for Maintaining Balance
5.2.1 Chaining R, D, and E inferences
1. R: β be hungry K  cm  β eat food K+1
2. D: β eat food K+1  pre  β get food K+2
3. D: β get food K+2  down  β get α K+3, where α ∈ {apple, pear, salad, steak}
4. E: β get α K+3  exec  β locate α at γ K+4
5. E: β locate α at γ K+4  exec  β take α K+5
6. E: β take α K+5  exec  β eat α K+6
7. D: β eat α K+6  up  β eat food K+7

5.2.2 One-step chain based on an R/E inference

R/E: α feel full K  cm/exec  α stop eating K+1
5.2.3 Formal definition and application of a DBS inference
rule level (antecedent cm consequent):
[noun: β | fnc: hungry | prn: K]  [verb: hungry | arg: β | prn: K]   cm   [noun: (β K) | fnc: eat | prn: K+1]  [verb: eat | arg: (β K) food | prn: K+1]  [noun: food | fnc: eat | prn: K+1]

matching and binding ⇑ ⇓

Word Bank level:
input:   [noun: moi | fnc: hungry | prn: 211]  [verb: hungry | arg: moi | prn: 211]
output:  [noun: moi | fnc: eat | prn: 211+1]  [verb: eat | arg: moi food | prn: 211+1]  [noun: food | fnc: eat | prn: 211+1]
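A minimal sketch of applying such an inference, reusing the matcher of the 3.2.3 sketch (assumed names; the coreferential address (β K) of the consequent is simplified to the bare core value):

```python
def apply_inference(antecedent, consequent, content, new_prn):
    """Derive new content if the antecedent matches the stored content."""
    bindings = match(antecedent, content)        # matcher from the 3.2.3 sketch
    if bindings is None:
        return None
    bindings = {**bindings, "?K1": new_prn}      # prn value K+1 of the consequent
    return [instantiate(p, bindings) for p in consequent]

def instantiate(pattern, bindings):
    """Replace the variables of a pattern proplet by their bound values."""
    out = {}
    for attr, val in pattern.items():
        if isinstance(val, tuple):
            out[attr] = tuple(bindings.get(v, v) for v in val)
        else:
            out[attr] = bindings.get(val, val)
    return out

antecedent = [{"noun": "?b", "fnc": "hungry", "prn": "?K"},
              {"verb": "hungry", "arg": ("?b",), "prn": "?K"}]
consequent = [{"noun": "?b", "fnc": "eat", "prn": "?K1"},
              {"verb": "eat", "arg": ("?b", "food"), "prn": "?K1"},
              {"noun": "food", "fnc": "eat", "prn": "?K1"}]
content = [{"noun": "moi", "fnc": "hungry", "prn": 211},
           {"verb": "hungry", "arg": ("moi",), "prn": 211}]
print(apply_inference(antecedent, consequent, content, 212))
```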
5.2.4 Sequential Inferencing Principle (SIP)
Any two inferences x and y may be applied in sequence if, and only if, the consequent of x equals the antecedent of y.
5.2.5 New content derived by the inference chain 5.2.1
rule level:  β be hungry K  cm  β eat food K+1  pre
Word Bank:   moi be hungry 211    moi eat food 212

rule level:  β get (food K+1) K+2  down  β get α K+3  exec
Word Bank:   moi get (food 212) 213    moi get apple 214

rule level:  β locate α at γ K+4  exec  β take α K+5  exec
Word Bank:   moi locate (apple 214) at cupboard 215    moi take (apple 214) 216

rule level:  β (eat K+1) α K+6  up  β (eat K+1) (food K+1) K+7
Word Bank:   moi (eat 212) (apple 214) 217    moi (eat 212) (food 212) 218

The four double lines should be read as one, i.e., as

rule level:  p1 cm p2 pre p3 down p4 exec p5 exec p6 exec p7 up p8
Word Bank:   q1    q2    q3     q4     q5     q6     q7    q8
5.3 DBS Inferences for Meaning and Event Relations
5.3.1 Inference rule implementing a synonymy
[noun: abstract | fnc: α | prn: K]   impl   [noun: summary | fnc: α | prn: K+M],   where α ∈ {write, read, discuss, ...}
5.3.2 Inference rule implementing an antonymy
[adj: good | mdd: α | prn: K]   impl   [adj: not bad | mdd: α | prn: K+M]
5.3.3 Inference rule implementing a cause and effect relation
If a car has no fuel, then it does not start:

[noun: car | fnc: have | prn: K]  [verb: have | arg: car no fuel | prn: K]  [noun: no fuel | fnc: have | prn: K]   impl   [noun: (car K) | fnc: no start | prn: K+M]  [verb: no start | arg: (car K) | prn: K+M]
5.3.4 Relating the summary car accident to a text

The heavy old car hit a beautiful tree. The car had been speeding. A farmer gave the driver a lift.

token line of accident:  ... [noun: accident | mdr: (car 1) | prn: 67] ...   [core: accident]
token line of car:       ... [noun: car | fnc: hit | prn: 1]  [noun: (car 1) | fnc: speed | prn: 2] ... [noun: (car 1) | mdd: accident | prn: 67] ...   [core: car]
token line of hit:       ... [verb: hit | arg: car tree | prn: 1] ...   [core: hit]
token line of speed:     ... [verb: speed | arg: (car 1) | prn: 2] ...   [core: speed]
5.3.5 Summary-creating D inference
rule level (antecedent sum consequent):
[noun: α | fnc: hit | prn: K]  [verb: hit | arg: α β | prn: K]  [noun: β | fnc: hit | prn: K]   sum   [noun: (α K) | mdd: accident | prn: K+M]  [noun: accident | mdr: (α K) | prn: K+M]
where α ∈ {car, truck, boat, ship, plane, ...} and β ∈ {tree, rock, wall, mountain, ...} ∪ α

matching and binding ⇑ ⇓

Word Bank level:
input:   [noun: car | fnc: hit | prn: 1]  [verb: hit | arg: car tree | prn: 1]  [noun: tree | fnc: hit | prn: 1]
output:  [noun: (car 1) | mdd: accident | prn: 67]  [noun: accident | mdr: (car 1) | prn: 67]
5.4 Subactivation and Intersection
5.4.1 Trigger concept subactivating corresponding token line
member proplets:  [adj: hot | mdd: potato | prn: 20]  [adj: hot | mdd: water | prn: 32]  [adj: hot | mdd: potato | prn: 55]  [adj: hot | mdd: day | prn: 79] ...   owner proplet: [core: hot]   ⇐ trigger concept: hot
5.4.2 Intersecting the token lines for hot and potato

token line of hot:     ... [adj: hot | mdd: potato | prn: 20]  [adj: hot | mdd: water | prn: 32]  [adj: hot | mdd: potato | prn: 55]  [adj: hot | mdd: day | prn: 79]   [core: hot]
token line of potato:  ... [noun: potato | fnc: look for | mdr: hot | prn: 20]  [noun: potato | fnc: cook | mdr: big | prn: 35]  [noun: potato | fnc: find | mdr: hot | prn: 55]  [noun: potato | fnc: eat | mdr: small | prn: 88]   [core: potato]

The intersection consists of the proplet pairs sharing the prn values 20 and 55, i.e., the contents in which a potato is hot.
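Intersection can be sketched as an operation on the prn values of two token lines (assumed names, building on the WordBank sketch of 4.1.1):

```python
def intersect(word_bank, core_a, core_b):
    """Pair member proplets of two token lines that share a prn value."""
    by_prn = {p["prn"]: p for p in word_bank.token_lines[core_b]}
    return [(p, by_prn[p["prn"]])
            for p in word_bank.token_lines[core_a]
            if p["prn"] in by_prn]

# With the token lines of 5.4.2 stored in a WordBank wb,
# intersect(wb, "hot", "potato") returns the proplet pairs with prn 20 and 55.
```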
5.4.3 Completing an intersection by spreading activation
[noun: John | fnc: look for | prn: 20]  [verb: look for | arg: John, potato | pc: cook 19 | nc: eat 21 | prn: 20]  [noun: potato | fnc: look for | mdr: hot | prn: 20]  [adj: hot | mdd: potato | prn: 20]
5.5 Analogical Models for Problem Solving
5.5.1 Two Mary eat intersections

[noun: (Mary 25) | fnc: eat | prn: 48]  [verb: eat | arg: (Mary 25) (apple 46) | pc: take 47 | prn: 48]
[noun: (Mary 25) | fnc: eat | prn: 82]  [verb: eat | arg: (Mary 25) (müsli 80) | pc: prepare 81 | prn: 82]

5.5.2 Subactivation spreading from Mary eat to Mary eat apple

[noun: (Mary 25) | fnc: eat | prn: 48]  [verb: eat | arg: (Mary 25) (apple 46) | pc: take 47 | prn: 48]  [noun: (apple 46) | fnc: eat | eval: attract | prn: 48]
5.5.3 Stored content matching consequent in inference chain
rule level:  β be hungry K  cm  β eat food K+1  pre
Word Bank:   #                  #

rule level:  β get food K+2  down  β get α K+3  exec
Word Bank:   #                     #

rule level:  β locate α at γ K+4  exec  β take α K+5  exec
Word Bank:   #                          #

rule level:  β eat α K+6  up  β eat food K+7
Word Bank:   (Mary 25) eat (apple 46) 48    #
5.5.4 Extending matching content by secondary subactivation
rule level:  β be hungry K  cm  β eat food K+1  pre
Word Bank:   #                  #

rule level:  β get food K+2  down  β get α K+3  exec
Word Bank:   #                     #

rule level:  β locate α at γ K+4  exec
Word Bank:   (Mary 25) locate apple at cupboard 46

rule level:  β take α K+5  exec
Word Bank:   (Mary 25) take (apple 46) 47

rule level:  β eat α K+6  up  β eat food K+7
Word Bank:   (Mary 25) eat (apple 46) 48    #
5.5.5 Transfer and completion
rule level:  β be hungry K  cm  β eat food K+1  pre
Word Bank:   #                  #

rule level:  β get food K+2  down  β get α K+3  exec
Word Bank:   #                     #

rule level:  β locate α at γ K+4  exec
Word Bank:   moi locate apple at cupboard 91

rule level:  β take α K+5  exec
Word Bank:   moi take (apple 91) 92

rule level:  β eat α K+6  up  β eat food K+7
Word Bank:   moi eat (apple 91) 93    moi eat food 94
5.6 Subjunctive Transfer and Managing the Data Stream
5.6.1 Inference changing subjunctive to imperative content
rule level:
[verb: α | sem: X sbjv Y | prn: K]   mc   [verb: α | sem: X impv Y | prn: K+M]

content level:
input:   [noun: moi | fnc: take | prn: 92]  [verb: take | arg: (apple 91) | sem: sbjv | prn: 92]  [noun: (apple 91) | fnc: take | prn: 92]
output:  [noun: moi | fnc: (take 92) | prn: 95]  [verb: (take 92) | arg: (apple 91) | sem: impv | prn: 95]  [noun: (apple 91) | fnc: (take 92) | prn: 95]
6. Mystery Number Five: Learning
6.1 Fixed Behavior Agents
6.1.1 Motion patterns of a fixed behavior agent
[Diagram: three motion patterns of a fixed behavior agent, one each triggered by the colors red, green, and blue.]
6.1.2 Coding the motion triggered by red as a set of proplets

[rec: red | prev: | next: strght | prn: x1]  [act: strght | prev: red | next: left | prn: x2]  [act: left | prev: strght | next: right | prn: x3]  [act: right | prev: left | next: strght | prn: x4]  [act: strght | prev: right | next: right | prn: x5]  [act: right | prev: strght | next: left | prn: x6]  [act: left | prev: right | next: | prn: x7]

6.1.3 Coding the motion triggered by green as a set of proplets

[rec: green | prev: | next: strght | prn: y1]  [act: strght | prev: green | next: left | prn: y2]  [act: left | prev: strght | next: strght | prn: y3]  [act: strght | prev: left | next: left | prn: y4]  [act: left | prev: strght | next: strght | prn: y5]  [act: strght | prev: left | next: left | prn: y6]  [act: left | prev: strght | next: | prn: y7]

6.1.4 Coding the motion triggered by blue as a set of proplets

[rec: blue | prev: | next: strght | prn: z1]  [act: strght | prev: blue | next: right | prn: z2]  [act: right | prev: strght | next: strght | prn: z3]  [act: strght | prev: right | next: right | prn: z4]  [act: right | prev: strght | next: strght | prn: z5]  [act: strght | prev: right | next: strght | prn: z6]  [act: right | prev: strght | next: | prn: z7]
6.1.5 Variable definition of the LA-act1 grammar
Tn ∈ {red, green, blue} and n ∈ {1, 2, 3, ...}
M1 ∈ {strght, left, right}
M2 ∈ {strght, left, right}
K ∈ {xi, yi, zi, ...} and i ∈ {1, 2, 3, ...}
6.1.6 Rule system of the LA-act1 grammar
STS =def { ( [rec: Tn]  {Rule 0, Rule 1} ) }

Rule 0 {Rule 0, Rule 1}
[rec: Tn | next: Tn+1 | prn: Ki]   [rec: Tn+1 | prev: Tn | prn: Ki+1]   output position nw

Rule 1 {Rule 2}
[rec: Tn | next: M1 | prn: Ki]   [act: M1 | prev: Tn | prn: Ki+1]   output position nw

Rule 2 {Rule 2}
[act: M1 | next: M2 | prn: Ki]   [act: M2 | prev: M1 | prn: Ki+1]   output position nw

STF =def { ( [next: ]  rpRule 2 ) }
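A minimal sketch of LA-act1 as a traversal in Python (assumed names; the rule packages are implicit in the control loop, which follows each proplet's next value until the final state STF, an empty next slot, is reached):

```python
RED_PATTERN = [
    {"rec": "red",    "prev": "",       "next": "strght", "prn": "x1"},
    {"act": "strght", "prev": "red",    "next": "left",   "prn": "x2"},
    {"act": "left",   "prev": "strght", "next": "right",  "prn": "x3"},
    {"act": "right",  "prev": "left",   "next": "strght", "prn": "x4"},
    {"act": "strght", "prev": "right",  "next": "right",  "prn": "x5"},
    {"act": "right",  "prev": "strght", "next": "left",   "prn": "x6"},
    {"act": "left",   "prev": "right",  "next": "",       "prn": "x7"},
]

def core(p):
    return p.get("act", p.get("rec"))

def run_fixed_behavior(proplets, trigger):
    """Follow the next values from the trigger proplet to the final state."""
    current = next(p for p in proplets if p.get("rec") == trigger)  # start state
    actions = []
    while current["next"]:                       # final state STF: empty next slot
        current = next(p for p in proplets
                       if core(p) == current["next"] and p["prev"] == core(current))
        actions.append(current["act"])
    return actions

print(run_fixed_behavior(RED_PATTERN, "red"))
# ['strght', 'left', 'right', 'strght', 'right', 'left']
```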
6.1.7 Applying Rule 1 of LA-act1 to a red trigger
Rule 1 {Rule 2}

rule level:  [rec: Tn | next: M1 | prn: Ki]   [act: M1 | prev: Tn | prn: Ki+1]   output position nw

matching and binding ⇑ ⇓

Word Bank level:  [rec: red | prev: | next: strght | prn: x1]   [act: strght | prev: red | next: left | prn: x2]
6.1.8 Applying Rule 2 of LA-act1 to a strght motion
Rule 2 {Rule 2}

rule level:  [act: M1 | next: M2 | prn: Ki]   [act: M2 | prev: M1 | prn: Ki+1]   output position nw

matching and binding ⇑ ⇓

Word Bank level:  [act: strght | prev: red | next: left | prn: x2]   [act: left | prev: strght | next: right | prn: x3]
6.2 Guided Patterns to Expand a Fixed Behavior Repertoire
6.2.1 New pattern for a fixed behavior agent
[Diagram: a new motion pattern for the fixed behavior agent, triggered by the color sequence red green.]

6.2.2 Coding the motion triggered by red green as a set of proplets

[rec: red | prev: | next: green | prn: q1]  [rec: green | prev: red | next: strght | prn: q2]  [act: strght | prev: green | next: left | prn: q3]  [act: left | prev: strght | next: left | prn: q4]  [act: left | prev: left | next: left | prn: q5]  [act: left | prev: left | next: left | prn: q6]  [act: left | prev: left | next: | prn: q7]
6.2.3 Lexical proplets of an extended fixed behavior agent
[rec: red | prev: | next: | prn: ]  [rec: green | prev: | next: | prn: ]  [rec: blue | prev: | next: | prn: ]  [act: strght | prev: | next: | prn: ]  [act: left | prev: | next: | prn: ]  [act: right | prev: | next: | prn: ]

6.2.4 Recognition and lexical lookup of the motion pattern 6.2.1

[rec: red | prev: | next: | prn: q1]  [rec: green | prev: | next: | prn: q2]  [act: strght | prev: | next: | prn: q3]  [act: left | prev: | next: | prn: q4]  [act: left | prev: | next: | prn: q5]  [act: left | prev: | next: | prn: q6]  [act: left | prev: | next: | prn: q7]
6.2.5 Rule system of the LA-rec1 grammar
STS =def { ( [rec: Tn]  {Rule 0, Rule 1} ) }

Rule 0 {Rule 0, Rule 1}
[rec: Tn | next: | prn: Ki]  [rec: Tn+1 | prev: | prn: Ki+1]   ⇒   [rec: Tn | next: Tn+1 | prn: Ki]  [rec: Tn+1 | prev: Tn | prn: Ki+1]

Rule 1 {Rule 2}
[rec: Tn | next: | prn: Ki]  [act: M1 | prev: | prn: Ki+1]   ⇒   [rec: Tn | next: M1 | prn: Ki]  [act: M1 | prev: Tn | prn: Ki+1]

Rule 2 {Rule 2}
[act: M1 | next: | prn: Ki]  [act: M2 | prev: | prn: Ki+1]   ⇒   [act: M1 | next: M2 | prn: Ki]  [act: M2 | prev: M1 | prn: Ki+1]

STF =def { ( [next: ]  rpRule 2 ) }
6.3 Transition from Fixed to Adaptive Behavior
6.3.1 Extensions required by an adaptive behavior agent
1. Writable memory: In order to record individual recognition-action episodes, the agent's non-writable memory must be complemented with a writable memory.
2. Decoupling of recognition and action: The agent must be capable of recognition without having to perform the associated fixed behavior action (recognition per se), just as there must be action triggered by reasoning rather than by a fixed behavior stimulus.
3. Unknowns: The agent must be able to recognize and store unknowns consisting of previously unencountered constellations of available recognition elements.
4. Appraisal: In order to learn from past experiences, the agent must be able to evaluate the implication of recognitions and the outcome of actions.
5. Automatic schema derivation: In order to generalize over similar constellations, the agent must be capable of automatic schema derivation.
6.3.2 Arrangement of writable and non-writable memory
writable memory (member proplets, now front) / non-writable memory (owner proplets):

token line of blue:   [rec: blue | prev: | next: strght | prn: 21]  [rec: blue | prev: | next: strght | prn: 37]  [rec: blue | prev: | next: strght | prn: 55] ...   [blue]  [rec: blue | prev: | next: strght | prn: z1]
token line of green:  [rec: green | prev: | next: strght | prn: 14]  [rec: green | prev: | next: strght | prn: 38]  [rec: green | prev: | next: strght | prn: 42] ...   [green]  [rec: green | prev: | next: strght | prn: y1]
6.3.3 Two simple R/E one-step inference chains

R/E: [rec: red square] K  cm/exec  [act: feed] K+1
R/E: [rec: green circle] K  cm/exec  [act: hide] K+1

6.3.4 Decoupled recognitions and actions

rec: red, rec: green, rec: square, rec: circle;  act: hide, act: feed

6.3.5 Possible constellations when faced with an unknown

1. rec: red circle   act: hide   rec: good
2. rec: red circle   act: hide   rec: bad
3. rec: red circle   act: feed   rec: good
4. rec: red circle   act: feed   rec: bad

6.3.6 Consequence inference for negative experience (CIN)

rec: α  act: β  rec: bad   csq   rec: α  act: no β

6.3.7 Consequence inference for positive experience (CIP)

rec: α  act: β  rec: good   csq   rec: α  act: β
6.4 Upscaling from Coordination to Functor-Argument
6.4.1 Use of propositional calculus in predicate calculus
[p ∧ q]   =⇒   ∃x [red(x) ∧ circle(x)]
6.4.2 Integrating functor-argument in DBS
[rec: red | next: circle | prev: | prn: 62]  [rec: circle | next: | prev: red | prn: 62]   =⇒   [adj: red | cat: adn | mdd: circle | nc: | pc: | prn: 62]  [noun: circle | cat: sn | mdr: red | nc: | pc: | prn: 62]
6.4.3 Coding nonlanguage and language content alike
token line of circle:  ... [sur: | noun: circle | cat: pnp | sem: pl sel | fnc: | mdr: red | nc: | pc: | prn: 37] ... [sur: circles | noun: (circle 37) | cat: pnp | sem: pl sel | fnc: | mdr: (red 37) | nc: | pc: | prn: 62]   [core: circle]
token line of red:     ... [sur: | adj: red | cat: adn | sem: psv | mdd: circle | nc: | pc: | prn: 37] ... [sur: red | adj: (red 37) | cat: adn | sem: psv | mdd: (circle 37) | nc: | pc: | prn: 62]   [core: red]
6.4.4 The Fifth Mechanism of Communication (MoC-5)
Exporting coordination and functor-argument from language content to nonlanguage (context) content, coded uniformly as sets of proplets, allows us to model reference as a pattern matching between language content and context content in the hear mode, and between context content and language content in the speak mode.
6.5 Schema Derivation and Hierarchy Inferencing
6.5.1 Converting a content into an equivalent schema
content:
[noun: child | cat: snp | sem: pl exh | fnc: sleep | mdr: | nc: | pc: | prn: 26]  [verb: sleep | cat: decl | sem: past | arg: child | mdr: | nc: (snore 27) | pc: | prn: 26]  [noun: Fido | cat: nm | sem: animal | fnc: snore | mdr: | nc: | pc: | prn: 27]  [verb: snore | cat: decl | sem: past | arg: Fido | mdr: | nc: | pc: (sleep 26) | prn: 27]

⇐⇒ schema:
[noun: α | cat: snp | sem: pl exh | fnc: β | mdr: | nc: | pc: | prn: K]  [verb: β | cat: decl | sem: past | arg: α | mdr: | nc: (δ K+1) | pc: | prn: K]  [noun: γ | cat: nm | sem: animal | fnc: δ | mdr: | nc: | pc: | prn: K+1]  [verb: δ | cat: decl | sem: past | arg: γ | mdr: | nc: | pc: (β K) | prn: K+1]
where α ∈ {child}, β ∈ {sleep}, γ ∈ {Fido}, δ ∈ {snore}, and K ∈ {26}
6.5.2 Converting a schema into equivalent contents
schema:
[noun: α | cat: snp | sem: pl exh | fnc: β | mdr: | nc: | pc: | prn: K]  [verb: β | cat: decl | sem: past | arg: α | mdr: | nc: | pc: | prn: K]
where α ∈ {man, woman, child}, β ∈ {sleep, sing, dream}, and K ∈ N

⇐⇒ contents:
Every man slept. Every woman slept. Every child slept.
Every man sang. Every woman sang. Every child sang.
Every man dreamed. Every woman dreamed. Every child dreamed.
6.5.3 Set of contents with partial overlap
Julia eats an apple John eats an apple Suzy eats an apple Bill eats an appleJulia eats a pear John eats a pear Suzy eats a pear Bill eats a pearJulia eats a salad John eats a salad Suzy eats a salad Bill eatsa saladJulia eats a steak John eats a steak Suzy eats a steak Bill eatsa steak
6.5.4 Summarizing the set 6.5.3 as a schema
[noun: α, fnc: eat, prn: K]  [verb: eat, arg: α β, prn: K]  [noun: β, fnc: eat, prn: K]
where α ε {Julia, John, Suzy, Bill} and β ε {apple, pear, salad, steak}
6.5.5 Coding the subclass relation for food
[noun: food, fnc: β, prn: K]
where food ε {apple, pear, salad, steak}
6.5.6 Representing the semantic hierarchy 6.5.5 as a tree
          food
   /     |      |     \
apple   pear  salad  steak
6.5.7 Meta-inference deriving down and up inferences
antecedent: [noun: HT, fnc: β, prn: K], where HT ε {A, B, C, D, ...}
⇒
consequent 1: [noun: HT, fnc: β, prn: K]  down  [noun: α, fnc: (β K), prn: K+M], where α ε {A, B, C, D, ...}
consequent 2: [noun: α, fnc: β, prn: K]  up  [noun: HT, fnc: (β K), prn: K+M], where α ε {A, B, C, D, ...}
6.5.8 Applying meta-inference 6.5.7 to derive down inference
antecedent: [noun: HT, fnc: β, prn: K], where HT ε {A, B, C, D, ...}
⇒
consequent: [noun: HT, fnc: β, prn: K]  down  [noun: α, fnc: (β K), prn: K+M], where α ε {A, B, C, D, ...}
⇑ matching and binding ⇓
[noun: food, fnc: eat, prn: K]   [noun: food, fnc: β, prn: K]  down  [noun: α, fnc: (β K), prn: K+M]
where food ε {apple, pear, salad, steak} and α ε {apple, pear, salad, steak}
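The derived down inference can be sketched procedurally: a hypernym proplet is expanded into a disjunction of hyponym proplets connected by nc/pc, as in 6.5.10. A minimal sketch, assuming the hierarchy is stored as a dict and proplets as dicts:

# Minimal sketch of the downward traversal (6.5.9-6.5.10).

HYPONYMS = {"food": ["apple", "pear", "salad", "steak"]}

def down(content_proplet, new_prn):
    noun, fnc, prn = (content_proplet["noun"], content_proplet["fnc"],
                      content_proplet["prn"])
    hyponyms = HYPONYMS[noun]
    out = []
    for i, h in enumerate(hyponyms):
        p = {"noun": h, "prn": new_prn}
        if i == 0:
            p["fnc"] = (fnc, prn)        # address of the original functor
        if i > 0:
            p["pc"] = hyponyms[i - 1]    # previous conjunct
        if i < len(hyponyms) - 1:
            p["nc"] = hyponyms[i + 1]    # next conjunct
        out.append(p)
    return out

for p in down({"noun": "food", "fnc": "look for", "prn": 18}, new_prn=25):
    print(p)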
6.5.9 Applying inference for downward traversal
antecedent                              consequent
rule level:
[noun: food, fnc: β, prn: K]  down  [noun: α, fnc: (β K), prn: K+M]
where α ε {apple, pear, salad, steak}
⇑ matching and binding ⇓
content level:
[noun: Julia, fnc: look for, prn: 18] [verb: look for, arg: Julia food, prn: 18] [noun: food, fnc: look for, prn: 18]   [noun: α, fnc: (look for 18), prn: 25]
6.5.10 Output disjunction of the downward inference 6.5.8
[noun: appleor, fnc: (look for 18), nc: pear, prn: 25] [noun: pear, pc: apple, nc: salad, prn: 25] [noun: salad, pc: pear, nc: steak, prn: 25] [noun: steak, pc: salad, nc: , prn: 25]
6.5.11 Proposition resulting from downward inference 6.5.9
[noun: (Julia 18), fnc: (look for 18), prn: 25] [verb: (look for 18), arg: (Julia 18) appleor, prn: 25] [noun: appleor, fnc: (look for 18), nc: pear, prn: 25] [noun: pear, pc: apple, nc: salad, prn: 25] [noun: salad, pc: pear, nc: steak, prn: 25] [noun: steak, pc: salad, nc: , prn: 25]
6.5.12 Hierarchy-inference for upward traversal
antecedent                              consequent
rule level: α ε {apple, pear, salad, steak} &
[noun: α, fnc: β, prn: K]  up  [noun: food, fnc: (β K), prn: K+M]
matching and binding
content level:
[noun: Julia, fnc: prepare, prn: 23] [verb: prepare, arg: Julia salad, prn: 23] [noun: salad, fnc: prepare, prn: 23]   [noun: food, fnc: (prepare 23), prn: 29]
6.6 Natural vs. Artificial Language Learning
6.6.1 One proplet shell taking different core values
proplet shell                                                      context proplets
[sur: , noun: α, cat: pn, sem: count pl, fnc: , mdr: , prn: ]  ⇒
[sur: , noun: dog, cat: pn, sem: count pl, fnc: , mdr: , prn: ]
[sur: , noun: book, cat: pn, sem: count pl, fnc: , mdr: , prn: ]
[sur: , noun: child, cat: pn, sem: count pl, fnc: , mdr: , prn: ]
[sur: , noun: apple, cat: pn, sem: count pl, fnc: , mdr: , prn: ]
6.6.2 Turning context proplets into language proplets
proplet shell                                                      language proplets
[sur: α′+x, noun: α, cat: pn, sem: count pl, fnc: , mdr: , prn: ]  ⇒
[sur: dog+s, noun: dog, cat: pn, sem: count pl, fnc: , mdr: , prn: ]
[sur: book+s, noun: book, cat: pn, sem: count pl, fnc: , mdr: , prn: ]
[sur: child+ren, noun: child, cat: pn, sem: count pl, fnc: , mdr: , prn: ]
[sur: apple+s, noun: apple, cat: pn, sem: count pl, fnc: , mdr: , prn: ]
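A minimal sketch of 6.6.2, assuming the plural allomorphy is handled by a simple lookup table (a real system would use automatic word form recognition/production); proplets are modeled as dicts:

# Inserting a core value into a proplet shell and deriving the sur value.

PLURAL = {"dog": "dog+s", "book": "book+s",
          "child": "child+ren", "apple": "apple+s"}

def plural_shell(core_value):
    return {"sur": PLURAL[core_value], "noun": core_value,
            "cat": "pn", "sem": "count pl",
            "fnc": None, "mdr": None, "prn": None}

print(plural_shell("child"))   # sur: child+ren, noun: child, ...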
6.6.3 Taking sur values from different languages
proplet shell                                                      language proplets
[sur: α′, noun: α, cat: sn, sem: count sg, fnc: , mdr: , prn: ]  ⇒
[sur: dog, noun: dog, cat: sn, sem: count sg, fnc: , mdr: , prn: ]
[sur: chien, noun: dog, cat: sn, sem: count sg, fnc: , mdr: , prn: ]
[sur: Hund, noun: dog, cat: sn, sem: count sg, fnc: , mdr: , prn: ]
[sur: cane, noun: dog, cat: sn, sem: count sg, fnc: , mdr: , prn: ]
6.6.4 Examples using book in different parts of speech
Mary loves a good book (noun).
Mary booked (verb) a flight to Paris.
Mary is a rather bookish (adj) girl.
6.6.5 Core value book in noun, verb, and adj proplets
book ⇒
[sur: book, noun: book, cat: sn, sem: count sg, fnc: , mdr: , prn: ]
[sur: booked, verb: book, cat: n′ a′ v, sem: past, fnc: , mdr: , prn: ]
[sur: bookish, adj: book, cat: adn, sem: psv, mdd: , prn: ]
6.6.6 Examples using red and square in different parts of speech
Mary preferred the other red (noun).
The rising sun reddened (verb) the sky.
Mary drank red (adj) wine.
Mary's house faces a square (noun).
Mary squared (verb) her account.
Mary bought a square (adj) table.
6.6.7 Core values in syntactic-semantic composition
book_V the red_A square_N
book_V the square_N red_A
book_V the square_A red_N
square_V the red_A book_N
square_V the book_N red_A
square_V the book_A red_N
redden_V the square_A book_N
redden_V the book_N square_A
redden_V the book_N square_N
etc.
6.6.8 Cognitive procedures using placeholder core values
1. the time-linear syntactic-semantic interpretation in the hear mode,
2. the storage of content provided by recognition and inferencing in the Word Bank,
3. the navigation-based semantic-syntactic production in the speak mode,
4. the definition of such meaning relations as synonymy, antonymy, hypernymy, hyponymy, meronymy, and holonymy, as well as cause-effect,
5. the design and implementation of reactor, deductor, and effector inferences,
6. the design and implementation of language inferences for adjusting perspective, and
7. the interaction between the context and language levels
6.6.9 Learning a new word
[Figure: two-stage word learning. Stage 1: the unknown surface zèbre co-occurs with the context proplet [noun: zebra, cat: sg, fnc: , mdr: , prn: 465]. Stage 2: the surface is adopted as sur value, yielding the language proplet [sur: zèbre, noun: zebra, cat: sg, fnc: , mdr: , prn: ].]
Part II. The Coding of Content
7. Compositional Semantics
7.1 Forms of Graphical Representation
7.1.1 Comparing representations of the subject-verb relation
Julia slept
[Figure: the subject-verb relation in three notations — Phrase Structure Grammar (S over NP Julia and VP slept), Dependency Grammar (sleep governing Julia), and DBS (Julia/sleep).]
7.1.2 Comparing determiner-adjective-noun constructions
The little girl slept.
[Figure: the determiner-adjective-noun construction in three notations — Phrase Structure Grammar (S over NP (DET the, ADJ little, N girl) and VP slept), Dependency Grammar (sleep over girl, girl over the and little), and DBS (girl/sleep with little|girl).]
7.2 Absorption and Precipitation of Function Words
7.2.1 Function words with different lexical readings
1a. Mary has a house by the lake.
1b. The book was read by Mary.
2a. Mary moved to Paris.
2b. Mary tried to sleep.
7.2.2 Correlating elementary/phrasal surfaces and contents
elementary surface: Julia        phrasal surface: the girl        phrasal surface: the little girl
elementary content: [noun: Julia, cat: nm, sem: sg, fnc: sleep, mdr: , prn: 1]
elementary content: [noun: girl, cat: snp, sem: def sg, fnc: sleep, mdr: , prn: 2]
phrasal content: [noun: girl, cat: snp, sem: def sg, fnc: sleep, mdr: little, prn: 3] [adj: little, cat: adn, sem: psv, mdd: girl, nc: , prn: 3]
7.2.3 Hear mode derivation of The little girl ate an apple.
[Figure: time-linear hear mode derivation — lexical lookup assigns a proplet to each surface (the, little, girl, ate, an, apple, .); syntactic-semantic parsing steps 1-6 cross-copy values between the sentence start and the next word proplet, yielding the resulting content shown in 7.3.2.]
7.2.4 Comparing different function word absorptions
[Figure: two hear mode derivations compared. determiner-noun (the + garden): the determiner proplet [noun: n_1, cat: np, sem: def, fnc: ] absorbs the noun, result [noun: garden, cat: snp, sem: def sg, fnc: , prn: 4]. preposition-determiner-noun (in + the + garden): the same absorptions apply within a preposition proplet, result [adj: in_garden, cat: adnv snp, sem: def sg, mdd: , prn: 5].]
7.2.5 Prepositional phrase as elementary adjective
[Figure: the prepositional phrase in_garden treated as an elementary adjective (parallel to there in Julia slept there) — signature: V over N and A; SRG: sleep with subject Julia and modifier in_garden.]
7.3 Deriving DBS Graphs from Sets of Proplets
7.3.1 On relating proplet sets to DBS graphs
• What is the nature of the relation between the proplet representation of a content and the corresponding SRG or signature?
• Can an SRG or a signature be derived automatically from a set of proplets representing a certain content?
7.3.2 Content corresponding to The little girl ate an apple.
[noun: girl, sem: def sg, fnc: eat, mdr: little, prn: 7]
[adj: little, cat: adn, sem: psv, mdd: girl, prn: 7]
[verb: eat, cat: decl, sem: past, arg: girl apple, prn: 7]
[noun: apple, cat: snp, sem: indef sg, fnc: eat, prn: 7]
7.3.3 Schemata interpreting transparent intraprop. relations
(1) noun/verb: [noun: α, fnc: β] [verb: β, arg: α X]
(2) noun\verb: [noun: α, fnc: β] [verb: β, arg: γ X α]
(3) adjective|noun: [adj: α, mdd: β] [noun: β, mdr: X α Y]
(4) adjective|verb: [adj: α, mdd: β] [verb: β, mdr: X α Y]
(5) noun−noun: [noun: α, nc: β] [noun: β, pc: α]
(6) verb−verb: [verb: α, nc: β] [verb: β, pc: α]
(7) adjective−adjective: [adj: α, nc: β] [adj: β, pc: α]
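The second question of 7.3.1 — deriving an SRG automatically from a proplet set — can be sketched by instantiating schemata like those above. A minimal sketch, assuming proplets as Python dicts and handling only the three relations occurring in 7.3.2; this is illustrative, not the book's implementation.

# Reading labeled SRG edges off a set of proplets.

def srg_edges(proplets):
    """Return labeled edges like ('girl', '/', 'eat')."""
    edges = []
    for p in proplets:
        if "noun" in p and p.get("fnc"):
            v = next(q for q in proplets if q.get("verb") == p["fnc"])
            args = v["arg"].split()
            line = "/" if args and args[0] == p["noun"] else "\\"
            edges.append((p["noun"], line, v["verb"]))     # N/V or N\V
        if "adj" in p and p.get("mdd"):
            edges.append((p["adj"], "|", p["mdd"]))        # A|N or A|V
    return edges

content = [
    {"noun": "girl", "fnc": "eat", "mdr": "little", "prn": 7},
    {"adj": "little", "mdd": "girl", "prn": 7},
    {"verb": "eat", "arg": "girl apple", "prn": 7},
    {"noun": "apple", "fnc": "eat", "prn": 7},
]
print(srg_edges(content))
# [('girl', '/', 'eat'), ('little', '|', 'girl'), ('apple', '\\', 'eat')]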
7.3.4 DBS graph based on proplets (proplet graph)
[Figure: proplet graph — the proplets of 7.3.2 arranged as a DBS graph: [verb: eat, cat: decl, sem: past, arg: girl apple, prn: 7] at the top, connected to [noun: girl, sem: def sg, fnc: eat, mdr: little, prn: 7] and [noun: apple, cat: snp, sem: indef sg, fnc: eat, prn: 7], with [adj: little, cat: adn, sem: pos, mdd: girl, prn: 7] below girl.]
7.3.5 Resulting SRG and signature
[Figure: (i) semantic relations graph (SRG): eat over girl and apple, little below girl; (ii) signature: V over N and N, with A below the first N.]
7.3.6 The seven transparent semantic relations of structure
1. N/V (subject-verb)
2. N\V (object-verb)
3. A|N (adjective-noun)
4. A|V (adjective-verb)
5. N−N (conjunct-conjunct)
6. V−V
7. A−A
7.4 Producing Natural Language Surfaces from Content
7.4.1 The four DBS views on a content and its surface
[Figure: the four DBS views — (i) SRG: eat over girl and apple, little below girl; (ii) signature: V over N N, A below N; (iii) numbered arcs graph (NAG) with arcs 1-6; (iv) surface realization: The(1) little(2) girl(3) ate(4) an__apple(5) .(6).]
7.4.2 Numbered arcs graph based on proplets (proplet NAG)
[Figure: proplet NAG — the proplets of 7.3.4 connected by numbered arcs 1-6, with [verb: eat, cat: decl, sem: past, arg: girl apple, prn: 7] as the entry and exit node.]
7.4.3 LA-speak grammar for the proplet NAG 7.4.2
0. START {DET-ADN, DET-CN}
rule level: [verb: α, arg: X, prn: K]
content level: [verb: eat, cat: decl, sem: past, arg: girl apple, prn: 7]

1. DET-ADN {ADN}
rule level: [verb: α, arg: X β Y, prn: K] [noun: β, sem: γ, fnc: α, mdr: δ, prn: K]  ⇒  the ⇑lex(γ)
content level: [verb: eat, cat: decl, sem: past, arg: girl apple, prn: 7] [noun: girl, sem: def sg, fnc: eat, mdr: little, prn: 7]
2. ADN {CN}
rule level: [noun: β, mdr: X α Y, prn: K] [adj: α, cat: γ, mdd: β, prn: K]  ⇒  little ⇑lex(α γ)
content level: [noun: girl, sem: def sg, fnc: eat, mdr: little, prn: 7] [adj: little, cat: adn, sem: psv, mdd: girl, prn: 7]

3. CN {FV, PNC}
rule level: [adj: α, mdd: β, prn: K] [noun: β, cat: γ, mdr: α, prn: K]  ⇒  girl ⇑lex(β γ)
content level: [adj: little, cat: adn, sem: psv, mdd: girl, prn: 7] [noun: girl, cat: sn, fnc: eat, mdr: little, prn: 7]
4. FV {DET-ADJ, DET-CN}
rule level: [noun: β, fnc: α, prn: K] [verb: α, arg: β Y, sem: γ, prn: K]  ⇒  ate ⇑lex(α γ)
content level: [noun: girl, sem: def sg, fnc: eat, mdr: little, prn: 7] [verb: eat, cat: decl, sem: past, arg: girl apple, prn: 7]

5. DET-CN {FV, PNC}
rule level: [verb: α, arg: X β Y, prn: K] [noun: β, sem: γ, fnc: α, mdr: NIL, prn: K]  ⇒  an apple ⇑lex(γ β)
content level: [verb: eat, cat: decl, sem: past, arg: girl apple, prn: 7] [noun: apple, sem: indef sg, fnc: eat, mdr: , prn: 7]
6. PNC {PNC-START}
rule level: [noun: β, fnc: α, prn: K] [verb: α, cat: PNC, arg: X β, prn: K]  ⇒  . ⇑lex(PNC)
content level: [noun: apple, sem: indef sg, fnc: eat, mdr: , prn: 7] [verb: eat, cat: decl, sem: past, arg: girl apple, prn: 7]
7.4.4 Representing an extrapropositional coordination
[Figure: extrapropositional coordination of Julia slept. Susanne sang. John read. — (i) signature: three V nodes in a V−V chain, each over an N; (ii) SRG: sleep−sing−read over Julia, Susanne, John; (iii) NAG with arcs 0-8; (iv) surface realization: Julia(0-1) slept__.(2) Susanne(3-4) sang__.(5) John(6-7) read__.(8).]
7.4.5 Russian word order based on alternative traversals
[Figure: Russian free word order — one (i) SRG and (ii) signature (find over dog and bone), two (iii) NAGs (a, b), and six (iv) surface realizations (1)-(6), covering the word orders dog find bone, dog bone find, find dog bone, find bone dog, bone find dog, and bone dog find.]
7.5 Transparent vs. Opaque Functor-Argument
7.5.1 Extra- vs. intrapropositional FA structures
[Figure: extra- vs. intrapropositional functor-argument structures —
sentential subject: That Fido barked surprised Mary. (corresponding simple subject: The answer surprised Mary.)
sentential object: Mary heard that Fido barked. (corresponding simple object: Mary heard the answer.)
sentential adverbial: Mary smiled when Fido barked. (corresponding simple adverb: Mary smiled happily.)
each shown with its V/N (and A) signature, the subclause V attached to the matrix V.]
7.5.2 Proplet representation of That Fido barked surprised Mary.
[noun: Fido, cat: nm, sem: sg, fnc: bark, mdr: , prn: 27]
[verb: bark, cat: v, sem: past, arg: Fido, fnc: (surprise 28), prn: 27]
[verb: surprise, cat: decl, sem: past, arg: (bark 27) Mary, mdr: , prn: 28]
[noun: Mary, cat: nm, sem: sg, fnc: surprise, mdr: , prn: 28]
7.5.3 Three-place verb with prepositional phrase as object: Julia put the flowers in a vase.
[Figure: (i) SRG: put over Julia, flower, in_vase; (ii) signature: V over N N A; (iii) NAG with arcs 1-6; (iv) surface realization: Julia(1) put(2) the_flowers(3) in_a_vase(4-5) .(6).]
7.5.4 Opaque intrapropositional object, content of 7.5.3
[noun: Julia, cat: nm, sem: sg, fnc: put, mdr: , prn: 26]
[verb: put, cat: decl, sem: past, arg: Julia flower in_vase, mdr: , prn: 26]
[noun: flower, cat: sn, sem: def pl, fnc: put, mdr: , prn: 26]
[adj: in_vase, cat: adnv snp, sem: indef sg, fnc: put, mdr: , prn: 26]
7.5.5 Hear mode derivation of Julia is a doctor.
[Figure: time-linear hear mode derivation — lexical lookup for Julia, is, a, doctor, and the period; parsing steps 1-4 absorb the copula and the determiner, resulting in [noun: Julia, cat: nm, sem: sg, fnc: doctor, prn: 38] [verb: doctor, cat: snp decl, sem: be pres indef sg, arg: Julia, prn: 38].]
7.5.6 DBS graph analysis of Julia is a doctor.
[Figure: (i) SRG: doctor over Julia; (ii) signature: V over N; (iii) NAG with arcs 1-2; (iv) surface realization: Julia(1) is_a_doctor_.(2).]
7.6 Possible and Actual Semantic Relations of Structure
7.6.1 Transparent and opaque relations beginning with N
1 N/N ?
2 N/V subject/verb (transparent)
3 N/A ?
4 N\N ?
5 N\V object\verb (transparent)
6 N\A ?
7 N|N ?
8 N|V ?
9 N|A ?
10 N−N noun−noun conjunction (transparent)
11 N−V ?
12 N−A ?
7.6.2 Transparent and opaque relations beginning with V
1 V/N ?
2 V/V infinitive subject/verb (opaque)
3 V/A ?
4 V\N ?
5 V\V infinitive object\verb (opaque)
6 V\A ?
7 V|N progressive verb|noun, infinitive|noun (opaque)
8 V|V ?
9 V|A ?
10 V−N ?
11 V−V verb−verb (transparent)
12 V−A ?
7.6.3 Transparent and opaque relations beginning with A
1 A/N ?
2 A/V ?
3 A/A ?
4 A\N ?
5 A\V prepositional object\verb (opaque)
6 A\A ?
7 A|N adj|noun (transparent)
8 A|V adj|verb (transparent)
9 A|A ?
10 A−N ?
11 A−V ?
12 A−A adj−adj coordination (transparent)
7.6.4 Transparent vs. opaque intrapropositional relations
transparent intrapropositional relations
1. N/V (subject/verb)
2. N\V (object\verb)
3. A|N (adj|noun)
4. A|V (adj|verb)
5. N−N (noun−noun)
6. V−V (verb−verb)
7. A−A (adj−adj)
opaque intrapropositional relations
8. V/V (infinitive subject/verb)
9. V\V (infinitive object\verb)
10. V|N (progressive verb|noun, infinitive|noun)
11. A\V (prepositional object\verb)
7.6.5 Extrapropositional relations of English
12 V/xV sentential subject/x verb (opaque)
13 V\xV sentential object\x verb (opaque)
14 V|xN sentential adnominal|x noun, a.k.a. relative clause (opaque)
15 V|xV sentential adverbial|x verb (opaque)
16 V−xV extrapropositional −x coordination (transparent)
7.6.6 Content analysis corresponding to a 20-word sentence
[Figure: content analysis of the 20-word sentence That the little black dog found the bone quickly persuaded the pretty woman to buy a nice new collar. — (i) SRG, (ii) signature, (iii) NAG with arcs 0-24, (iv) surface realization.]
8. Simultaneous Amalgamation
8.1 Intuitive Outline of LA-Content
8.1.1 Simultaneous amalgamation of content corresponding to English The little black dog found the bone quickly.
[Figure: amalgamation in five steps — 1. N/V (dog, find); 2. A|N (little ... dog); 3. N\V (bone); 4. A|V (quickly); 5. A−A (little−black).]
8.1.2 Steps of an elementary amalgamation
1. Raw data provided by the agent's visual, auditory, and other perception components are classified by concept types provided by the agent's memory, based on the principle of best match – as in a Rorschach test.
2. The instantiated concept tokens are embedded into N, V, or A proplet shells (6.6.1, 6.6.6).
3. Selected pairs of nodes resulting from 2 are connected such that they form one of the 16 elementary signatures defined in 7.6.4 and 7.6.5.
8.1.3 Unambiguous and ambiguous input to signatures
Unambiguous match for {A, N}: 3. A|N (e.g., little|dog)
Unambiguous match for {N, N}: 5. N−N (e.g., man−woman)
Unambiguous match for {A, A}: 7. A−A (e.g., little−black)
Ambiguous match for {A, V}: 4. A|V (e.g., beautifully|sing); 11. A\V (e.g., in vase\put)
Ambiguous match for {N, V}: 1. N/V (e.g., John/gave); 2. N\V (e.g., Mary\gave); 10. V|N (e.g., burning|fire, to help|desire); 14. V|xN (e.g., who loves|Mary)
Ambiguous match for {V, V}: 6. V−V (e.g., walk−talk); 8. V/V (e.g., to err/is); 9. V\V (e.g., to read\try); 12. V/xV (e.g., that bark/surprise); 13. V\xV (e.g., that bark\hear); 15. V|xV (e.g., when bark|smile); 16. V−xV (e.g., sleep−read)
8.1.4 Interaction of five cognitive procedures
[Figure: interaction of five cognitive procedures — (i) time-linear hear mode from external language surface (input) to order-free content; (ii) navigating and (iii) inferencing over the order-free content; (iv) time-linear speak mode to external language surface (output); (v)/(vi) external non-language input and output.]
8.2 Formal Definition of LA-Content
8.2.1 Definition of LA-content
ST_S =def {([RA: α] {rules 1-16})}

1. N/V: [noun: α, fnc: , prn: ] [verb: β, arg: , prn: ]  ⇒  [noun: α, fnc: β, prn: K] [verb: β, arg: α, prn: K]   {rules 2-6 and 8-16}

2. N\V: [noun: α, fnc: , prn: ] [verb: β, arg: X, prn: ]  ⇒  [noun: α, fnc: β, prn: K] [verb: β, arg: X α, prn: K]   {rules 1-6 and 8-16}

3. A|N: [adj: α, mdd: , prn: ] [noun: β, mdr: , prn: ]  ⇒  [adj: α, mdd: β, prn: K] [noun: β, mdr: α, prn: K]   {rules 1-2, 4-7, 10-11, and 14}

4. A|V: [adj: α, mdd: , prn: ] [verb: β, mdr: , prn: ]  ⇒  [adj: α, mdd: β, prn: K] [verb: β, mdr: α, prn: K]   {rules 1-4 and 6-16}
5. N−N: [noun: α, nc: , prn: ] [noun: β, pc: , prn: ]  ⇒  [noun: α, nc: β, prn: K] [noun: β, pc: α, prn: K]   {rules 1-3, 5, 10, 14}

6. V−V: [verb: α, nc: , prn: ] [verb: β, pc: , prn: ]  ⇒  [verb: α, nc: β, prn: K] [verb: β, pc: α, prn: K]   {rules 1-2, 4, 6, 8-16}

7. A−A: [adj: α, nc: , prn: ] [adj: β, pc: , prn: ]  ⇒  [adj: α, nc: β, prn: K] [adj: β, pc: α, prn: K]   {rules 3-4, 7, 11}

8. V/V: [verb: α, arg: , prn: ] [verb: β, arg: , prn: ]  ⇒  [verb: α, arg: , fnc: β, prn: K] [verb: β, arg: α, prn: K]   {rules 2, 4, 6, 9-16}

9. V\V: [verb: α, arg: , prn: ] [verb: β, arg: X, prn: ]  ⇒  [verb: α, arg: , fnc: β, prn: K] [verb: β, arg: X α, prn: K]   {rules 1-2, 4, 6, 8-16}
10. V|N: [verb: α, arg: , prn: ] [noun: β, mdr: , prn: ]  ⇒  [verb: α, mdd: β, prn: K] [noun: β, mdr: α, prn: K]   {rules 1-6 and 8-16}

11. A\V: [adj: α, mdd: , prn: ] [verb: β, arg: X, prn: ]  ⇒  [adj: α, fnc: β, prn: K] [verb: β, arg: X α, prn: K]   {rules 1-4 and 6-16}

12. V/xV: [verb: α, arg: , prn: ] [verb: β, arg: , prn: ]  ⇒  [verb: α, arg: , fnc: (β K+1), prn: K] [verb: β, arg: (α K), prn: K+1]   {rules 1-2, 4, 6, 8-11, 13-16}

13. V\xV: [verb: α, arg: , prn: ] [verb: β, arg: , prn: ]  ⇒  [verb: α, arg: , fnc: (β K+1), prn: K] [verb: β, arg: X (α K), prn: K+1]   {rules 1-2, 4, 6, 8-16}

14. V|xN: [verb: α, arg: , prn: ] [noun: β, mdr: , prn: ]  ⇒  [verb: α, arg: , mdd: (β K), prn: K+1] [noun: β, mdr: (α K+1), prn: K]   {rules 1-6 and 8-16}
15. V|xV: [verb: α, mdr: , prn: ] [verb: β, mdr: X, prn: ]  ⇒  [verb: α, mdr: , mdd: (β K+1), prn: K] [verb: β, mdr: X (α K), prn: K+1]   {rules 1-2, 4, 6, 8-16}

16. V−xV: [verb: α, nc: , prn: ] [verb: β, pc: , prn: ]  ⇒  [verb: α, nc: (β K+1), prn: K] [verb: β, pc: (α K), prn: K+1]   {rules 1-2, 4, 6, 8-16}

ST_F =def {([cat: X] rp_{rules 1-16})}
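Each rule cross-copies core values between two order-free proplets under a shared prn value. A minimal sketch of rule 1 (N/V), under the assumptions that proplets are Python dicts and that the rule format is reduced to its value-copying effect:

# Applying LA-content rule 1 (N/V) of 8.2.1.

def n_slash_v(noun_p, verb_p, K):
    noun_out = dict(noun_p, fnc=verb_p["verb"], prn=K)   # copy verb core into fnc
    verb_out = dict(verb_p, arg=[noun_p["noun"]], prn=K) # copy noun core into arg
    return noun_out, verb_out

noun = {"noun": "dog", "fnc": None, "prn": None}
verb = {"verb": "find", "arg": None, "prn": None}
print(n_slash_v(noun, verb, K=12))
# ({'noun': 'dog', 'fnc': 'find', 'prn': 12},
#  {'verb': 'find', 'arg': ['dog'], 'prn': 12})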
8.3 Linear Complexity of LA-Content
8.3.1 Recursive structures in natural language
1. Intrapropositional coordination. Examples: (i) The man, the woman, the child, ..., and the cat (noun coordination); cf. rule 5 (N−N) in 8.2.1. (ii) Peter bought, peeled, cooked, cut, spiced, served, ..., and ate the potatoes (verb coordination); cf. rule 6 (V−V). (iii) The fuzzy clever little black hungry ... dog (adnominal coordination); cf. rule 7 (A−A).
2. Extrapropositional coordination. Example: Julia slept. Susanne sang. John read.; cf. rule 16 (V−xV) in 8.2.1, 3.2.5 for a proplet representation, and 7.4.4 for a DBS analysis.
3. Iterated object sentences. Example: John said that Bill believes that Mary suspects that Suzy knows that Lucy loves Tom; cf. rule 13 (V\xV) in 8.2.1, and 9.4.1 for a DBS analysis. Related are the constructions of unbounded or long distance dependency, such as Who did John say that Bill believes that Mary suspects that Suzy knows that Lucy loves?, which are analyzed in 9.4.2. Iterated object sentences may also serve as a subject sentence, as in That Bill believes that Mary suspects that Suzy knows that Lucy loves Tom surprised John.
4. Iterated relative clauses. Example: The man who loves the woman who feeds the child who has a cat is sleeping; cf. rule 14 (V|xN) in 8.2.1, and 9.3.3 for a DBS analysis.
5. Gapping constructions. Examples: (i) Bob ate an apple, walked the dog, read the paper, had a beer, called Mary, ..., and took a nap. (subject gapping); cf. rule 1 (N/V) in 8.2.1 and 9.5.5 for a DBS analysis. (ii) Bob ate an apple, Jim a pear, Bill a peach, Suzy some grapes, ..., and Tom a tomato. (verb gapping); cf. rules 1 (N/V) and 2 (N\V), and 9.5.3 for a DBS analysis. (iii) Bob bought, Jim peeled, Bill sliced, Peter served, and Suzy ate the peach (object gapping); cf. rule 2 (N\V), and 9.6.3 for a DBS analysis. (iv) Bob ate the red, the green, and the blue berries. (noun gapping); cf. rule 3 (A|N), and 9.6.5 for a DBS analysis.
6. Iterated prepositional phrases. Example: Julia ate the apple on the table behind the tree in the garden ...; cf. rule 3 (A|N) in 8.2.1, and 7.2.4 for a partial DBS analysis.
8.4 Infinitive Content Constructions
8.4.1 Hear mode derivation of John tried to read a book
[Figure: time-linear hear mode derivation — lexical lookup assigns proplets to John, tried, to read, a, book, and the period; syntactic-semantic parsing steps 1-6 cross-copy values, yielding the content shown in 8.4.2.]
8.4.2 Content representation of an infinitive construction
[noun: John, cat: nm, sem: sg, fnc: try, prn: 32]
[verb: try, cat: decl, arg: John read, mdr: , prn: 32]
[verb: read, cat: inf, fnc: try, arg: John book, prn: 32]
[noun: book, cat: snp, sem: indef sg, fnc: read, prn: 32]
8.4.3 Schema characterizing an elementary V\V signature
(9) verb\verb: [verb: β, fnc: α, arg: γ X] [verb: α, arg: γ β Y]
(to read / try — examples of matching proplets for illustration only)
8.4.4 DBS graph analysis of a content corresponding to John tried to read a book
[Figure: (i) SRG: try over John and read, read over book; (ii) signature: V over N and V, the embedded V over N; (iii) NAG with arcs 1-6; (iv) surface realization: John tried to__read a__book . (arcs 1-6).]
8.5 Selectional Constellations of Elementary Signatures
8.5.1 Try class infinitive constructions
1. nominal object: John tried a cookie.
2. one-place infinitive object: John tried to sleep.
3. two-place inf. object: John tried to read a book.
4. three-place inf. object: John tried to give Mary a kiss.
5. inf. with prepositional object: Julia tried to put the flower in a vase.
6. inf. with object sentence recursion: Julia tried to say that Bill believes that Mary suspects that Susy knows that Lucy loves Tom. ...
8.5.2 Definition of try class infinitives
verb\verb: [verb: β, fnc: α, arg: γ X] [verb: α, arg: γ β]        noun\verb: [noun: β, fnc: α] [verb: α, arg: γ β]
(to read / try and cookie / try — examples of matching proplets, for illustration only)
where α ε {begin, can afford, choose, decide, expect, forget, learn, like, manage, need, offer, plan, prepare, refuse, start, try, want}
Selectional constellations:
matrix verb α      subject γ                                          infinitival object β
begin 18 992       people 204, men 55, government 54, number 49, ...  feel 528, be 492, take 371, ...
can afford 1 841   ...                                                ...
...
matrix verb α      subject γ                                          nominal object β
begin 6 642        government 32, people 26, commission 24, ...       work 206, career 141, life 113, ...
can afford 1 542   ...                                                ...
...
8.5.3 Sorting nominal decide objects into a Corpus Word Bank
member proplets                                                                                    owner proplets
... [noun: case, fnc: decide, frq: 65] ...                                                         [noun: case]
[verb: decide, arg: X game, frq: 4] [verb: decide, arg: X dispute, frq: 7] [verb: decide, arg: X fate, frq: 47] [verb: decide, arg: X case, frq: 65] [verb: decide, arg: X issue, frq: 74]   [verb: decide]
... [noun: dispute, fnc: decide, frq: 7] ...                                                       [noun: dispute]
[noun: fate, fnc: decide, frq: 47]                                                                 [noun: fate]
... [noun: game, fnc: decide, frq: 4] ...                                                          [noun: game]
... [noun: issue, fnc: decide, frq: 74] ...                                                        [noun: issue]
8.5.4 Formal query and answer 1
query: [verb: decide, arg: X case, frq: ?]      result: [verb: decide, arg: X case, frq: 65]

8.5.5 Formal query and answer 2
query: [noun: case, fnc: decide, frq: ?]        result: [noun: case, fnc: decide, frq: 65]

8.5.6 Formal query and answer 3
query: [verb: decide, arg: ?, frq: ?]           result: [noun: issue, fnc: decide, frq: 74] [noun: case, fnc: decide, frq: 65] [noun: fate, fnc: decide, frq: 57] [noun: dispute, fnc: decide, frq: 7] [noun: game, fnc: decide, frq: 4]
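A minimal sketch of these formal queries, under the assumptions that the Corpus Word Bank is a flat list of proplet dicts and that '?' marks a requested value; the query semantics (match all fixed attributes, return hits by descending frequency) follows the pattern of 8.5.4-8.5.6.

WORD_BANK = [
    {"verb": "decide", "arg": "X game",    "frq": 4},
    {"verb": "decide", "arg": "X dispute", "frq": 7},
    {"verb": "decide", "arg": "X fate",    "frq": 47},
    {"verb": "decide", "arg": "X case",    "frq": 65},
    {"verb": "decide", "arg": "X issue",   "frq": 74},
]

def query(pattern, bank):
    """Return proplets matching every non-'?' attribute of the pattern."""
    fixed = {k: v for k, v in pattern.items() if v != "?"}
    hits = [p for p in bank if all(p.get(k) == v for k, v in fixed.items())]
    return sorted(hits, key=lambda p: -p["frq"])

print(query({"verb": "decide", "arg": "X case", "frq": "?"}, WORD_BANK))
print(query({"verb": "decide", "arg": "?", "frq": "?"}, WORD_BANK))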
8.6 Appear, Promise, and Persuade Class Infinitives
8.6.1 Content structures corresponding to infinitives
1. Infinitive as subject: To err is human, to forgive divine.
2. Infinitive as object: John tried to read a book.
3. Infinitive as adnominal modifier: the desire to help
4. Bare infinitive: Peter saw the accident happen.
8.6.2 Bare infinitive: Peter saw the accident happen.
[Figure: (i) SRG: see over Peter and happen, happen over accident; (ii) signature: V over N and V, the embedded V over N; (iii) NAG with arcs 1-6; (iv) surface realization: Peter(1) saw(2) the__accident(3-4) happen(5) .(6).]
8.6.3 Appear class infinitive constructions
1. nominal object: *John appeared a cookie.
2. one-place infinitive object: John appeared to sleep. ... (as in 8.5.1)
8.6.4 Definition of appear class infinitives
verb\verb: [verb: β, fnc: α, arg: γ X] [verb: α, arg: γ β]
(to sleep / appear — examples of matching proplets for illustration only)
where α ε {agree, appear, be able, seem, tend}
Selectional constellations: [omitted]
8.6.5 Promise class infinitive constructions
1. nominal object: John promised Mary a cookie.
2. one-place infinitive object: John promised Mary to sleep. ... (as in 8.5.1)
8.6.6 Definition of promise class infinitives
[noun: δ, fnc: α] [verb: β, fnc: α, arg: γ X] [verb: α, arg: γ δ β]        [noun: δ, fnc: α] [noun: β, fnc: α] [verb: α, arg: γ δ β]
(Mary / to sleep / promise and Mary / cookie / promise — examples of matching proplets for illustration only)
where α ε {offer, promise, threaten}
Selectional constellations: [omitted]
8.6.7 Persuade class infinitive constructions
1. nominal object: *John persuaded Mary a cookie.
2. one-place infinitive object: John persuaded Mary to sleep. ... (as in 8.5.1)
8.6.8 Definition of persuade class infinitives
[noun: δ, fnc: α] [verb: β, fnc: α, arg: δ X] [verb: α, arg: γ β δ]
(Mary / to sleep / persuade — examples of matching contents, for illustration only)
where α ε {advise, allow, appoint, ask, beg, choose, convince, encourage, expect, forbid, force, invite, need, permit, persuade, select, teach, tell, urge, want, would like}.
Selectional constellations: [omitted]
8.6.9 Object control in John persuaded Mary to read a book.
[Figure: (i) SRG: persuade over John, Mary, and read, read over book; (ii) signature; (iii) NAG with arcs 1-8; (iv) surface realization: John(1) persuaded(2) Mary(3) to__read(4-5) a__book(6) .(7-8).]
8.6.10 Corresponding proplet representation
[noun: John, cat: nm, sem: sg, fnc: persuade, prn: 36]
[verb: persuade, cat: decl, sem: past, arg: John read Mary, prn: 36]
[noun: Mary, cat: nm, sem: sg, fnc: read, prn: 36]
[verb: read, cat: inf, fnc: persuade, arg: Mary book, prn: 36]
[noun: book, cat: snp, sem: indef sg, fnc: read, prn: 36]
8.6.11 Subject control in John promised Mary to sleep.
[Figure: (i) SRG: promise over John, Mary, and sleep; (ii) signature; (iii) NAG with arcs 1-6; (iv) surface realization: John(1) promised(2) Mary(3) to__sleep(4-5) .(6).]
8.6.12 Proplet representation of John promised Mary to sleep.
[noun: John, cat: nm, sem: sg, fnc: promise, prn: 35]
[verb: promise, cat: decl, sem: past, arg: John Mary sleep, prn: 35]
[noun: Mary, cat: nm, sem: sg, fnc: promise, prn: 35]
[verb: sleep, cat: inf, fnc: promise, arg: John, prn: 35]
9. Graph Theory
9.1 Content Analysis as Undirected and Directed Graphs
9.1.1 The “complete” n=4 graph K4
[Figure: the complete graphs K1, K2, K3, and K4.]
9.1.2 Same number of nodes but different degrees
[Figure: two SRGs and signatures with four nodes each but different degrees — The little girl ate an apple. (eat over girl and apple, little under girl; degrees 2 2 1 1) vs. The man gave the child an apple (give over man, child, apple; degrees 3 1 1 1).]
9.1.3 DBS graph analysis of a content: The dog found a bone.
[Figure: (i) SRG: find over dog and bone; (ii) signature: V over N N; (iii) NAG with arcs 0-4; (iv) surface realization: The_dog(1) found(2) a_bone(3) .(4).]
9.1.4 Graph-theoretical constraints on wellformed NAGs
1. The signature must be simple, i.e., there must be no loops or multiple lines.
2. The NAG must be symmetric, i.e., for every arc connecting some nodes A and B, there must be an arc from B to A.
3. The traversal of arcs must be continuous, i.e., combining the traversal of arc x from A to B and of arc y from C to D is permitted only if B = C.
4. The numbering of arcs must be exhaustive, i.e., there must exist a navigation which traverses each arc.
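A minimal sketch checking constraints 2-4 on a given NAG (constraint 1 concerns the underlying signature and is not coded here); the representation of a NAG as a list of numbered directed arcs is an assumption for illustration:

def check_nag(arcs):
    """arcs: list of (number, from_node, to_node), numbered 1..n."""
    pairs = {(a, b) for _, a, b in arcs}
    symmetric = all((b, a) in pairs for a, b in pairs)                    # 2
    ordered = sorted(arcs)                                                # by number
    continuous = all(x[2] == y[1] for x, y in zip(ordered, ordered[1:]))  # 3
    exhaustive = {n for n, _, _ in arcs} == set(range(1, len(arcs) + 1))  # 4
    return symmetric, continuous, exhaustive

# NAG of 9.1.3 (The dog found a bone.):
nag = [(1, "find", "dog"), (2, "dog", "find"),
       (3, "find", "bone"), (4, "bone", "find")]
print(check_nag(nag))   # (True, True, True)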
9.1.5 Possible NAGs for an n=3 semantic relations graph
[Figure: four possible NAGs (NAG 1-NAG 4) over the n=3 SRG find-dog-bone, differing in the order in which the arcs are numbered.]
9.1.6 Linguistic constraints on wellformed NAGs
1. An intrapropositional traversal must begin and end with the node which acts as the verb.
2. The initial verbal node must be entered either by arc 0 or by a corresponding arc from a preceding proposition (7.4.4).
3. The only lines permitted in a NAG are "/" (subject-verb), "\" (object-verb), "|" (modifier-modified), and "−" (conjunct-conjunct).
4. The only nodes permitted in a NAG based on a signature are N (noun), V (verb), and A (adjective).
9.1.7 Deriving English passive from content 9.1.3
[Figure: English passive derived from content 9.1.3 by an alternative NAG — (iv) surface realization: A_bone was_found by_the_dog . (arcs 1-4).]
9.2 Extrapropositional Coordination
9.2.1 Extrapropositional coordination of two propositions
[Figure: extrapropositional coordination of two propositions — leave over John and house, cross over John and street.]
9.2.2 Bidirectional pointering in the proplet representation
[noun: John, fnc: leave, prn: 1] [verb: leave, arg: John house, nc: (cross 2), pc: , prn: 1] [noun: house, fnc: leave, prn: 1]
[noun: John, fnc: cross, prn: 2] [verb: cross, arg: John street, nc: , pc: (leave 1), prn: 2] [noun: street, fnc: cross, prn: 2]
9.2.3 NAG for forward navigation through 9.2.1
[Figure: NAG for forward navigation through 9.2.1, arcs 0-9.]
9.2.4 NAG for backward navigation through 9.2.1
[Figure: NAG for backward navigation through 9.2.1, arcs 0-9.]
9.2.5 Intrapropositional noun coordination (subject)
[Figure: (i) SRG: sleep over the coordination man−woman−child; (ii) signature: V over N N N; (iii) NAG with arcs 1-6; (iv) surface realization: The_man(1) the_woman(2) and_the_child(3) slept_(4-5-6) .]
9.2.6 Extrapropositional functor-argument
[Figure: extrapropositional functor-argument That Fido barked amused Mary. — (i) SRG: amuse over bark (over Fido) and Mary; (ii) signature: V over V (over N) and N; (iii) NAG with arcs 0-6; (iv) surface realization: That(1) Fido(2) barked(3) amused(4) Mary(5) .(6).]
9.2.7 Graph-theoretical alternatives for coordination
[Figure: graph-theoretical alternatives for coordinating NAG-1, NAG-2, NAG-3 — 1. unidirectional, 2. bidirectional symmetric, 3. bidirectional asymmetric.]
9.3 Sharing in Relative Clauses
9.3.1 Main clauses and equivalent relative clauses
[Figure: main clauses and equivalent relative clauses, sharing the noun man —
The man sleeps. / The man who sleeps (1-place verb)
The man loves a woman. / The man who loves a woman (2-place verb)
A woman loves the man. / The man whom a woman loves (2-place verb)
The man gives the woman a flower. / The man who gives the woman a flower (3-place verb)
The woman gives the man a kiss. / The man whom the woman gives a kiss (3-place verb)]
9.3.2 Relative clause center embedding
German: Der Mann, der die Frau, die das Kind füttert, liebt, singt.
Transliteration: the man who the woman who the child feeds loves sings
9.3.3 Graph analysis of center-embedded relative clauses
[Figure: (i) SRG: sing over man, love attached to man and woman, feed attached to woman and child; (ii) signature; (iii) NAG with arcs 1-10; (iv) German surface realization (center-embedded): Der_Mann(1) der(2) die_Frau(3) die(4) das_Kind(5) füttert(6) liebt(7-8) singt_(9-10) .]
9.3.4 English realization of content 9.3.3
The man who loves the woman who feeds the child sings.
9.3.5 English surface realization of relative clauses
surface realization (English, unmarked): The man(1) who_loves(2) the_woman(3) who_feeds(4) the_child(5) sings_(6-7-8-9-10) .
9.3.6 Surface realization with extraposed relative clause
surface realization (English, extraposed, marked): The_man sings_ who_loves the_woman who_feeds the_child . (arcs 1-10, with sings realized early via a multiple visit)
9.3.7 Constraint on multiple visits, variant I
In content navigation without language realization, a multiple visit is
1. permitted if there are still untraversed arcs in the graph,
2. prohibited if all arcs in the graph have been traversed.
9.3.8 Constraint on multiple visits, variant II
In content navigation with language realization, a multiple visit is
1. permitted if a required function word has not yet been realized,
2. prohibited if there is no value remaining in the current proplet set which has not already been used exhaustively for realization.
9.4 Unbounded Dependencies
9.4.1 DBS graph analysis for John said that Bill believes that Mary loves Tom.
[Figure: (i) SRG and (ii) signature: say over John and believe, believe over Bill and love, love over Mary and Tom; (iii) NAG with arcs 0-12; (iv) surface realization: John(1) said(2) that(3) Bill(4) believes(5) that(6) Mary(7) loves(8) Tom(9) .(10-11-12).]
9.4.2 NAG and surface realization of Whom did John say that Bill believes that Mary loves?
[Figure: the NAG of 9.4.1 with the Tom node replaced by WH; the sentence-initial Whom is realized by a navigation traversing arcs across all three propositions (unbounded dependency), followed by did John say that Bill believes that Mary loves ?]
9.4.3 Proplet representation of 9.4.2
[verb: say, arg: John (believe 7), prn: 6] [noun: John, fnc: say, prn: 6]
[verb: believe, arg: Bill (love 8), fnc: (say 6), prn: 7] [noun: Bill, fnc: believe, prn: 7]
[verb: love, arg: Mary WH, fnc: (believe 7), prn: 8] [noun: Mary, fnc: love, prn: 8] [noun: WH, fnc: love, prn: 8]
9.5 Subject Gapping and Verb Gapping
9.5.1 Subject gapping
Bob ate an apple, # walked the dog, and # read the paper.
9.5.2 Verb gapping
Bob ate an apple, Jim # a pear, and Bill # a peach.
9.5.3 DBS graph analysis of verb gapping in 9.5.2
[Figure: verb gapping — (i) SRG and (ii) signature: one V (eat) over the noun pairs Bob/apple, Jim/pear, Bill/peach; (iii) NAG with arcs 1-12; (iv) surface realization: Bob(1) ate(2) an_apple(3) Jim(4-5) a_pear(6-7) and(8) Bill(9) a_peach(10-11) .(12).]
9.5.4 Verb gapping content as a set of proplets
[noun: Bob, cat: nm, sem: sg, fnc: eat, prn: 31]
[verb: eat, cat: decl, sem: past, arg: Bob apple, Jim pear, Bill peach, prn: 31]
[noun: apple, cat: snp, sem: indef sg, fnc: eat, prn: 31]
[noun: Jim, cat: nm, sem: sg, fnc: eat, prn: 31]
[noun: pear, cat: snp, sem: indef sg, fnc: eat, prn: 31]
[noun: Bill, cat: nm, sem: sg, fnc: eat, prn: 31]
[noun: peach, cat: snp, sem: indef sg, fnc: eat, prn: 31]
9.5.5 DBS graph analysis of subject gapping in 9.5.1
[Figure: subject gapping — (i) SRG and (ii) signature: the shared subject Bob with eat/apple, walk/dog, read/paper; (iii) NAG with arcs 1-12; (iv) surface realization: Bob(1) ate(2) an_apple(3) walked(4-1-5) the_dog(6) and_read(7-8-9) the_paper(10) .(11-12-2).]
9.5.6 Proplet representation of subject gapping
[noun: Bob, cat: nm, sem: sg, fnc: eat walk read, prn: 32]
[verb: eat, cat: decl, sem: past, arg: Bob apple, prn: 32]
[noun: apple, cat: snp, sem: indef sg, fnc: eat, prn: 32]
[verb: walk, cat: decl, sem: past, arg: Bob dog, prn: 32]
[noun: dog, cat: snp, sem: def sg, fnc: walk, prn: 32]
[verb: read, cat: decl, sem: past, arg: Bob paper, prn: 32]
[noun: paper, cat: snp, sem: def sg, fnc: read, prn: 32]
9.6 Object Gapping and Noun Gapping
9.6.1 Object gapping:
Bob bought #, Jim peeled #, and Bill ate the apple.
9.6.2 Noun gapping:
Bob ate the red #, the green #, and the blue berries.
9.6.3 DBS graph analysis of object gapping in 9.6.1
[Figure: object gapping — (i) SRG and (ii) signature: buy, peel, eat each over a subject (Bob, Jim, Bill) and the shared object apple; (iii) NAG with arcs 1-12; (iv) surface realization: Bob(1) bought(2) Jim(3-4-5) peeled(6) and(7-8) Bill(9) ate(10) the_apple(11) .(12).]
9.6.4 Proplet representation of object gapping
[noun: Bob, cat: nm, sem: sg, fnc: buy, prn: 33]
[verb: buy, cat: decl, sem: past, arg: Bob apple, prn: 33]
[noun: Jim, cat: nm, sem: sg, fnc: peel, prn: 33]
[verb: peel, cat: decl, sem: past, arg: Jim apple, prn: 33]
[noun: Bill, cat: nm, sem: sg, fnc: eat, prn: 33]
[verb: eat, cat: decl, sem: past, arg: Bill apple, prn: 33]
[noun: apple, cat: snp, sem: def sg, fnc: buy peel eat, prn: 33]
9.6.5 DBS graph analysis of noun gapping in 9.6.2
[Figure: noun gapping — (i) SRG and (ii) signature: eat over Bob and berry, with red, green, blue under berry; (iii) NAG with arcs 1-10; (iv) surface realization: Bob(1) ate(2) the(3) red(4) the(5) green(6) and_the(7) blue(8) berries(9) .(10).]
9.6.6 Proplet representation of noun gapping
[noun: Bob, cat: nm, sem: sg, fnc: eat, prn: 34]
[verb: eat, cat: decl, sem: past, arg: Bob berry, prn: 34]
[noun: berry, cat: snp, sem: def pl, mdr: red green blue, fnc: eat, prn: 34]
[adj: red, cat: adn, sem: pos, mdd: berry, prn: 34]
[adj: green, cat: adn, sem: pos, mdd: berry, prn: 34]
[adj: blue, cat: adn, sem: pos, mdd: berry, prn: 34]
9.6.7 DBS graph analysis of adnominal coordination: Bob ate the red, green, and blue berries.
[Figure: (i) SRG and (ii) signature: eat over Bob and berry, with the coordination red−green−blue under berry; (iii) NAG with arcs 1-10; (iv) surface realization: Bob(1) ate(2) the(3) red(4) green(5) and__blue(6) berries(7-8-9) .(10).]
9.6.8 Proplet representation of adnominal coordination
[noun: Bob, cat: nm, sem: sg, fnc: eat, prn: 35]
[verb: eat, cat: decl, sem: past, arg: Bob berry, prn: 35]
[noun: berry, cat: pnp, sem: def pl, mdr: red, fnc: eat, prn: 35]
[adj: red, cat: adn, sem: pos, mdd: berry, nc: green, pc: , prn: 35]
[adj: green, cat: adn, sem: pos, mdr: , nc: blue, pc: red, prn: 35]
[adj: blue, cat: adn, sem: pos, mdr: , nc: , pc: green, prn: 35]
9.6.9 Combination of object gapping and noun gapping
Bob bought #, Jim peeled #, and Bill ate the red #, the green #, and the yellow apple.
10. Computing Perspective in Dialogue
10.1 Agent’s STAR-0 Perspective on Current Content
10.1.1 Anchored nonlanguage content
I am writing you a letter. (STAR-0)
10.1.2 Coding unanchored content as a proplet set
[noun: moi, fnc: write, prn: 659] [verb: write, arg: moi toi letter, prn: 659] [noun: toi, fnc: write, prn: 659] [noun: letter, fnc: write, prn: 659]
10.1.3 Specification of a STAR
S = Paris
T = 1930-07-03
A = Jean-Paul Sartre
R = Simone de Beauvoir
10.1.4 STAR-0 content with 1st and 2nd person indexicals
[noun: moi, fnc: write, prn: 659] [verb: write, arg: moi toi letter, prn: 659] [noun: toi, fnc: write, prn: 659] [noun: letter, fnc: write, prn: 659] [S: Paris, T: 1930-07-03, A: J.-P. Sartre, R: S. de Beauvoir, prn: 659]
10.1.5 STAR-0 content without indexicals
[noun: Fido, fnc: bark, prn: 572] [verb: bark, arg: Fido, prn: 572] [S: Paris, T: 1930-07-03, A: S. de Beauvoir, prn: 572]
10.1.6 STAR-0 content with a 3rd person indexical
[noun: ça, fnc: bark, prn: 572] [verb: bark, arg: ça, prn: 572] [S: Paris, T: 1930-07-03, A: S. de Beauvoir, 3rd: Fido, prn: 572]
10.2 Speaker’s STAR-1 Perspective on Stored Content
10.2.1 STAR-1 expression with 1st and 2nd person indexicals
I wrote you a letter yesterday. (STAR-1)
10.2.2 STAR-1.1 inference for temporal specification
[verb: α, prn: K]  [S: L, T: D, A: N, R: O, prn: K]  [S: L′, T: D′, A: N, R: O′, prn: K+M]
⇒
[verb: α, sem: β, mdr: γ, prn: K+M]  [adj: γ, mdd: α, prn: K+M]  [S: L′, T: D′, A: N, R: O′, prn: K+M]
If D < D′, then β = past, and if D diff D′ = 1 day, then γ = yesterday; and similarly for all the other possible temporal relations between a STAR-0 and a STAR-1 differing in their T value.
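A minimal sketch of the temporal condition, coding only the past/yesterday case stated above (the "similarly for all other temporal relations" clause is left open); the function name and the use of Python's datetime are illustrative assumptions:

from datetime import date

def temporal_values(d_star0: date, d_star1: date):
    """Return (beta, gamma) for a STAR-0 date D and a STAR-1 date D'."""
    beta = "past" if d_star0 < d_star1 else "pres"
    gamma = "yesterday" if (d_star1 - d_star0).days == 1 else None
    return beta, gamma

print(temporal_values(date(1930, 7, 3), date(1930, 7, 4)))
# ('past', 'yesterday')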
10.2.3 STAR-1 content Moi wrote toi a letter yesterday.
[noun: moi, fnc: write, prn: 659+7] [verb: write, arg: moi toi letter, sem: past, mdr: yesterday, prn: 659+7] [noun: toi, fnc: write, prn: 659+7] [noun: letter, fnc: write, prn: 659+7] [adj: yesterday, mdd: write, prn: 659+7] [S: Paris, T: 1930-07-04, A: J.-P. Sartre, R: S. de Beauvoir, prn: 659+7]
10.2.4 STAR-1.3 inference for specification of recipient
[verb: α, arg: {X toi}, prn: K] [noun: toi, fnc: α, prn: K] [S: L, T: D, A: N, R: O, prn: K] [S: L′, T: D′, A: N, R: O′, prn: K+M]
⇒
[verb: α, arg: {X O}, prn: K+M] [noun: O, fnc: α, prn: K+M] [S: L′, T: D′, A: N, R: O′, prn: K+M]
10.2.5 STAR-1 content Moi wrote Simone a letter yesterday.
[noun: moi, fnc: write, prn: 659+7] [verb: write, arg: moi Simone letter, sem: past, mdr: yesterday, prn: 659+7] [noun: Simone, fnc: write, prn: 659+7] [noun: letter, fnc: write, prn: 659+7] [adj: yesterday, mdd: write, prn: 659+7] [S: Paris, T: 1930-07-04, A: J.-P. Sartre, R: Juliette, prn: 659+7]
10.3 Hearer’s STAR-2 Perspective on Language Content
10.3.1 Result of analyzing 10.2.1 in the hear mode
[noun: moi, fnc: write, prn: 623] [verb: write, arg: moi toi letter, sem: past, mdr: yesterday, prn: 623] [noun: toi, fnc: write, prn: 623] [noun: letter, fnc: write, prn: 623] [adj: yesterday, mdd: write, prn: 623]
10.3.2 Main hear mode perspectives on language content
1. The perspective of the hearer as the partner in face-to-face communication.
2. The perspective of someone overhearing a conversation between others.
3. The reader’s perspective onto the content of a written text (Chap. 11.).
10.3.3 STAR-2.1 inference for deriving hearer perspective
[noun: moi, fnc: α, prn: K] [verb: α, arg: {X moi toi}, prn: K] [noun: toi, fnc: α, prn: K] [S: L, T: D, A: N, R: O, prn: K]
⇒
[noun: toi, fnc: α, prn: K] [verb: α, arg: {X toi moi}, prn: K] [noun: moi, fnc: α, prn: K] [S: L, T: D, A: O, R: N, prn: K]
10.3.4 STAR-2 content Toi wrote moi a letter yesterday.
[noun: toi, fnc: write, prn: 623] [verb: write, arg: toi moi letter, sem: past, mdr: yesterday, prn: 623] [noun: moi, fnc: write, prn: 623] [noun: letter, fnc: write, prn: 623] [adj: yesterday, mdd: write, prn: 623] [S: Paris, T: 1930-07-04, A: Simone de B., R: J.-P. Sartre, prn: 623]
10.3.5 STAR-1 content without indexicals
Fido barked. (STAR-1)
10.3.6 STAR-2.2 inference for content without indexicals
[verb: α, prn: K] [S: L, T: D, A: N, R: O, prn: K]  ⇒  [verb: α, prn: K] [S: L, T: D, A: O, R: N, prn: K]
10.3.7 STAR-2 content Fido barked.
[noun: Fido, fnc: bark, prn: 572] [verb: bark, arg: Fido, sem: past, prn: 572] [S: Paris, T: 1930-07-03, A: J.-P. Sartre, R: Simone de B., prn: 572]
10.3.8 Operations of STAR-2 inferences
1. The S value of the STAR-1 in the input (matching the antecedent) equals the S value of the STAR-2 in the output (derived by the consequent).
2. The T value of the STAR-1 in the input equals the T value of the STAR-2 in the output.
3. The A value of the STAR-1 in the input equals the R value of the STAR-2 in the output.
4. The R value of the STAR-1 in the input equals the A value of the STAR-2 in the output.
5. The prn value of the input equals the prn value of the output.
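These five operations amount to keeping S, T, and prn while swapping A and R. A minimal sketch, assuming a STAR proplet is modeled as a Python dict:

def star2(star1):
    """Derive the hearer's STAR-2 from the speaker's STAR-1 (10.3.8)."""
    return dict(star1, A=star1["R"], R=star1["A"])   # S, T, prn unchanged

star1 = {"S": "Paris", "T": "1930-07-04",
         "A": "J.-P. Sartre", "R": "Simone de B.", "prn": 623}
print(star2(star1))
# {'S': 'Paris', 'T': '1930-07-04', 'A': 'Simone de B.',
#  'R': 'J.-P. Sartre', 'prn': 623}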
10.4 Dialogue with a WH Question and Its Answer
10.4.1 Nonlanguage content in the interrogative mood
What did you write? (STAR-0)
10.4.2 Anchored STAR-0 content of WH interrogative
[noun: toi, fnc: write, prn: 625] [verb: write, cat: interrog, sem: past, arg: toi what, prn: 625] [noun: what, fnc: write, prn: 625] [S: Paris, T: 1930-07-04, A: Simone de B., R: J.-P. Sartre, prn: 625]
10.4.3 Questioner as speaker: DBS graph analysis of 10.4.1
[Figure: questioner as speaker — (i) SRG: write over toi and what; (ii) signature: V over N N; (iii) NAG with arcs 1-4; (iv) surface realization: What(1) did(2) you(3) write_?(4).]
10.4.4 Answerer as hearer parsing 10.4.1
[Figure: time-linear hear mode derivation of What did you write? — lexical lookup for What, did, you, write, and the question mark; parsing steps 1-4 result in [noun: what, cat: np3, fnc: write, prn: 668] [verb: write, cat: interrog, sem: past, arg: toi wh?, prn: 668] [noun: toi, cat: sp2, fnc: write, prn: 668].]
10.4.5 STAR-2.3 inference for deriving hearer perspective
[noun: toi, fnc: α, prn: K] [verb: α, cat: interrog, arg: {X toi wh}, prn: K] [noun: wh, fnc: α, prn: K] [S: L, T: D, A: N, R: O, prn: K]
⇒
[noun: moi, fnc: α, prn: K] [verb: α, cat: interrog, arg: {X moi wh}, prn: K] [noun: wh, fnc: α, prn: K] [S: L, T: D, A: O, R: N, prn: K]
10.4.6 Result of applying the STAR-2.3 inference to 10.4.4
[noun: moi, fnc: write, prn: 668] [verb: write, cat: interrog, sem: past, arg: moi wh, prn: 668] [noun: what, fnc: write, prn: 668] [S: Paris, T: 1930-07-04, A: J.-P. Sartre, R: Simone de B., prn: 668]
10.4.7 Answerer as speaker
A little poem. (STAR-1)
10.4.8 Answer to a WH question as a set of STAR-0 proplets
[noun: poem, sem: indef sg, mdr: little, prn: 655] [adj: little, mdd: poem, prn: 655] [S: Paris, T: 1930-07-03, A: J.-P. Sartre, R: Simone de B., prn: 655]
10.4.9 Questioner as hearer parsing 10.4.7
[Figure: time-linear hear mode derivation of A little poem. — lexical lookup for A, little, poem, and the period; parsing steps 1-3 absorb the determiner and attach the adnominal, yielding the answer content of 10.4.8 with the new prn value 626.]
10.4.10 STAR-2.4 connecting WH interrogative with answer
[verb: α, cat: interrog, arg: {X toi what}, prn: K] [noun: what, fnc: α, prn: K] [S: L, T: D, A: N, R: O, prn: K]
+
[noun: β, fnc: decl, mdr: γ, prn: K+M] [adj: γ, mdd: β, prn: K+M]
⇒
[verb: α, cat: decl, arg: {X toi β}, prn: K+1] [noun: β, mdr: γ, fnc: α, prn: K+M] [adj: γ, mdd: β, prn: K+M] [S: L, T: D, A: N, R: O, prn: K+1]
10.4.11 Questioner’s STAR-2 content for regaining balance
[noun: toi, fnc: write, prn: 625+2] [verb: write, cat: decl, sem: past, arg: toi poem, prn: 625+2] [noun: poem, fnc: write, mdr: little, prn: 625+2] [adj: little, mdd: poem, prn: 625+2] [S: Paris, T: 1930-07-04, A: Simone de B., R: J.-P. Sartre, prn: 625+2]
10.5 Dialogue with a Yes/No Question and Its Answer
10.5.1 STAR-0 content underlying language countermeasure
Is the poem about me? (STAR-0)
10.5.2 Answerer as hearer parsing a yes/no interrogative
[Derivation display: lexical lookup of Is the poem about me ?, followed by five steps of time-linear syntactic-semantic parsing, resulting in the proplet set below.]
[noun: poem, cat: snp, sem: def sg, fnc: about moi, prn: 669]
[verb: about moi, cat: be' interrog, sem: pres, arg: poem, prn: 669]
10.5.3 Answerer as hearer: revised perspective of 10.5.2
[noun: poem, cat: snp, sem: indef sg, fnc: about toi, prn: 669]
[verb: about toi, cat: be' interrog, sem: pres, arg: poem, prn: 669]
[S: Paris, T: 1930-07-04, A: J.-P. Sartre, R: Simone de B., prn: 669]
10.5.4 Answerer J.-P. as speaker
Yes. (STAR-1)
10.6 Dialogue with Request and Its Fulfillment
10.6.1 Anchored nonlanguage request content
(Please) pass the ashtray! (STAR-0)
10.6.2 Request STAR-0 content as a set of proplets
[verb: pass, cat: impv, sem: pres, arg: # ashtray, prn: 630]
[noun: ashtray, cat: snp, sem: def sg, fnc: pass, prn: 630]
[S: Paris, T: 1930-07-04, A: Simone de B., R: J.-P. Sartre, prn: 630]
10.6.3 Graph structure used by requestor as speaker
[Graph display: (i) semantic relations graph (SRG) connecting pass and ashtray; (ii) signature V over N; (iii) numbered arcs graph (NAG) with arcs 0, 1, 2; (iv) surface realization Pass the_ashtray ! along arcs 0, 1, 2.]
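In the speak mode, the numbered arcs of the NAG are traversed in order and each traversal may realize a surface. A minimal sketch, assuming the NAG of 10.6.3 is encoded as a mapping from arc numbers to surfaces (the encoding is an illustrative assumption, not the book's actual data structure):

# Traverse the numbered arcs in order; each arc may realize a surface
# (None would mark an empty traversal, as in multi-arc steps).
nag = {0: "Pass", 1: "the_ashtray", 2: "!"}

def realize(nag):
    return " ".join(nag[arc] for arc in sorted(nag) if nag[arc] is not None)

print(realize(nag))   # Pass the_ashtray !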
10.6.4 Requestee as hearer parsing Pass the ashtray!
[Derivation display: lexical lookup of Pass the ashtray !, followed by three steps of time-linear syntactic-semantic parsing, resulting in the proplet set below.]
[verb: pass, cat: impv, sem: pres, arg: # ashtray, prn: 671]
[noun: ashtray, cat: snp, sem: def sg, fnc: pass, prn: 671]
10.6.5 Request STAR-2 content as a set of proplets
[verb: pass, cat: impv, sem: pres, arg: # ashtray, prn: 671]
[noun: ashtray, cat: snp, sem: def sg, fnc: pass, prn: 671]
[S: Paris, T: 1930-07-04, A: J.-P. Sartre, R: Simone de B., prn: 671]
10.6.6 Sequence of elementary dialogues
J.-P. Sartre: I wrote you a letter yesterday. (statement, Sect. 10.1–10.3)
S. de Beauvoir: What did you write? (WH question, Sect. 10.4)
J.-P. Sartre: A little poem. (WH answer, Sect. 10.4)
S. de Beauvoir: Is the poem about me? (Yes/No question, Sect. 10.5)
J.-P. Sartre: Yes. (Yes/No answer, Sect. 10.5)
S. de Beauvoir: (Please) pass the ashtray! (request, Sect. 10.6)
J.-P. Sartre: fulfills request (fulfillment, Sect. 10.6)
10.6.7 Perspective conversions as time-linear sequences
1. Statement
   STAR-0: emergence of a nonlanguage content in agent A (Sect. 10.1)
   STAR-1: production of a statement by agent A as the speaker (Sect. 10.2)
   STAR-2: interpretation of the statement by agent B as the hearer (Sect. 10.3)

2. Question Dialogue (WH (Sect. 10.4) and Yes/No (Sect. 10.5) questions)
   STAR-0: emergence of a nonlanguage content in agent A as the questioner
   STAR-1: production of a question by agent A as the speaker
   STAR-2: interpretation of the question by agent B as the hearer
   STAR-1: production of an answer by agent B as the speaker
   STAR-2: interpretation of the answer by agent A as the hearer

3. Request Dialogue (Sect. 10.6)
   STAR-0: emergence of a nonlanguage content in agent A as the requestor
   STAR-1: production of a request by agent A as the speaker
   STAR-2: interpretation of the request by agent B as the hearer
   STAR-1: nonlanguage or language fulfillment action by agent B as the requestee
   STAR-2: nonlanguage or language fulfillment recognition by agent A as the requestor
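Read procedurally, 10.6.7 specifies each dialogue type as a fixed time-linear sequence of perspective conversions alternating between the two agents. A toy encoding (the data layout is assumed for illustration):

# Each dialogue type as a time-linear sequence of (conversion, agent) steps.
DIALOGUE_TYPES = {
    "statement": [("STAR-0", "A"), ("STAR-1", "A"), ("STAR-2", "B")],
    "question":  [("STAR-0", "A"), ("STAR-1", "A"), ("STAR-2", "B"),
                  ("STAR-1", "B"), ("STAR-2", "A")],
    "request":   [("STAR-0", "A"), ("STAR-1", "A"), ("STAR-2", "B"),
                  ("STAR-1", "B"), ("STAR-2", "A")],
}

for kind, steps in DIALOGUE_TYPES.items():
    print(f"{kind:9}", " -> ".join(f"{star}({agent})" for star, agent in steps))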
11. Computing Perspective in Text
11.1 Coding the STAR-1 in Written Text
11.1.1 Text with a dispersed coding of the STAR-1
Jan. 16th, 1832 – The neighbourhood of Porto Praya, viewed from the sea, wears a desolate aspect. The volcanic fire of past ages, and the scorching heat of the tropical sun, have in most places rendered the soil sterile and unfit for vegetation. The country rises in successive steps of table land, interspersed with some truncate conical hills, and the horizon is bounded by an irregular chain of more lofty mountains. The scene, as beheld through the hazy atmosphere of this climate, is one of great interest; if, indeed, a person, fresh from the sea, and who has just walked, for the first time, in a grove of cocoa-nut trees, can be a judge of anything but his own happiness.
Charles Darwin 1839, Voyage of the Beagle, p. 41
11.2 Direct Speech in Statement Dialogue
11.2.1 Direct speech in a statement content
John said to Mary: moi love toi. Mary said to John: moi love toi. (STAR-1)
11.2.2 Representing the content of 11.2.1 as a set of proplets
[noun: John, fnc: say, prn: 23]
[verb: say, arg: John to Mary (love 24), nc: (say 25), prn: 23]
[adj: to Mary, fnc: say, prn: 23]
[noun: moi, fnc: love, prn: 24]
[verb: love, arg: moi toi, fnc: (say 23), prn: 24]
[noun: toi, fnc: love, prn: 24]
[noun: Mary, fnc: say, prn: 25]
[verb: say, arg: Mary to John (love 26), pc: (say 23), prn: 25]
[adj: to John, fnc: say, prn: 25]
[noun: moi, fnc: love, prn: 26]
[verb: love, arg: moi toi, fnc: (say 25), prn: 26]
[noun: toi, fnc: love, prn: 26]
11.2.3 STAR-2.5 inference interpreting moi/toi in quoted speech
[verb: γ, arg: U δ Z (β K), prn: K-1]
[noun: α, fnc: β, prn: K]
[verb: β, arg: X α Y, fnc: (γ K-1), prn: K]
where α ∈ {moi, toi} and γ ∈ {say, tell, ...}
⇒ if U = NIL, then δ = moi; otherwise, δ = toi
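The STAR-2.5 inference resolves moi in a quoted clause to the subject of the embedding say/tell verb and toi to its recipient argument. The sketch below collapses the U = NIL case distinction into fixed argument positions and keeps the to-marker on the recipient; both are simplifying assumptions.

def resolve_quote(say, quote):
    """moi in the quote denotes the sayer (first arg of say); toi the
    recipient (the first non-clausal arg after it)."""
    sayer = say["arg"][0]
    recipient = next(a for a in say["arg"][1:] if not a.startswith("("))
    mapping = {"moi": sayer, "toi": recipient}
    return dict(quote, arg=[mapping.get(a, a) for a in quote["arg"]])

say   = {"verb": "say", "arg": ["John", "to Mary", "(love 24)"], "prn": 23}
quote = {"verb": "love", "arg": ["moi", "toi"], "fnc": "(say 23)", "prn": 24}
print(resolve_quote(say, quote))
# arg becomes ['John', 'to Mary']: in the first quote, John loves Mary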
11.2.4 Nonlanguage inferences for maintaining balance
1. R(eactor) inferences for recognizing an imbalance and deriving a countermeasure (5.2.1–5.2.3).
2. D(eductor) inferences for establishing meaning and event relations (5.3.1–5.3.3).
3. D inferences for creating summaries (5.3.5, 6.5.4, 6.5.5).
4. Consequence inferences CIN and CIP (6.3.6, 6.3.7).
5. Meta-inference for deriving up and down inferences for hierarchies (6.5.7).
6. The resulting inferences for performing upward and downward traversal in a hierarchy (6.5.9, 6.5.12).
7. E(ffector) inferences for deriving blueprints for action (5.2.1, 5.2.5, 5.5.5).
8. E inference for changing subjunctive to imperative content (5.6.1).
11.2.5 Language inferences for adjusting perspective
1. STAR-1 inferences deriving the speaker's perspective on Spatial, Temporal (10.2.2), and Recipient (10.2.4) aspects of STAR-0 contents.
2. STAR-2 inferences deriving the hearer's perspective on contents with (10.3.4, 10.4.5) and without (10.3.6) 1st and 2nd person indexicals.
3. STAR-2 inference combining question with answer content (10.4.10).
4. STAR-2 inference interpreting moi/toi in quoted speech (11.2.3).
5. STAR-2 inference interpreting ça coreferentially (11.3.6).
11.3 Indexical vs. Coreferential Uses of 3rd Pronouns
11.3.1 Ambiguous hear mode content
Mary knew that she was happy. (STAR-2)
11.3.2 Indexical use of she
Mary knew that she was happy. (she pointing at a referent outside the content, here the 3rd attribute of the STAR)
11.3.3 Coreferential use of she
Mary knew that she was happy. (she coreferential with Mary)
11.3.4 Indexical STAR-0 representation as a proplet set
[noun: Mary, cat: nm, sem: sg f, fnc: know, prn: 89]
[verb: know, cat: decl, sem: past, arg: Mary (happy 90), prn: 89]
[noun: ça, cat: s3, sem: sg f, fnc: happy, prn: 90]
[verb: happy, cat: v, sem: past, arg: ça, fnc: (know 89), prn: 90]
[S: Austin, T: 1974-09-12, A: Peter, 3rd: Suzy, prn: 89]
11.3.5 Coreferential STAR-0 representation as a proplet set
[noun: Mary, cat: nm, sem: sg f, fnc: know, prn: 93]
[verb: know, cat: decl, sem: past, arg: Mary (happy 94), prn: 93]
[noun: ça, cat: s3, sem: sg f, fnc: happy, prn: 94]
[verb: happy, cat: v, sem: past, arg: ça, fnc: (know 93), prn: 94]
[S: Austin, T: 1974-09-12, A: Peter, prn: 93]
11.3.6 Inference for the coreferential interpretation of ça
[noun: α, prn: K] ... [noun: ça, prn: K+M] ⇒ [noun: (α K), prn: K+M]
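Procedurally, the inference of 11.3.6 replaces the core value ça by an address (α K) pointing at an antecedent noun proplet. The sketch below always picks the most recent preceding noun and ignores the gender and number agreement coded in the sem attribute; deciding between the indexical and the coreferential reading is left to other inferences.

def resolve_ca(proplets):
    """Replace a coreferential ça by the address (alpha K) of the most
    recent preceding noun proplet (agreement checks omitted)."""
    resolved = []
    for p in proplets:
        if p.get("noun") == "ça":
            ante = next(q for q in reversed(resolved)
                        if "noun" in q and q["noun"] != "ça")
            p = dict(p, noun=f"({ante['noun']} {ante['prn']})")
        resolved.append(p)
    return resolved

content = [
    {"noun": "Mary", "fnc": "know", "prn": 93},
    {"verb": "know", "arg": ["Mary", "(happy 94)"], "prn": 93},
    {"noun": "ça", "fnc": "happy", "prn": 94},
    {"verb": "happy", "arg": ["ça"], "prn": 94},
]
for p in resolve_ca(content):
    print(p)   # the ça proplet's core value becomes (Mary 93)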
11.4 Langacker-Ross Constraint for Sentential Arguments
11.4.1 Pronoun in subject sentence constructions
1. LH': Coreferent noun in lower clause (L) precedes pronoun in matrix (H')
   That Mary was happy surprised her.
2. H'L: Pronoun in matrix (H') precedes non-coreferential noun in lower clause (L)
   % She was surprised that Mary was happy.
3. L'H: Pronoun in lower clause (L') precedes coreferent noun in matrix (H)
   That she was happy surprised Mary.
4. HL': Coreferent noun in matrix (H) precedes pronoun in lower clause (L')
   Mary was surprised that she was happy.
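Of the four configurations, only H'L blocks the coreferential reading (marked %), and the same pattern recurs with object sentences (11.4.4), adnominal modifiers (11.5.1), and adverbial subclauses (11.6.1). A minimal sketch of the constraint as a decision procedure (the encoding is an illustrative assumption):

def coreference_ok(pronoun_clause, order):
    """Langacker-Ross as used here: coreference is blocked only when the
    pronoun sits in the higher clause AND precedes the noun (H'L)."""
    return not (pronoun_clause == "higher" and order == "pronoun_first")

CASES = {
    "LH'": ("higher", "noun_first"),     # That Mary was happy surprised her.
    "H'L": ("higher", "pronoun_first"),  # % She was surprised that Mary was happy.
    "L'H": ("lower",  "pronoun_first"),  # That she was happy surprised Mary.
    "HL'": ("lower",  "noun_first"),     # Mary was surprised that she was happy.
}
for name, args in CASES.items():
    print(name, "coreferential reading:",
          "ok" if coreference_ok(*args) else "%")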
11.4.2 Subject sentence: Pronoun in higher clause (matrix)
1. LH': That Mary was happy surprised her.
2. H'L: % She was surprised that Mary was happy.
[Graph display: (i) semantic relations graph (SRG) with surprise as matrix verb, the subject clause happy with argument Mary, and the pronoun rendered as (Mary)/ça; (ii) signature; (iii) numbered arcs graphs (NAG) for variants a. and b.; (iv) surface realizations a. That Mary was_happy surprised her . and b. % She was_surprised that Mary was_happy .]
11.4.3 Subject sentence: Pronoun in lower clause
3. L'H: That she was happy surprised Mary.
4. HL': Mary was surprised that she was happy.
[Graph display: (i) SRG, (ii) signature, (iii) NAGs, and (iv) surface realizations a. That she was_happy surprised Mary . and b. Mary was_surprised that she was_happy ., with the pronoun rendered as (Mary)/ça.]
11.4.4 Pronoun in object sentence constructions
1. LH': Coreferent noun in lower clause (L) precedes pronoun in matrix (H')
   That Mary was happy was known to her.
2. H'L: Pronoun in matrix (H') precedes non-coreferential noun in lower clause (L)
   % She knew that Mary was happy.
3. L'H: Pronoun in lower clause (L') precedes coreferent noun in matrix (H)
   That she was happy was known to Mary.
4. HL': Coreferent noun in matrix (H) precedes pronoun in lower clause (L')
   Mary knew that she was happy.
11.4.5 Object sentence: Pronoun in higher clause (matrix)
1. LH': That Mary was happy was known to her.
2. H'L: % She knew that Mary was happy.
[Graph display: (i) SRG with know as matrix verb and happy as subject clause, (ii) signature, (iii) NAGs, and (iv) surface realizations a. That Mary was_happy was_known to_her . and b. % She knew that Mary was_happy ., with the pronoun rendered as (Mary)/ça.]
11.4.6 Object sentence: Pronoun in lower clause
3. L'H: That she was happy was known to Mary.
4. HL': Mary knew that she was happy.
[Graph display: (i) SRG, (ii) signature, (iii) NAGs, and (iv) surface realizations a. That she was_happy was_known to_Mary . and b. Mary knew that she was_happy ., with the pronoun rendered as (Mary)/ça.]
11.5 Coreference in Adnominal Sentential Modifiers
11.5.1 Pronoun in Adnominal Modifier Constructions
1. LH': Coreferent noun in lower clause (L) precedes pronoun in matrix (H')
   The man who loves the woman kissed her.
2. H'L: Pronoun in matrix (H') precedes non-coreferential noun in lower clause (L)
   % She was kissed by the man who loves the woman.
3. L'H: Pronoun in lower clause (L') precedes coreferent noun in matrix (H)
   The man who loves her kissed the woman.
4. HL': Coreferent noun in matrix (H) precedes pronoun in lower clause (L')
   The woman was kissed by the man who loves her.
11.5.2 Adnominal modifier sentence: Pronoun in higher clause
1. LH': The man who loves the woman kissed her.
2. H'L: % She was kissed by the man who loves the woman.
[Graph display: (i) SRG over kiss, man, love, woman with the pronoun rendered as (woman)/ça, (ii) signature, (iii) NAGs with arcs 1-8, and (iv) surface realizations a. The_man1 who_loves2 the_woman3 kissed4-5-6 her7 .8 and b. % She1 was_kissed2 by_the_man3 who_loves4 the_woman5 .6-7-8]
11.5.3 Adnominal modifier sentence: Pronoun in lower clause
3. L'H: The man who loves her kissed the woman.
4. HL': The woman was kissed by the man who loves her.
[Graph display: (i) SRG over kiss, man, love, woman with the pronoun rendered as (woman)/ça, (ii) signature, (iii) NAGs with arcs 1-8, and (iv) surface realizations a. The_man1 who_loves2 her3 kissed4-5-6 the_woman7 .8 and b. The_woman1 was_kissed2 by_the_man3 who_loves4 her5 .6-7-8]
11.5.4 The Donkey sentence
Every farmer who owns a donkey beats it.
11.5.5 Quantifier structure attributed to a Donkey sentence
∀x[[farmer(x) ∧ ∃y[donkey(y) ∧ own(x,y)]] → beat(x,y)]
11.5.6 DBS graph analysis of the Donkey sentence
[Graph display: (i) SRG over beat, farmer, own, donkey with the pronoun rendered as (donkey)/ça, (ii) signature, (iii) NAG with arcs 1-8, and (iv) surface realization Every_farmer1 who_owns2 a_donkey3 beats4-5-6 it7 .8]
11.5.7 Representing the Donkey content as a set of proplets
[noun: farmer, cat: snp, sem: pl exh, fnc: beat, mdr: (own 17), prn: 16]
[verb: own, cat: v, sem: pres, arg: # donkey, mdd: (beat 16), prn: 17]
[noun: donkey, cat: snp, sem: indef sg, fnc: own, prn: 17]
[verb: beat, cat: decl, sem: pres, arg: farmer (donkey 17), prn: 16]
[noun: (donkey 17), cat: snp, sem: indef sg, fnc: beat, prn: 16]
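Where the quantifier structure of 11.5.5 leaves the variable y in beat(x,y) outside the scope of ∃y, the proplet set of 11.5.7 uses coreference-by-address instead: the object slot of beat holds the address (donkey 17), which retrieves the donkey proplet directly. A sketch of such an address lookup, with addresses encoded as Python tuples (an assumption; the book writes them as (donkey 17)):

def deref(address, word_bank):
    """Return the proplet addressed by (core_value, prn)."""
    core, prn = address
    return next(p for p in word_bank
                if core in (p.get("noun"), p.get("verb"))
                and p["prn"] == prn)

word_bank = [
    {"noun": "farmer", "fnc": "beat", "mdr": ("own", 17), "prn": 16},
    {"verb": "own", "arg": ["#", "donkey"], "mdd": ("beat", 16), "prn": 17},
    {"noun": "donkey", "sem": "indef sg", "fnc": "own", "prn": 17},
    {"verb": "beat", "arg": ["farmer", ("donkey", 17)], "prn": 16},
]
beat = word_bank[3]
print(deref(beat["arg"][1], word_bank))   # retrieves the donkey proplet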
11.5.8 The Bach-Peters sentence
The man who deserves it will get the prize he wants.
11.5.9 DBS graph analysis of the Bach-Peters sentence
(iii) numbered arcs graph (NAG)
man
deserve
get
prize
want
1
2
34
5
6 7
8
910
11
12
(man)/ça(prize)/ça
V V
V
N N
(N) (N)
signature(ii)
(iv) surface realization
who_deserves2
will_get the_prize7
he8−9
wants .it3 4−5−6
The_man
1 10 11−12
man
deserve
get
prize
want
(i)
(man)/ça(prize)/ça
semantic relations graph (SRG)
11.5.10 Proplet representation of the Bach-Peters sentence
[noun: man, cat: snp, sem: def sg, fnc: get, mdr: (deserve 57), prn: 56]
[verb: deserve, cat: v, sem: ind pres, arg: # (prize 56), mdd: (man 56), prn: 57]
[noun: (prize 56), cat: snp, sem: def sg, fnc: deserve, prn: 57]
[verb: get, cat: decl, sem: pres, arg: man prize, prn: 56]
[noun: prize, cat: snp, sem: def sg, fnc: get, mdr: (want 58), prn: 56]
[noun: (man 56), cat: snp, sem: def sg, fnc: want, prn: 58]
[verb: want, cat: v, sem: pres, arg: (man 56) #, mdd: (prize 56), prn: 58]
11.6 Coreference in Adverbial Sentential Modifiers
11.6.1 Langacker-Ross constraint in adverbial subclauses
1. LH': Coreferent noun in lower clause (L) precedes pronoun in matrix (H')
   When Mary returned she kissed John.
2. H'L: Pronoun in matrix (H') precedes non-coreferential noun in lower clause (L)
   % She kissed John when Mary returned.
3. L'H: Pronoun in lower clause (L') precedes coreferent noun in matrix (H)
   When she returned Mary kissed John.
4. HL': Coreferent noun in matrix (H) precedes pronoun in lower clause (L')
   Mary kissed John when she returned.
11.6.2 Adverbial modifier sentence: Pronoun in higher clause
1. LH': When Mary returned she kissed John.
2. H'L: % She kissed John when Mary returned.
[Graph display: (i) SRG with kiss as matrix verb, return as adverbial clause, and the pronoun rendered as (Mary)/ça, (ii) signature, (iii) NAGs with arcs 1-8, and (iv) surface realizations a. When Mary returned she kissed John . and b. % She kissed John when Mary returned .]
11.6.3 Representing variant 1 as a set of proplets
[noun: Mary, cat: nm, sem: sg f, fnc: return, prn: 39]
[verb: return, cat: v, sem: past, arg: Mary, mdd: (kiss 40), prn: 39]
[noun: (Mary 39)/ça, cat: nm, sem: sg f, fnc: kiss, prn: 40]
[verb: kiss, cat: decl, sem: past, arg: (Mary 39)/ça John, mdr: (return 39), prn: 40]
[noun: John, cat: nm, sem: sg m, fnc: kiss, prn: 40]
11.6.4 Adverbial modifier sentence: Pronoun in lower clause
3. L'H: When she returned Mary kissed John.
4. HL': Mary kissed John when she returned.
[Graph display: (i) SRG with kiss as matrix verb, return as adverbial clause, and the pronoun rendered as (Mary)/ça, (ii) signature, (iii) NAGs with arcs 1-8, and (iv) surface realizations a. When she returned Mary kissed John . and b. Mary kissed John when she returned .]
11.6.5 Proplet NAGs of an H’L and an L’H construction
[Proplet NAG display: the H'L construction % She knew that Mary was_happy . and the L'H construction That she was_happy was_known to_Mary ., each shown as numbered arcs connecting the proplets of know (prn: 17) and happy (prn: 18), with the pronoun given as (Mary 17)/ça or (Mary 18)/ça.]
Part III. Final Chapter
12. Conclusion
12.1 Level of Abstraction
12.1.1 The Equation Principle of Database Semantics
1. The more realistic the reconstruction of natural cognition, the better the functioning of the artificial model.
2. The better the functioning of the artificial model, the more realistic the reconstruction of natural cognition.
12.2 Evolution
12.2.1 Consecutive vs. Concurrent Hypothesis
[Diagram along a time line:
a. Consecutive hypothesis: language ability evolves from non-language abilities (non-language abilities, then a transition to language, then language ability).
b. Concurrent hypothesis: language ability evolves from communication ability (communication ability develops into language ability, in parallel with non-communication abilities).]
12.3 Semantics
12.3.1 Predicate calculus analysis of Every man loves a woman.
reading 1: ∀x[man′(x) → ∃y[woman′(y) ∧ love′(x,y)]]
reading 2: ∃y[woman′(y) ∧ ∀x[man′(x) → love′(x,y)]]
12.3.2 DBS analysis of Every man loves a woman.
[noun: man, cat: snp, sem: exh pl, fnc: love, prn: 23]
[verb: love, cat: decl, sem: pres, arg: man woman, prn: 23]
[noun: woman, cat: snp, sem: indef sg, fnc: love, prn: 23]
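The two readings of 12.3.1 are truth-conditionally distinct, which a toy model makes explicit; the DBS analysis of 12.3.2, in contrast, codes the sentence as a single proplet set in which no scope ambiguity arises. The model below is an illustrative assumption:

# Toy model: each man loves some woman, but no single woman is loved
# by every man.  Reading 1 is true, reading 2 false: the scopes differ.
men, women = {"m1", "m2"}, {"w1", "w2"}
love = {("m1", "w1"), ("m2", "w2")}

reading1 = all(any((x, y) in love for y in women) for x in men)
reading2 = any(all((x, y) in love for x in men) for y in women)
print(reading1, reading2)   # True False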