Lecture 3 page 1 of 13

Phrase structure grammars

also called:

Immediate constituent grammars (by traditional linguists)
Context free grammars (by mathematical linguists)
Backus normal form (by computer-language designers)
Recursive patterns (in some computer applications)

S -> cAc

related variants:

Dependency grammar
Categorial grammar
Recursive transition nets

Phrase structure grammars and their variants form the basis for most of the work in syntax, both in linguistics and in computer science. They assign tree-structures to sentences in a way which gives a clear idea of the organization of the phrases, and they allow a very simple correspondence (as we will see in the following sections) between the knowledge structures, the assigned structures, and the processes of analysis and generation. It is this simplicity which makes them so appealing, and has caused them to be used as a basis for structuring more powerful grammar formalisms.

These lectures will develop the ideas in several steps. In Lecture 3, the basic ideas of phrase structure will be discussed in the context of intuitive formalisms like dependency and categorial grammar, and the formal mechanisms of context-free generative grammars will be described. In Lectures 4 and 5 the problems of designing processes which can use these grammars in analyzing sentences will be discussed and a number of approaches given in detail. Finally, Lecture 6 will discuss the limitations of the context-free approach and the developments of transformational grammar which extend the generative approach.

Dependency Grammars

The patterns which can be described by simple transition nets as described in Lecture 2 have a kind of "flat" quality -- the structure assigned to a string is a simple sequence of nodes and arcs. One of the easily observable facts about natural language is that it has a kind of constituent structure which is of much more use in describing the patterns and analyzing their meaning. If we have a sentence like "Good boys eat cherry pie." we can view it as expressing a basic relationship "boys eat" with further modification describing what kind of boys, and just what they eat. We can say that the modifier "good" depends on "boys", while the phrase "cherry pie" depends on "eat".

Dependency grammar was an attempt to make these sorts of relations primary in the structure assigned to sentences. Adjectives were said to depend on the nouns they modified, nouns on verbs as subjects and objects, and on prepositions as objects; adverbs and auxiliaries depended on main verbs, and prepositions on the words modified by prepositional phrases. Each piece of a sentence could be thought of as a head which dominates one or more dependent words. The entire head-dependent construct functions in the same way in larger constructs as the head would by itself.

In dependency form, the structure of the sentence "The carton in the freezer contains leftover biryani." would be:

Fall 1974-5   T. Winograd   CS/Linguistics 265-66


                contains
               /        \
          carton         biryani
         /      \           |
      the        in      leftover
                  |
               freezer
                  |
                 the

Many languages have rules of agreement and concord which follow these dependency links. For example in Spanish both the adjective and determiner depend for their form on the number and gender of the noun they modify, as in "la vaca morada," "las vacas moradas," "el caballo morado," "los caballos morados." (the purple cow, the purple cows, the purple horse, the purple horses). This seemed to fit naturally with the semantic notions of dependency -- for example an adjective as a further description depending on the object referred to by the noun.

A word can have more than one dependent. In our example, the verb "contains" has both its subject "carton" and its object "biryani" as dependents, while "carton" has both the determiner "the" and the preposition "in".

Dependency grammars were used extensively in early translation and question-answering programs, since the dependency structure gave a good deal of relevant information for deciding on the form of the translated sentence and for sorting out the information content of sentences. For example one simple question-answerer (Protosynthex I, by Klein and Simmons) stored the contents of a children's encyclopedia in the form of dependency structures. Given the two sentences "Worms eat grass." "Horses with worms eat grass." it stored the two structures:

      eat                   eat
     /   \                 /   \
 worms    grass      horses     grass
                        |
                       with
                        |
                      worms

If asked the question "What do worms eat?" it would look for all the sentences containing the two words "worms" and "eat". It then checked to see if in any of them the word "worms" depended directly on the word "eat", since this matched the dependency structure of the question, and therefore would be more likely to provide a relevant answer. In this case, it would choose the first as opposed to the second sentence, as desired. Of course this was an extremely simplistic form of question answering, since it could only work if the relevant answer appeared directly in the stored knowledge, and still could only guess roughly whether the selected sentence was appropriate. But the use of dependency structures enabled the system to be more selective than a simple key-word search.
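The matching idea translates directly into a small program. The sketch below is my own illustration in Python, not Protosynthex code: a stored sentence is a nested (word, dependents) pair, and the question test is simply whether the queried word hangs directly under the queried head.

```python
# Toy sketch of the dependency-matching idea described above (my own
# illustration).  A dependency tree is a (word, list-of-subtrees) pair.

def depends_directly(tree, head, dependent):
    """True if `dependent` hangs directly under `head` somewhere in tree."""
    word, deps = tree
    if word == head and any(d[0] == dependent for d in deps):
        return True
    return any(depends_directly(d, head, dependent) for d in deps)

# "Worms eat grass."            "Horses with worms eat grass."
s1 = ("eat", [("worms", []), ("grass", [])])
s2 = ("eat", [("horses", [("with", [("worms", [])])]), ("grass", [])])

# The question "What do worms eat?" asks for "worms" directly under "eat",
# so the first stored sentence is preferred over the second:
print(depends_directly(s1, "eat", "worms"))   # -> True
print(depends_directly(s2, "eat", "worms"))   # -> False
```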

The formal theory of dependency grammar primarily emphasized dependency as a way of describing the structure of sentences -- in our terms, the assigned structures. It left a great deal of latitude and ad-hocness in the way the system's permanent knowledge was structured and the way


the process was organized to analyze a sentence. A number of parsing strategies were used, some more systematic than others, but their structure was not particular to the ideas of dependency. Recently, Roger Schank has developed a theory of "conceptual dependency" which tries to capture the same sorts of intuitive insights at the level of meaning and concepts. It also deals almost entirely with issues of what the assigned structures should look like, avoiding many of the problems involved in structuring and using language knowledge, leaving this to specialized programs whose form is not determined by the basic theory.

Categorial Grammars

The basic insights of dependency grammar were expressed in a more formal way in categorial grammars. The way in which an element combines with others is represented algebraically in a symbolic form which can be easily manipulated for analyzing sentences.

In dependency theory, we saw that an adjective can modify a noun, with the resulting phrase still taking the same place as an unmodified noun. If we designate nouns by N, we could describe an adjective as N/N and use a kind of arithmetic cancellation. Just as 2/2 x 2 gives 2, we can think of the sequence ADJECTIVE NOUN as the combination N/N . N, giving us N. Going one step further, an adverb like "very" could be expressed as the fraction [N/N]/[N/N]. It could cancel with an adjective (N/N) with the result still being an adjective. A phrase like "very fine stuff" could be viewed as the sequence [N/N]/[N/N] N/N N. The first two could be cancelled, leaving the sequence ADJECTIVE NOUN, followed by a second cancellation leaving NOUN. This corresponds to the simple dependency structure:

stuff
  |
fine
  |
very

One problem with a simplistic version of dependency theory is that often a phrase functions very differently from what its head would do alone. For example a verb with a noun depending on it is already part of a sentence, and can't be treated as if it were just an elaborate verb. In linguistics, such constructions are called exocentric, as opposed to endocentric constructions which can function as a substitute for their head. The example ADJECTIVE NOUN is endocentric, since it can be substituted for NOUN in order to further elaborate a sentence. On the other hand NOUN VERB constructions are exocentric, since the resulting phrase is neither a noun nor a verb, but a sentence (or a part of a sentence with two nouns). Categorial grammars could equally well express exocentric constructions, and most of them took NOUN and SENTENCE as the two basic classes, deriving others from their dependencies on these. An intransitive verb like "sighed" can be thought of as "something which depends on a noun to produce a sentence".

The simple fraction notation does not account for the difference in ordering between a word which modifies to the right and one to the left. It can be augmented slightly by using a left-leaning slash "\" to mean "cancels to the left" and a right-leaning one "/" for "cancels to the right". If we start with the basic categories S (sentence) and N (noun), we can define an intransitive verb as S\N -- something which produces an S if there is an N to its left, and a transitive verb like "prodded" as [S\N]/N -- something which cancels against a noun on its right to produce S\N, which in turn cancels with a noun on its left to produce a sentence. By assigning an appropriate formula to each word of the language, a categorial grammar can describe the set of possible structures in which they can take part.

As an example we might have the grammar:


shine = S\N           (produces an S if there is an N to the left)
glorious = N/N        (produces an N if there is an N to the right)
the = N/N
sun = N
will = [S\N]/[S\N]
January = N
in = [[S\N]\[S\N]]/N

It would assign a sentence structure to "The glorious sun will shine in January." like:

The    glorious   sun   will           shine   in                 January
N/N    N/N        N     [S\N]/[S\N]    S\N     [[S\N]\[S\N]]/N    N

    glorious sun:                 N/N . N                ->  N
    the [glorious sun]:           N/N . N                ->  N
    in January:                   [[S\N]\[S\N]]/N . N    ->  [S\N]\[S\N]
    will shine:                   [S\N]/[S\N] . S\N      ->  S\N
    [will shine] [in January]:    S\N . [S\N]\[S\N]      ->  S\N
    the glorious sun  will shine in January:   N . S\N   ->  S

This diagram indicates the effect of cancelling the category symbols by connecting the two symbols which take part in the cancellation to their result. If any string of words is a sentence, it is possible to produce such a derivation in which everything eventually cancels down to a single S. The assigned structures are very much like dependency structures. However in the case of categorial grammar, there is a much clearer picture of what the knowledge structure looks like which enables the user (or program) to analyze and produce sentences. The knowledge of what structures are allowable is all represented in the complex category symbols assigned to the individual words (hence the name "categorial").

The cancellation rules can be more formally stated as:

Formalism: Categorial Grammar

Knowledge Structures

Each word is assigned a categorial formula, using the basic categories S and N, and nested occurrences of the connectives "\" and "/", grouped with parentheses.

Process: Demonstration that a sentence is grammatical

Processing rules

Begin with a list containing the category symbols for each word in the sentence, where each of these is either the symbol N or a compound symbol A/B or A\B where A and B are any legal category symbols, simple or compound.

If the symbol A/B appears to the left of B, the two can be cancelled to leave the single symbol A.

If the symbol B appears to the left of A\B, the two can be cancelled to leave the single symbol A.


The result of the cancellation takes the place of the original pair of symbols in the sequence.

If some sequence of legal cancellations reduces the entire string to the single symbol S, then it is a sentence of the language.

Assigned Structures

A dependency tree is implicit in the cancellations.
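These cancellation rules can be mechanized directly. The following is my own Python sketch, not code from the course: compound categories A/B and A\B are tuples, and a backtracking search tries every possible cancellation. In the lexicon below I treat "in January" as a left-cancelling modifier of S\N, i.e. in = [[S\N]\[S\N]]/N, so the example sentence reduces to S.

```python
# Sketch of the categorial cancellation process (my own illustration).
# Simple categories are the strings "S" and "N"; a compound category A/B
# is the tuple (A, "/", B) and A\B is the tuple (A, "\\", B).

def cancel(left, right):
    """Return the result of cancelling an adjacent pair, or None."""
    if isinstance(left, tuple) and left[1] == "/" and left[2] == right:
        return left[0]            # A/B  B   ->  A
    if isinstance(right, tuple) and right[1] == "\\" and right[2] == left:
        return right[0]           # B  A\B   ->  A
    return None

def is_sentence(cats):
    """True if some sequence of cancellations reduces cats to the single S."""
    if cats == ["S"]:
        return True
    for i in range(len(cats) - 1):          # try every adjacent pair,
        result = cancel(cats[i], cats[i + 1])
        if result is not None:              # backtracking over choices
            if is_sentence(cats[:i] + [result] + cats[i + 2:]):
                return True
    return False

SN = ("S", "\\", "N")                       # category of an intransitive verb
lexicon = {
    "the": ("N", "/", "N"),
    "glorious": ("N", "/", "N"),
    "sun": "N",
    "will": (SN, "/", SN),
    "shine": SN,
    "in": ((SN, "\\", SN), "/", "N"),
    "January": "N",
}

words = "the glorious sun will shine in January".split()
print(is_sentence([lexicon[w] for w in words]))   # -> True
```

This is essentially the program asked for in exercise 11, in Python rather than LISP.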

Actually, we have been far too restrictive in assigning only a single category symbol to each word -- in fact many words appear in a number of different categories, and a derivation like the one above represents a selection of those which are relevant to the structure of this particular sentence. In a sentence like "Your shoes will lose their shine if you wear them while you sun on the beach." the words "sun" and "shine" have category assignments opposite to those in our example. We will discuss this problem in connection with problems of ambiguity later.

Context-free grammars

The structure resulting from a categorial analysis of a sentence can be described as its immediate constituent structure or phrase structure. Any sentence or phrase is made up of a sequence of non-overlapping constituents, each of which is either a single word or a phrase which in turn has its own constituent structure. There are a number of other ways such a structure could be represented -- one of the simplest is a bracketing like:

[ [the [glorious sun]] [[will shine][in January]].]

In the categorial formalism the set of possible constituents was implicit in the complex category symbols attached to the words. In fact in natural languages there is a fairly small set of possible constituent patterns, and the grammar becomes much more useful if we state it as a set of such patterns. Each pattern is given a name, and expressed as a sequence of possible constituents, each of which is either a word (specified by its word class) or the name of a pattern. The resulting set of patterns (or rules) is called a phrase structure grammar.

The most common notation is one in which the pattern name is followed by an arrow, followed by the list of constituents which make it up. As a simple grammar we might have a set of patterns for sentence (S), noun phrase (NP), and verb phrase (VP):

S -> NP VP
NP -> ADJECTIVE NP
NP -> NOUN
VP -> VERB NP
VP -> VERB


Given the sentence "Short stout giraffes admire peonies." we can describe the constituent structure with the tree diagram:

                      S
           ___________|___________
          NP                      VP
     _____|_____               ___|____
    |     |     |             |        |
ADJECTIVE ADJECTIVE NOUN    VERB       NP
    |     |       |           |        |
    |     |       |           |      NOUN
    |     |       |           |        |
  Short stout giraffes     admire   peonies

By having more than one pattern assigned to a given constituent type we can account for such things as optional and repeated elements. The rule NP -> ADJECTIVE NP can be applied any number of times to produce a repeated string of adjectives. A number of different notations have been developed to make it easy to combine similar patterns. For example some versions allow the use of parentheses to indicate optional elements, as VP -> VERB (NP), while others use a bracket notation to indicate choices. For the moment we will stick to the simplest notation, avoiding these improvements. In the next section we will describe how transition nets can be used to do this in a more general way.

The rules of a context-free grammar can be thought of as a set of rewrite rules which operate in a systematic way to generate sentences of the language as follows:

Formalism: Context-free Rewrite Rules

Knowledge Structures

The language is described using a set of terminal symbols (individual words and word classes) and a set of non-terminal symbols (names of phrase patterns).

The grammar consists of a set of rewrite rules (also called productions), each of which has a single non-terminal symbol on the left and a sequence of symbols (terminal or non-terminal) on the right.

One non-terminal symbol (called the distinguished symbol) represents the pattern for entire sentences. Conventionally, S is used.

Process: Abstract generation

The generation is done by a process called derivation which consists of rewriting a string of symbols one step at a time according to the grammar rules.

Working Structures

In generating a sentence, the system builds a sequence of symbol strings, also called a derivation. Each string can contain both terminal and non-terminal symbols.


Processing rules

Begin with a string containing a single instance of the distinguished symbol (S).

At each step of the derivation add a new string based on the previous one as follows:

Choose a non-terminal symbol in the string.

Choose a rule which has it on the left side.

Write a new string in which the chosen symbol is replaced by the sequence on the right side of that rule.

When the most recent string of the derivation contains only terminal symbols, that string is a sentence of the language, and the entire sequence can be called a derivation of that sentence.

Assigned Structures

The rules as stated above do not directly give an assigned structure to the sentence generated. However a slight addition can provide a phrase-structure tree:

Processing rules

Begin with a single node corresponding to the distinguished symbol.

At each step of the derivation create a set of branches attached to the node corresponding to the symbol which is replaced, with one new branch leading to a node for each symbol in the expansion (the right side of the rule). The branches are in the same order as the symbols in the pattern.

****************************

Sample derivation

Non-terminal symbols

S, NP, NP2, VP, VERB, NOUN, ADJECTIVE, DETERMINER

Terminal symbols

The, a, decorated, flat, pieplate, elephant, contains, surprise

Grammar

S -> NP VP
NP -> DETERMINER NP2
NP2 -> ADJECTIVE NP2
NP2 -> NOUN
VP -> VERB
VP -> VERB NP
DETERMINER -> a
DETERMINER -> the
NOUN -> pieplate

NOUN -> surprise
VERB -> contains
VERB -> surprise

(more of the same for all the word classes, one for each word)

Derivation

S
NP VP
DETERMINER NP2 VP
DETERMINER NP2 VERB NP
DETERMINER ADJECTIVE NP2 VERB NP
the ADJECTIVE NP2 VERB NP
the ADJECTIVE NOUN VERB NP
the ADJECTIVE NOUN contains NP
the ADJECTIVE NOUN contains DETERMINER NP2
the ADJECTIVE pieplate contains DETERMINER NP2
the ADJECTIVE pieplate contains DETERMINER NOUN
the ADJECTIVE pieplate contains a NOUN
the decorated pieplate contains a NOUN
the decorated pieplate contains a surprise.
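The derivation process is easy to mechanize. Here is a small Python sketch of it (my own, not from the lecture), using the sample grammar; for simplicity it always rewrites the leftmost non-terminal, and the word-class rules for ADJECTIVE and the rest are filled in from the terminal list, so the exact sentence produced varies with the random choices.

```python
# Sketch of context-free derivation as rewrite rules (my own illustration).
# Each non-terminal maps to the list of right-hand sides it can rewrite to.
import random

GRAMMAR = {
    "S":   [["NP", "VP"]],
    "NP":  [["DETERMINER", "NP2"]],
    "NP2": [["ADJECTIVE", "NP2"], ["NOUN"]],
    "VP":  [["VERB"], ["VERB", "NP"]],
    "DETERMINER": [["a"], ["the"]],
    "ADJECTIVE":  [["decorated"], ["flat"]],
    "NOUN":       [["pieplate"], ["surprise"]],
    "VERB":       [["contains"]],
}

def derive(symbols=("S",)):
    """Yield each string of a derivation, ending with an all-terminal one."""
    string = list(symbols)
    yield list(string)
    while any(s in GRAMMAR for s in string):
        # choose the leftmost non-terminal and a random rule for it
        i = next(i for i, s in enumerate(string) if s in GRAMMAR)
        string[i:i + 1] = random.choice(GRAMMAR[string[i]])
        yield list(string)

for step in derive():
    print(" ".join(step))
```

The lecture's processing rules allow any non-terminal to be chosen at each step; the leftmost-first policy here is one of the orderings mentioned under "Partial specification of the order of the steps in the derivation".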

Assigned structure -- phrase structure tree

                      S
            __________|__________
           NP                    VP
       ____|____             ____|____
      |         |           |         |
 DETERMINER    NP2        VERB        NP
           ____|____        |      ___|_____
          |         |       |     |         |
      ADJECTIVE    NP2      | DETERMINER   NP2
          |         |       |     |         |
          |       NOUN      |     |       NOUN
          |         |       |     |         |
    the decorated pieplate contains a   surprise

Summary of basic ideas

Give names to sub-patterns (non-terminals)

Describe structures by trees

Represent knowledge as rewrite rules

Well defined processes for abstract derivation, corresponding directly to the rewrite rules, avoiding issues of choice.

Issues for thought

Analogy with mathematical proofs (rule ≈ axiom, derivation ≈ proof)


Partial specification of the order of the steps in the derivation

Multiple rules for single non-terminal

Multiple ways of getting a single terminal


Recursive Transition Networks

In Lecture 2 we used transition nets to provide a wider range of patterns than simple linear sequences of words and classes. We can do exactly the same thing for our phrase structure rules. A non-terminal symbol (one of the pattern names which appears on the left of the arrow) is related in an ordinary context-free grammar to a simple sequence on the right of the arrow. Instead, we can associate it with a transition net, each of whose arcs is labelled with a word, a word class, or a pattern name. We could define the simple sentence grammar of Lecture 2 with two nets, one for sentence and one for noun phrase:

S:    ->o ---NP---> o ---VERB---> o ---NP---> (o)

NP:   ->o ---DETERMINER---> o ---NOUN---> (o)
                            |
                        ADJECTIVE (loop back to the same state)

We may want such groupings to be recursive, in the sense that a particular constituent can have as one of its internal constituents something of the same type. For example the noun phrase the painted orange bottle in the top drawer has as one of its constituents the noun phrase the top drawer. We can capture this by extending our NP network to:

NP:   ->o ---DETERMINER---> o ---NOUN---> (o) ---PREPOSITION---> o
                            |              ^                     |
                        ADJECTIVE (loop)   |_________NP__________|

The same sort of analysis immediately shows that the combination PREPOSITION NP occurs in many places, and deserves to be labelled as a grouping in its own right, which we will call PREPP (prepositional phrase).

Transition nets extended in this way to include entire groupings as transitions are called recursive transition nets. Although they are based on ordinary transition nets, their formal properties


are quite different. First, there is no longer a straightforward connection between the network description and the process of analyzing an input. In the simpler networks, it is always possible to decide whether a given transition is possible by comparing the symbol of the arc to the next word in the input string. In recursive transition networks, the correspondence is more complex. In order to see whether an arc can be taken, it is necessary to call the entire recognition process recursively, with the current string as its input, and with its network specified by the name on the arc. If this second process succeeds at recognizing the desired constituent, as indicated by going into an accepting state, then processing returns to the original network, going to the state at the end of the transition, and continuing with the input string that is left after removing the elements "used up" by it.
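This recursive recognition process can be sketched concretely. The Python fragment below is my own toy rendering (the nets and lexicon are made up for illustration, not taken from the course): each arc either consumes a word of the named class, or recursively invokes another net and continues from every position at which that net could finish.

```python
# Sketch of recognition with recursive transition nets (my own illustration).
# Each net maps a state to its outgoing (label, next_state) arcs; a label
# is either a word class or the name of another net.

NETS = {
    "S":  {0: [("NP", 1)], 1: [("VERB", 2)], 2: [("NP", 3)], 3: []},
    "NP": {0: [("DETERMINER", 1)], 1: [("ADJECTIVE", 1), ("NOUN", 2)], 2: []},
}
ACCEPTING = {"S": {3}, "NP": {2}}
WORD_CLASS = {"the": "DETERMINER", "a": "DETERMINER",
              "glorious": "ADJECTIVE", "sun": "NOUN", "pie": "NOUN",
              "admires": "VERB"}

def recognize(net, words, pos=0, state=0):
    """Return the set of input positions at which `net` can finish."""
    results = set()
    if state in ACCEPTING[net]:
        results.add(pos)
    for label, nxt in NETS[net][state]:
        if label in NETS:
            # the arc names another net: call the whole process recursively,
            # then continue from wherever that constituent could end
            for end in recognize(label, words, pos):
                results |= recognize(net, words, end, nxt)
        elif pos < len(words) and WORD_CLASS.get(words[pos]) == label:
            results |= recognize(net, words, pos + 1, nxt)
    return results

words = "the glorious sun admires the pie".split()
print(len(words) in recognize("S", words))    # -> True
```

The sentence is accepted if the S net can finish exactly at the end of the input, having "used up" all the words. (This simple search does not handle left-recursive nets, a point taken up with the parsing strategies of the next lecture.)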

We will discuss this processing further in the next lecture.


Exercises for Lecture 3

1. [F1s] What traditional linguistic term can be applied to constructions in a categorial grammar which involve fractions with matching numerator and denominator like N/N or S\S?

****************************

2. [F1s] Consider the nonsense language defined by the categorial formulas

glub = N/N
posh = N/N
crym = N
yut = N/N
verk = S\N
tarl = N
blit = (S\N)/N

Which of the following are sentences?

a) glub posh yut verk
b) glub yut posh tarl verk
c) posh crym blit
d) blit crym posh

****************************

3. [F2o] Consider a simple dependency grammar in which a verb dominates any adjacent noun (i.e. the verb is the head of a construction containing the two of them), a noun dominates an adjective or determiner to its left, and a verb dominates an adverb on its left.

Take a dictionary containing VERBS: clawed, ran; NOUNS: kitten, couch; ADJECTIVES: little, black; DETERMINERS: a, the; ADVERB: furiously.

Write a categorial formula for each of these words, corresponding to the dependency rules given above, using the symbols S, N, \, and /, as explained in the text.

Show the set of cancellations needed for a categorial analysis of the sentences:

The kitten ran.
A little black kitten furiously clawed the couch.

****************************

4. [F2s] Give a categorial grammar for the following sentences (i.e. a set of category assignments to the individual words which will allow all of these sentences to be analyzed properly).

Fear prevents change.
Customs usually change slowly.
Customs develop in strange ways.
The unknown causes fear.
A difference in customs causes fear.
People sometimes change their ways.

Give 3 examples of sentences which your grammar also allows, but which do not sound like reasonable English.

****************************


5. [F2o] Write a context-free grammar for the set of examples in the previous exercise, using as few non-terminal symbols as possible.

****************************

6. [F2o] In a context-free grammar we can avoid the need for dealing with word-classes by simply using a non-terminal symbol as the class-name and having rules like:

NOUN -> horsecollar
NOUN -> newspaper
NOUN -> wicket
VERB -> fidget
VERB -> expectorate
VERB -> impeach

How could we avoid the use of word classes in writing finite-state transition net grammars like those defined in Lecture 2? Is it possible to produce a net all of whose arcs are actual words which defines the same set of patterns as one which allows word-class arcs? What problems are there in doing so?

****************************

7. [F2o] Write a CFG which will derive all of the sentences in the left hand column, but none in the right hand column. Try to express in general terms any difficulty which arises in doing this.

She walks.            She walk.
She does walk.        She does walks.
Does she walk?        Do she walk?
They walk.            They walks.
They do walk.         They does walks.
Do they walk?         Do they walks?
He rides.
Does he walk?
Do we walk?
We do ride.
... etc.              ... etc.

****************************

8. [F2o] Write a recursive transition net grammar (one or more transition nets) corresponding to the following CFG:

S -> NP VP
VP -> VERB
VP -> VERB NP
VP -> VERB ADVERB
VP -> VERB ADVERB NP
NP -> ADJECTIVE NOUN
NP -> DETERMINER NOUN
NP -> DETERMINER NOUN PG
PG -> PREPOSITION NP

****************************

9. [F1s] Consider the problem of converting back and forth from CFG to RTN, trying to use the minimum number of nets and rules.

Show that the RTN formulation never needs more nets than the number of CFG rules.

Give an example which shows that there must sometimes be more rules than nets. Prove that the extra rules are necessary.


Does an RTN grammar ever need more than one network for a single non-terminal? If so, give an example. If not, show how to avoid it in general.

****************************

10. [F3s] Describe an algorithm to determine the minimum number of context-free rules needed to write a grammar corresponding to a given RTN grammar.

****************************

11. [P2s] Write a LISP program which takes as its input a string of words and returns as a value either T or NIL, depending on whether it has an appropriate structure based on the set of categorial symbols associated with each word. Assume that every word is represented as a LISP atom and that it has a property CATEGORY on its property list. This property is either the atom N (for noun), the atom S (for sentence) or a list of the form (LEFT A B) or (RIGHT A B), corresponding to A/B and A\B respectively, where A and B are in turn simple or compound category symbols. For example, in the categorial grammar given as an example, the atom GLORIOUS would have the property (LEFT N N), while SHINE would be (RIGHT S N), and VERY would be (LEFT (LEFT N N) (LEFT N N)). As a mnemonic for remembering the direction, think of each parenthesized phrase as meaning "I'll end up hanging on the branch of a ___ along with a ___ on the other side."
