Systemic Networks, Relational Networks, and Neural Networks
Sydney Lamb, [email protected]
Part II: Guangzhou, 2010 November 3
Sun Yat-sen University

TRANSCRIPT

Page 1

Systemic Networks, Relational Networks, and Neural Networks

Sydney Lamb, [email protected]

Part II: Guangzhou, 2010 November 3
Sun Yat-sen University

Page 2

Topics in this presentation

• Aims of SFL and NCL
• From systemic networks to relational networks
• Relational networks as purely relational
• Levels of precision in description
• Narrow relational network notation
• Narrow relational networks and neural networks
• Enhanced understanding of systemic-functional choice
• Enhanced appreciation of variability in language

Page 3

Topics

• Aims of SFL and NCL
• From systemic networks to relational networks
• Relational networks as purely relational
• Levels of precision in description
• Narrow relational network notation
• Narrow relational networks and neural networks
• Enhanced understanding of systemic-functional choice
• Enhanced appreciation of variability in language

Page 4

Aims of SFL

SFG aims (primarily) to describe the network of choices available in a language
• For expressing meanings

“SFL differs from Firth, and also from Lamb, in that priority is given to the system”
(Halliday, 2009: 64)

“The organizing concept of a systemic grammar is that of choice (that is, options in ‘meaning potential’…)”
(Halliday, 1994/2003: 434)

Page 5

Aims of Neurocognitive linguistics (“NCL”)

NCL aims to describe the linguistic system of a language user
• As a dynamic system
• It operates: speaking, comprehending, learning, etc.
• It changes as it operates

Evidence that can be used:
• Texts
• Findings of SFL
• Slips of “tongue” and mind
• Unintentional puns
• Etc.

Page 6

NCL seeks to learn ..

• How information is represented in the linguistic system
• How the system operates in speaking and understanding
• How the linguistic system is connected to other knowledge
• How the system is learned
• How the system is implemented in the brain

Page 7

The linguistic system of a language user: Two viewing platforms

Cognitive level: the cognitive system of the language user, without considering its physical basis
• The cognitive (linguistic) system
• Field of study: “cognitive linguistics”

Neurocognitive level: the physical basis
• Neurological structures
• Field of study: “neurocognitive linguistics”

Page 8

Topics

• Aims of SFL and NCL
• From systemic networks to relational networks
• Relational networks as purely relational
• Levels of precision in description
• Narrow relational network notation
• Narrow relational networks and neural networks
• Enhanced understanding of systemic-functional choice
• Enhanced appreciation of variability in language

Page 9

“Cognitive Linguistics”

First occurrence in print:

• “[The] branch of linguistic inquiry which aims at characterizing the speaker’s internal information system that makes it possible for him to speak his language and to understand sentences received from others.”

(Lamb 1971)

Page 10

Operational Plausibility

To understand how language operates, we need to have the linguistic information represented in such a way that it can be used for speaking and understanding

(A “competence model” that is not competence to perform is unrealistic)

Page 11

Relational network notation

Thinking in cognitive linguistics was facilitated by relational network notation

Developed under the influence of the notation used by Halliday for systemic networks

Earlier steps leading to relational network notation appear in papers written in 1963

Page 12

More on the early days

In the 1960s the linguistic system was viewed (by Hockett, Gleason, me, and others) as containing items (of unspecified nature) together with their interrelationships
• Cf. Hockett’s “Linguistic units and their relations” (Language, 1966)

Early primitive notations showed units with connecting lines to related units

Page 13

The next step: Nodes

The next step was to introduce nodes to go along with such connecting lines

Allowed the formation of networks – systems consisting of nodes and their interconnecting lines

Halliday’s notation (which I first saw in 1964) used different nodes for paradigmatic (‘or’) and syntagmatic (‘and’) relationships
• Just what I was looking for

Page 14

From systemic networks to relational networks: Three notational adaptations

1. Rotate 90 degrees, so that
• upwards would be toward meaning (at the theoretical top) and
• downwards would be toward phonetics (at the theoretical bottom)
2. Replace the brace for ‘and’ with a (more node-like appearing) triangle
3. Retaining the bracket for ‘or’, allow the connecting lines to connect at a point

Page 15

The downward OR

[Diagram: the downward OR – a single line above, branching to a and b below]

Page 16

The downward AND

[Diagram: the downward AND (triangle) – a single line above, lines to a and b below]

Page 17

The 90° Rotation: Upward and Downward

Expression (phonetic or graphic) is at the bottom

Therefore, downward is toward expression

Upward is toward meaning (or other function) – more abstract

[Diagram: the network, with meaning at the top and expression at the bottom]

Page 18

Orientation of Nodes

Downward AND and OR nodes:
• Branching on the expression side
• Multiple branches to(ward) expression

Upward AND and OR nodes:
• Branching on the content side
• Multiple branches to(ward) content

Page 19

Downward and upward branching

[Diagram: downward-branching and upward-branching AND and OR nodes, each connecting to a and b]

Page 20

The meaning of up/down:Neurological interpretation

At the bottom are the interfaces to the world outside the brain:
• Sense organs on the input side
• Muscles on the output side

‘Up’ is more abstract

Page 21

The ordered AND

We need to distinguish simultaneous from sequential

For sequential, the ‘ordered AND’: its two (or more) lines connect to different points at the bottom of the triangle (in the case of the ‘downward AND’)
• to represent sequential activation, leading to sequential occurrence of items

[Diagram: ordered AND over a and b – first a, then b]

Page 22

The downward ordered or

For the ‘or’ relation, we don’t have sequence, since only one of the two (or more) lines is activated

But an ordering feature for this node is useful to indicate precedence
• So we have precedence ordering

The line connecting to the left takes precedence
• If conditions allow for its activation to be realized, it will be chosen in preference to the other line

Page 23

The downward ordered or (original notation)

[Diagram: ordered OR (original notation) – a as the marked choice, b as the unmarked choice (a.k.a. default)]

The marked choice takes precedence: it is chosen if the conditions that constitute the marking are present

Page 24

The downward ordered or (revised notation)

[Diagram: ordered OR (revised notation) – a as the marked choice off to the side, b as the unmarked choice (a.k.a. default) going straight through]

The unmarked choice is the one that goes right through. The marked choice is off to the side – either side

Page 25

The downward ordered or (revised notation)

[Diagram: the same node mirrored – a as the unmarked choice (a.k.a. default) going straight through, b as the marked choice off to the side]

The unmarked choice is the one that goes right through. The marked choice is off to the side – either side

Page 26

Sometimes the unmarked choice has zero realization

[Diagram: only the marked choice b is shown; the unmarked choice has zero realization]

The unmarked choice is nothing. In other words, the marked choice is optional.
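
In code terms, this is simply optionality. A minimal Python sketch (my own illustration, not from the slides; the function name and flag are hypothetical):

    def realize(marked_condition_met, marked="b"):
        # Marked choice: the element 'b'; unmarked choice: zero realization
        return marked if marked_condition_met else ""

    print(repr(realize(True)))    # 'b'
    print(repr(realize(False)))   # '' – the element is simply absent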

Page 27

Operational Plausibility

To understand how language operates, we need to have the information represented in such a way that it can be directly used for speaking and understanding

Competence as competence to perform

The information in a person’s mind is “knowing how” – not “knowing that”

Information in operational form
• Able to operate without manipulation from some added “performance” system

Page 28

Relational networks: Cognitive systems that operate

Language users are able to use their languages

Such operation takes the form of activation of lines and nodes

The nodes can be defined on the basis of how they treat incoming activation

Page 29

Nodes are defined in terms of activation: The AND

[Diagram: downward ordered AND with k above and a, b below]

Downward activation from k goes to a and later to b
Upward activation from a and later from b goes to k
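
A minimal Python sketch of these activation-based definitions (my own illustration; class names such as OrderedAnd are not from the slides, and the 'choose' callback merely stands in for the surrounding network context):

    class OrderedAnd:
        """Downward ordered AND: one line above (k), ordered lines below."""
        def __init__(self, below):
            self.below = below                  # lower lines, left to right

        def activate_downward(self):
            # Sequential activation: first a, then (later) b, ...
            for line in self.below:
                yield line

    class UnorderedOr:
        """Downward unordered OR: one line above, alternative lines below."""
        def __init__(self, below):
            self.below = below

        def activate_downward(self, choose):
            # Only one alternative is realized; 'choose' stands in for the
            # context elsewhere in the network that settles the choice.
            return choose(self.below)

    boy = OrderedAnd(["b", "o", "y"])           # boy realized as /b/, /o/, /y/
    print(list(boy.activate_downward()))        # ['b', 'o', 'y']

    choice = UnorderedOr(["a", "b"])
    print(choice.activate_downward(lambda alts: alts[0]))   # 'a'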

Page 30

Nodes are defined in terms of activation

[Diagram: downward unordered OR (labels: k, p, q, a, b)]

The OR condition is not achieved locally – at the node itself – it is just a node; it has no intelligence. Usually there will be activation coming down from either p or q, but not from both

Page 31

Nodes are defined in terms of activation: The OR

[Diagram: downward unordered OR with k above and a, b below]

Upward activation from either a or b goes to k
Downward activation from k goes to a and [sic] b

Page 32

Nodes are defined in terms of activation

[Diagram: downward unordered OR (labels: k, p, q, a, b)]

The OR condition is not achieved locally – at the node itself – it is just a node; it has no intelligence. Usually there will be activation coming down from either p or q, but not from both

Page 33

The Ordered AND: Upward Activation

Activation moving upward from below

Page 34

The Ordered AND: Downward Activation

Activation coming downward from above

Page 35

Downward Activation

[Diagram: downward activation through the upward and downward AND and OR nodes]

Page 36

Upward Activation

[Diagram: upward activation through the upward and downward AND and OR nodes]

Page 37

Upward activation through the or

The or operates as either-or for activation going from the plural side to the singular side

For activation from the singular side to the plural side it acts locally as both-and, but in the context of other nodes the end result is usually either-or

Page 38

Upward activation through the or

[Diagram: the form bill connected upward to BILL1 and BILL2]

Usually the context allows only one interpretation, as in I’ll send you a bill for it

Page 39

Upward activation through the or

[Diagram: the form bill connected upward to BILL1 and BILL2]

But if the context allows both to get through, we have a pun:

A duck goes into a pub and orders a drink and says, “Put it on my bill.”
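
A toy Python sketch of this (my own illustration, with made-up support numbers): the form activates both senses (both-and); context then thresholds them, and a pun is just the case where both survive:

    def interpret(meanings, context_support, threshold=0.5):
        # meanings: senses activated (both-and) by the form
        # context_support: how strongly the context supports each sense
        return [m for m in meanings if context_support.get(m, 0.0) >= threshold]

    senses = ["BILL1 (invoice)", "BILL2 (beak)"]

    # "I'll send you a bill for it": context lets only one sense through
    print(interpret(senses, {"BILL1 (invoice)": 0.9, "BILL2 (beak)": 0.1}))

    # The duck's "Put it on my bill": both senses get through -- a pun
    print(interpret(senses, {"BILL1 (invoice)": 0.8, "BILL2 (beak)": 0.8}))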

Page 40

Zhong Guo: Shadow Meaning

[Diagram: zhong guo connected upward to CHINA, with CENTRAL and KINGDOM as shadow meanings]

Page 41

The ordered OR: How does it work?

[Diagram: ordered OR – the ordered line is taken if possible; the other line is the default]

Node-internal structure (not shown in abstract notation) is required to control this operation

Page 42

Topics

• Aims of SFL and NCL
• From systemic networks to relational networks
• Relational networks as purely relational
• Levels of precision in description
• Narrow relational network notation
• Narrow relational networks and neural networks
• Enhanced understanding of systemic-functional choice
• Enhanced appreciation of variability in language

Page 43

A purely relational network

After making these adaptations to systemic network notation, resulting in relational network notation (abstract form), it became apparent (one afternoon in the fall of 1964) that relational networks) need not contain any items at all

The entire structure could be represented in the nodes and their interconnecting lines

Page 44

Morpheme as item and its phonemic representation

boy
b - o - y

Symbols? Objects?

Page 45

Relationship of boy to its phonemes

boy – as a morpheme, it is just one unit
b o y – three phonemes, in sequence

Page 46

The nature of this “morphemic unit”

[Diagram: the node labeled boy – the object we are considering – with BOY and Noun above and b, o, y below]

Page 47

The morpheme as purely relational

[Diagram: the same node with BOY and Noun above and b, o, y below, its label removed]

We can remove the symbol with no loss of information. Therefore, it is a connection, not an object

Page 48

Another way of looking at it

[Diagram: the boy node again, with BOY and Noun above and b, o, y below, label boy shown]

Page 49

Another way of looking at it

[Diagram: the same structure without the label]

Page 50

A closer look at the segments

[Diagram: the segments b, o, y of boy, each connected downward to its phonological features; b is shared with Bob, y with toy]

The phonological segments also are just locations in the network – not objects

Page 51

boy as label (not part of the structure)

[Diagram: the boy node with BOY and Noun above and b, o, y below]

boy – just a label, to make the diagram easier to read

Page 52

Objection I

If there are no symbols, how does the system distinguish this morpheme from others?

Answer: Other morphemes necessarily have different connections

Another node with the same connections would be another (redundant) representation of the same morpheme

Page 53

Objection II

If there are no symbols, how does the system know which morpheme it is?

Answer: If there were symbols, what would read them? Miniature eyes inside the brain?

Page 54

Relations all the way

Perhaps all of linguistic structure is relational

It’s not relationships among linguistic items; it is relations to other relations to other relations, all the way to the top – at one end – and to the bottom – at the other

In that case the linguistic system is a network of interconnected nodes

Page 55

Objects in the mind?

When the relationships are fully identified, the objects as such disappear, as they have no existence apart from those relationships

“The postulation of objects as something different from the terms of relationships is a superfluous axiom and consequently a metaphysical hypothesis from which linguistic science will have to be freed.”

Louis Hjelmslev (1943/61)

Page 56

Compare SF Networks – nodes and lines, plus symbols

SF networks have and and or nodes
They also have symbols for linguistic items
• E.g., polarity, positive, negative
And symbols for relationships/operations:

Symbol   Meaning          Example
+        insertion        + x
/        conflation       X / Y
·        expansion        X (P · Q)
^        ordering         X ^ Z
:        preselection     : w
::       classification   :: z
=        lexification     = t

Page 57

Syntax is also purely relational: Example: The Actor-Goal Construction

[Diagram: the Actor-Goal construction – CLAUSE and DO-SMTHG (material process, type 2) over Vt and Nom, with labels for semantic function, syntactic function, and variable expression]

Page 58

Syntax is also purely relational: Linked constructions

[Diagram: linked constructions – CL and TOPIC-COMMENT linked to DO-SMTHG (material process, type 2), over Vt and Nom]

Page 59

Add another type of process

[Diagram: CL with two process types – DO-TO-SMTHG (Vt, Nom) and BE-SMTHG / THING-DESCR (be, Adj, Loc)]

Page 60

More of the English Clause

[Diagram: more of the English clause – CL with Subj and Pred; Predicator and FINITE (Past, Mod, Conc); DO-TO-SMTHG and BE-SMTHG with be, Vt, Vi, to, <V>-ing]

Page 61

The system of THEME

[System network for THEME SELECTION (Halliday 2004: 80): THEME SELECTION opens choices between predicator theme (unmarked in imperative) and other; adjunct theme and other; wh-theme (unmarked in WH-interrogative and exclamative) and non-wh-theme; subject theme (unmarked in declarative and yes/no interrogative) and other]

Page 62

Direct translation of Halliday’s system network

[Diagram: relational-network rendering of THEME SELECTION – PREDICATOR THEME (unmarked in imperative), ADJUNCT THEME, WH-THEME (unmarked in wh-interrogative and exclamative) vs. non-wh-theme, and SUBJECT THEME (unmarked in declarative and yes/no interrogative)]

Page 63

Theme selection in operation

This direct translation seems not to represent the way theme selection works in the cognitive system of the person forming a clause

Rather, whatever will be the theme
• the specific item, not a high-level category to which it belongs
is active at the start of the clause formation

Having been activated, it comes first, as Theme, and the rest of the clause follows, as Rheme

Page 64

(Getting ready to add Theme)

[Diagram: the clause network so far – CL with Subj and Pred; Predicator and FINITE (Past, Mod, Conc); BE-SMTHG with Vi, to, <V>-ing]

Page 65

Add Theme-Rheme

[Diagram: the same network with THEME and RHEME added to CL, plus Nom and DECLARE]

Page 66

Yes-No Questions

[Diagram: yes-no questions – Subj, Pred, Finite, VP with Perf and Prog (to, <V>-ing), and the options DECLARE and ASK]

Page 67

Yes-No Questions: Finite as Theme

[Diagram: CL with THEME and RHEME; the Finite serves as Theme, with Subj (Nom), Pred, DECLARE, and ASK]

Page 68

Circumstance in the Verb Phrase

[Diagram: Vbl Phrase consisting of VP (be, Vt, Vi; Obj) plus Circumstance]

Examples:
They did it / I saw them / He was walking
+ in the garden / a couple of days ago / while she was away

Page 69

Circumstance as Theme

[Diagram: Circumstance as Theme – THEME and RHEME over the Vbl Phrase (VP, Vi) and Circumstance]

Page 70

Conclusion: Relationships all the way… How far? What is at the bottom?

Introductory view: it is phonetics

In the system of the speaker, we have relational network structure all the way down to the points at which muscles of the speech-producing mechanism are activated
• At that interface we leave the purely relational system and send activation to a different kind of physical system

For the hearer, the bottom is the cochlea, which receives activation from the sound waves of the speech hitting the ear

Page 71

What is at the top?

Is there a place up there somewhere that constitutes an interface between a purely relational system and some different kind of structure?

Somehow at the top there must be meaning

Page 72

What are meanings?

For example, DOG

[Diagram: in the mind, the concept node DOGC is linked to the perceptual properties of dogs; in the world outside are all those dogs out there and their properties]

Page 73

How High is Up?

Downward is toward expression
Upward is toward meaning/function
Does it keep going up forever?
No – as it keeps going, it arches over, through perception
Conceptual structure is at the top

Page 74

The great cognitive arch

[Diagram: the great cognitive arch, with the “Top” at its crest]

Page 75

Topics

• Aims of SFL and NCL
• From systemic networks to relational networks
• Relational networks as purely relational
• Levels of precision in description
• Narrow relational network notation
• Narrow relational networks and neural networks
• Enhanced understanding of systemic-functional choice
• Enhanced appreciation of variability in language

Page 76

Systemic Networks vis-à-vis Relational Networks: How related?

They operate at different levels of precision

Compare chemistry and physics:
• Chemistry for molecules
• Physics for atoms

Both are valuable for their purposes

Page 77

Different levels of investigation: Living Beings

Systems Biology
Cellular Biology
Molecular Biology
Chemistry
Physics

Page 78

Levels of Precision

Advantages of description at a level of greater precision:
• Greater precision
• Shows relationships to other areas

Disadvantages of description at a level of greater precision:
• More difficult to accomplish – therefore, can’t cover as much ground
• More difficult for the consumer to grasp – too many trees, not enough forest

Page 79

Three Levels of precision for language

Systemic networks
Abstract relational network notation
Narrow relational network notation (coming up)

Page 80

Topics

• Aims of SFL and NCL
• From systemic networks to relational networks
• Relational networks as purely relational
• Levels of precision in description
• Narrow relational network notation
• Narrow relational networks and neural networks
• Enhanced understanding of systemic-functional choice
• Enhanced appreciation of variability in language

Page 81

Narrow relational network notation

Developed later

Used for representing network structures in greater detail
• internal structures of the lines and nodes of the abstract notation

The original notation can be called the ‘abstract’ notation or the ‘compact’ notation

Page 82

Toward Greater Precision

• The nodes evidently have internal structures
• Otherwise, how to account for their behavior?
• We can analyze them and figure out what internal structure would make them behave as they do

Page 83

The Ordered AND: How does it know?

Activation coming downward from above: how does the AND node “know” how long to wait before sending activation down the second line?

It must have internal structure to govern this function

We use the narrow notation to model the internal structure

Page 84

Internal Structure – Narrow Network Notation

As each line is bidirectional, it can be analyzed into a pair of one-way lines

Likewise, the simple nodes can be analyzed as pairs of one-way nodes
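
A minimal sketch of this analysis (my own representation; the function name and edge tuples are illustrative, not the notation itself):

    # A bidirectional abstract-notation line becomes a pair of one-way
    # narrow-notation lines, one in each direction.
    def expand_bidirectional(edges):
        directed = []
        for a, b in edges:
            directed += [(a, b), (b, a)]       # one line each way
        return directed

    print(expand_bidirectional([("eat", "apple")]))
    # [('eat', 'apple'), ('apple', 'eat')]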

Page 85

Abstract and narrow notation

Abstract notation – also known as compact notation

The two notations are like different scales for making a map

Narrow notation shows greater detail and greater precision

Narrow notation ought to be closer to the actual neural structures

www.ruf.rice.edu/~lngbrain/shipman

Page 86

Narrow and abstract network notation

Narrow notation:
• Closer to neurological structure
• Nodes represent cortical columns
• Links represent neural fibers (or bundles of fibers)
• Uni-directional

Abstract notation:
• Nodes show type of relationship (OR, AND)
• Easier for representing linguistic relationships
• Bidirectional
• Not as close to neurological structure

[Diagram: the construction eat apple shown in both notations]

Page 87

More on the two network notations

The lines and nodes of the abstract notation represent abbreviations – hence the designation ‘abstract’

Compare the representation of a divided highway on a highway map:
• In a more compact notation it is shown as a single line
• In a narrow notation it is shown as two parallel lines of opposite direction

Page 88

Two different network notations

[Diagram: the same a/b structure in narrow notation – separate upward and downward one-way lines – and in abstract notation as a single bidirectional network]

Page 89

Downward Nodes: Internal Structure

[Diagram: internal structure of the downward AND and OR nodes (thresholds 2 and 1)]

Page 90

Upward Nodes: Internal Structure

[Diagram: internal structure of the upward AND and OR nodes (thresholds 2 and 1)]

Page 91

Downward AND, upward direction

[Diagram: the ‘Wait’ element W, with threshold 2]

Page 92

AND vs. OR

In one direction their internal structures are the same

In the other, the difference is one of threshold: a high or low threshold for the high or low degree of activation required to cross

Page 93

Thresholds in Narrow Notation

[Diagram: a scale of thresholds 1 2 3 4, with OR at one end and AND at the other]

– You no longer need a basic distinction AND vs. OR
– You can have intermediate degrees, between AND and OR
– The AND/OR distinction was a simplification anyway – it doesn’t always work!
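
A small sketch of the point (illustrative Python; the input values are my own): one node type with a graded threshold covers OR, AND, and everything in between:

    def fires(incoming, threshold):
        """A node fires iff total incoming activation reaches its threshold."""
        return sum(incoming) >= threshold

    inputs = [1, 1, 1, 0]            # three of four incoming lines active

    print(fires(inputs, 1))          # threshold 1: OR-like  -> True
    print(fires(inputs, 4))          # threshold 4: AND-like -> False
    print(fires(inputs, 3))          # intermediate degree   -> True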

Page 94

The ‘Wait’ Element

[Diagram: downward AND, downward direction – W keeps the activation alive]

Activation continues to B after A has been activated
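
A toy discrete-time rendering of this (my own sketch, not the internal wiring of the notation): W holds the incoming activation so that B is activated only after A has finished:

    def ordered_and_downward(duration_of_a=2):
        # W receives activation at t=0 and keeps it alive while A runs
        for t in range(duration_of_a):
            print(f"t={t}: A active; W keeps the activation alive")
        # When A finishes, W releases the stored activation to B
        print(f"t={duration_of_a}: A done; W sends activation to B")

    ordered_and_downward()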

Page 95

Structure of the ‘Wait’ Element

[Diagram: internal structure of the ‘Wait’ element W (labels: 1, 2)]

www.ruf.rice.edu/~lngbrain/neel

Page 96

Node Types in Narrow Notation

T-junction
Branching
Blocking

Page 97

Two Types of Connection

Excitatory
Inhibitory
• Type 1
• Type 2

Page 98

Types of inhibitory connection

Type 1 – connects to a node

Type 2 – connects to a line
• Used for blocking a default realization
• For example, from the node for second there is a blocking connection to the line leading to two

Page 99

Type 2 – Connects to a line

[Diagram: TWO and ORDINAL joined by an AND; from the node for second, a blocking connection to the line leading to the default realization two + -th]
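
In code terms (a hypothetical sketch; the dictionary and function names are mine), the blocking connection acts like an exception that preempts the default line:

    # The node for an irregular form blocks the line to the default
    # realization, as 'second' blocks 'two' + '-th' above.
    IRREGULAR_ORDINALS = {"two": "second"}      # blocking connections

    def realize_ordinal(number):
        if number in IRREGULAR_ORDINALS:
            return IRREGULAR_ORDINALS[number]   # blocked: marked form wins
        return number + "th"                    # default: number + '-th'

    print(realize_ordinal("four"))   # fourth (default realization)
    print(realize_ordinal("two"))    # second (default blocked)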

Page 100

Additional details of structure can be shown in narrow notation

Connections between upward and downward directions
Varying degrees of connection strength
Variation in threshold strength
Contrast

Page 101

The two Directions

[Diagram: the two directions of a structure shown side by side, each with a ‘Wait’ element (w)]

Page 102

The Two Directions

[Diagram: the two directions, as on the previous slide]

Two questions:
1. Are they really next to each other?
2. How do they “communicate” with each other?

Page 103

Separate but in touch

[Diagram: the downward and upward networks, separate but in touch]

In phonology, we know from aphasiology and neuroscience that they are in different parts of the cerebral cortex

Page 104

Phonological nodes in the cortex

[Diagram: downward phonological nodes in the frontal lobe and upward phonological nodes in the temporal lobe, connected by the arcuate fasciculus]

Page 105

Topics

• Aims of SFL and NCL
• From systemic networks to relational networks
• Relational networks as purely relational
• Levels of precision in description
• Narrow relational network notation
• Narrow relational networks and neural networks
• Enhanced understanding of systemic-functional choice
• Enhanced appreciation of variability in language

Page 106

Another level of precision

Systemic networks
Abstract relational network notation
Narrow relational network notation
Cortical columns and neural fibers
Neurons, axons, dendrites, neurotransmitters

Page 107

Narrow RN notation as a set of hypotheses

Question: Are relational networks related in any way to neural networks?

We can find out: narrow RN notation can be viewed as a set of hypotheses about brain structure and function
• Every property of narrow RN notation can be tested for neurological plausibility

Page 108

Some properties of narrow RN notation

Property of narrow RN notation → neurological counterpart:

• Lines have direction (they are one-way) → Nerve fibers carry activation in just one direction
• Lines tend to come in pairs of opposite direction (“upward” and “downward”) → Cortico-cortical connections are generally reciprocal
• Connections are either excitatory or inhibitory → Connections are either excitatory or inhibitory (from different types of neurons, with two different neurotransmitters)

Page 109

More properties as hypotheses

• Nodes have differing thresholds of activation → Neurons have different thresholds of activation
• Inhibitory connections are of two kinds (Type 1, Type 2) → Inhibitory connections are of two kinds (Type 2: “axo-axonal”)
• Additional properties (too technical for this presentation)

All are verified

Page 110

The node of narrow RN notationvis-à-vis neural structures

The node corresponds not to a single neuron but to a bundle of neurons: the cortical column

A column consists of 70–100 neurons stacked on top of one another

All neurons within a column act together
• When a column is activated, all of its neurons are activated

Page 111

The node as a cortical column

The properties of the cortical column are approximately those described by Vernon Mountcastle

“[T]he effective unit of operation…is not the single neuron and its axon, but bundles or groups of cells and their axons with similar functional properties and anatomical connections.”

Vernon Mountcastle, Perceptual Neuroscience (1998), p. 192

Page 112

Three views of the gray matter

Different stains show different features

Nissl stain shows cell bodies of pyramidal neurons

Page 113

The Cerebral Cortex

Grey matter
• Columns of neurons

White matter
• Inter-column connections

Page 114

Microelectrode penetrations in the paw area of a cat’s cortex

Page 115

The (Mini)Column

Width is about (or just larger than) the diameter of a single pyramidal cell
• About 30–50 μm in diameter

Extends through the six cortical layers
• Three to six mm in length
• The entire thickness of the cortex is accounted for by the columns

Roughly cylindrical in shape

If expanded by a factor of 100, the dimensions would correspond to a tube with a diameter of 1/8 inch and a length of one foot

Page 116

Cortical column structure

Minicolumn: 30–50 microns diameter

Recurrent axon collaterals of pyramidal neurons activate other neurons in the same column

Inhibitory neurons can inhibit neurons of neighboring columns
• Function: contrast

Excitatory connections can activate neighboring columns
• In this case we get a bundle of contiguous columns acting as a unit

Page 117

Levels of precision

Systemic networks
Abstract relational network notation
Narrow relational network notation
Cortical columns and neural fibers
Neurons, axons, dendrites, neurotransmitters
Intraneural structures
• Pre-/post-synaptic terminals
• Microtubules
• Ion channels
• Etc.

Page 118

Levels of precision

Informal functional descriptions
Semi-formal functional descriptions
Systemic networks
Abstract relational network notation
Narrow relational network notation
Cortical columns and neural fibers
Neurons, axons, dendrites
Intraneural structures and processes

Page 119

Topics

• Aims of SFL and NCL
• From systemic networks to relational networks
• Relational networks as purely relational
• Levels of precision in description
• Narrow relational network notation
• Narrow relational networks and neural networks
• Enhanced understanding of systemic-functional choice
• Enhanced appreciation of variability in language

Page 120

Competition vis-à-vis Halliday’s systems

Halliday (not an exact quote): Putting the emphasis on systems gives recognition to the importance of Saussure's principle that everything meaningful has meaning in contrast to what could have been selected instead

Page 121

Paradigmatic contrast: Competition

[Diagram: competing nodes a and b, each with threshold 2]

For example, /p/ vs. /k/
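
A toy winner-take-all sketch (illustrative Python; the inhibition strength and starting activations are my own numbers) of competition between two nodes such as /p/ and /k/:

    def compete(a, b, inhibition=0.5, steps=10):
        # Each node's activation is reduced by the other's (mutual
        # inhibition); the better-supported node drives its rival to zero.
        for _ in range(steps):
            a, b = (max(0.0, a - inhibition * b),
                    max(0.0, b - inhibition * a))
        return a, b

    print(compete(a=1.0, b=0.6))    # the node with more support wins; b -> 0.0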

Page 122

Simplified model of minicolumn II: Inhibition of competitors

[Diagram: cortical layers II/III, IV, V, VI, with input from the thalamus and from other cortical locations, and output to cells in neighboring columns; cell types: pyramidal, spiny stellate, inhibitory]

Page 123

Local and distal connections

[Diagram: local and distal connections, excitatory and inhibitory]

Page 124

Paradigmatic contrast: Competition

[Diagram: competing nodes a and b, with mutual inhibitory connections]

Page 125

Paradigmatic contrast: Competition

[Diagram: competing nodes a and b, each with threshold 2 and mutual inhibitory connections]

Page 126

Competition vis-à-vis Halliday’s systems

Halliday (not an exact quote): Putting the emphasis on systems gives recognition to the importance of Saussure's principle that everything meaningful has meaning in contrast to what could have been selected instead

Page 127

Topics

• Aims of SFL and NCL
• From systemic networks to relational networks
• Relational networks as purely relational
• Levels of precision in description
• Narrow relational network notation
• Narrow relational networks and neural networks
• Enhanced understanding of systemic-functional choice
• Enhanced appreciation of variability in language

Page 128

Precision vis-à-vis variability

Description at a level of greater precision encourages observation of variability

At the level of the forest, we are aware of the trees, but we tend to overlook the differences among them

At the level of the trees we clearly see the differences among them

But describing the forest at the level of detail used in describing trees would be very cumbersome

At the level of the trees we tend to overlook the differences among the leaves

At the level of the leaves we tend to overlook the differences among their component cells

Page 129

Linguistic examples

At the cognitive level we clearly see that every person’s linguistic system is different from that of everyone else

We also see variation within the single person’s system from day to day

At the level of narrow notation we can treat:
• Variation in connection strengths
• Variation in threshold strength
• Variation in levels of activation

We are thus able to explain:
• prototypicality phenomena
• learning
• etc.

Page 130

Variation in Connection Strength

Connections get stronger with use
• Every time the linguistic system is used, it changes

Can be indicated roughly by:
• Thickness of connecting lines in diagrams, or by
• Little numbers written next to lines
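
A minimal sketch of use-based strengthening (my own learning-rate and ceiling values, purely illustrative):

    def strengthen(strength, rate=0.1, ceiling=1.0):
        # Each use moves the connection strength a step toward its ceiling
        return min(ceiling, strength + rate * (ceiling - strength))

    s = 0.2
    for use in range(1, 6):
        s = strengthen(s)
        print(f"after use {use}: strength = {s:.3f}")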

Page 131

Variation in threshold strength

Thresholds are not fixed
• They vary as a result of use – learning

Nor are they integral

What we really have are threshold functions, such that:
• A weak amount of incoming activation produces no response
• A larger degree of activation results in weak outgoing activation
• A still higher degree of activation yields strong outgoing activation
• S-shaped (“sigmoid”) function

N.B. All of these properties are found in neural structures

Page 132

Threshold function

[Plot: the threshold function – outgoing activation as an S-shaped function of incoming activation]
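
The function can be sketched as a logistic curve (illustrative Python; the midpoint and slope values are my own choices):

    import math

    def threshold_function(incoming, midpoint=1.0, slope=6.0):
        # S-shaped ("sigmoid"): weak input -> ~0, strong input -> ~1
        return 1.0 / (1.0 + math.exp(-slope * (incoming - midpoint)))

    for x in (0.0, 0.5, 1.0, 1.5, 2.0):
        print(f"incoming = {x:.1f}  outgoing = {threshold_function(x):.3f}")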

Page 133

Topics in this presentation

• Aims of SFL and NCL
• From systemic networks to relational networks
• Relational networks as purely relational
• Levels of precision in description
• Narrow relational network notation
• Narrow relational networks and neural networks
• Enhanced understanding of systemic-functional choice
• Enhanced appreciation of variability in language

Page 134

Thank you for your attention!

Page 135

References

Halliday, M.A.K., 1994/2003. Appendix: Systemic theory. In On Language and Linguistics (Vol. 3 of the Collected Works of M.A.K. Halliday, ed. Jonathan Webster). London: Continuum.
Halliday, M.A.K., 2009. Methods – techniques – problems. In M.A.K. Halliday & Jonathan Webster (eds.), Continuum Companion to Systemic Functional Linguistics. London: Continuum.
Hockett, Charles F., 1966. Linguistic units and their relations. Language.
Lamb, Sydney, 1971. The crooked path of progress in cognitive linguistics. Georgetown Roundtable.
Lamb, Sydney M., 1999. Pathways of the Brain: The Neurocognitive Basis of Language. Amsterdam: John Benjamins.
Lamb, Sydney M., 2004a. Language as a network of relationships. In Jonathan Webster (ed.), Language and Reality (Selected Writings of Sydney Lamb). London: Continuum.
Lamb, Sydney M., 2004b. Learning syntax: a neurocognitive approach. In Jonathan Webster (ed.), Language and Reality (Selected Writings of Sydney Lamb). London: Continuum.
Mountcastle, Vernon W., 1998. Perceptual Neuroscience: The Cerebral Cortex. Cambridge: Harvard University Press.

Page 136

For further information…

www.rice.edu/langbrain