20060701 Knight Linguistic Coding


  • Slide 1/17

    The Linguistic Coding of verbal and non-verbal backchannels: A Preliminary Approach

  • Slide 2/17

    Initial Linguistic Codes- Recap:

    Continuers- Maintaining the flow of discourse

    Convergence Tokens- Marking agreement and disagreement

    Engaged Response Token- High level of engagement, with the participant responding on an affective level to the interlocutor

    Information Receipt Token- Marking points of the conversation where adequate information has been received
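
    Purely as an illustration (not part of the original coding scheme's tooling), a minimal Python sketch of how these four codes might be represented for annotation work; all names are assumptions.

```python
from enum import Enum

class BackchannelFunction(Enum):
    """The four initial linguistic codes recapped above (names are illustrative)."""
    CONTINUER = "continuer"                     # maintains the flow of discourse
    CONVERGENCE_TOKEN = "convergence"           # marks agreement or disagreement
    ENGAGED_RESPONSE_TOKEN = "engaged"          # affective, highly engaged response
    INFORMATION_RECEIPT_TOKEN = "info_receipt"  # adequate information has been received
```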

  • Slide 3/17

    Limitations & Future Requirements:

    A key concern of this project is to explore how, in conversation, verbal and visual realisations of backchannels interact within and across such categories (and beyond!).

    However, in its present form, such a coding scheme is only suited to audio data and spoken linguistic backchannel forms; its definitions provide no means of defining and encoding the non-verbal backchannels seen in the image data, or of integrating that information with the verbal data.

    Therefore we need to develop a more all-encompassing coding scheme which can be used for both verbal and non-verbal forms.
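
    A hypothetical sketch of what a combined record for a single backchannel event could look like once verbal and non-verbal information are encoded together; the field names are assumptions, not the project's actual scheme.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class BackchannelEvent:
    """One backchannel, combining verbal and non-verbal information (illustrative only)."""
    speaker: str                # e.g. "supervisor" or "supervisee"
    verbal_form: Optional[str]  # e.g. "yeah", "mmm", or None if purely non-verbal
    function: Optional[str]     # e.g. "continuer", one of the four codes above, if assigned
    head_nod: bool              # was a head nod observed in the image data?
    start_time: float           # onset in seconds
    end_time: float             # offset in seconds

    @property
    def duration(self) -> float:
        return self.end_time - self.start_time
```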

  • Slide 4/17

    Linguistic & Methodological Approach:

    PRAAT is a computer program that enables you to analyse, synthesise, and manipulate speech, so it can be used to explore the phonetic patterns of backchannels.

    Developed by researchers at the Institute of Phonetic Sciences, based at the University of Amsterdam

    It is free to download online and available for general use

    Note on technical restrictions in using our data in PRAAT
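
    The measurements reported in the following slides were taken in PRAAT itself; purely as an illustration, here is a minimal sketch of how duration, mean pitch and mean intensity could be extracted programmatically via Parselmouth, a Python interface to Praat (not a tool used in this project; the filename is a placeholder).

```python
import parselmouth
from parselmouth.praat import call

# Assumes each backchannel has been clipped to its own WAV file (placeholder name).
snd = parselmouth.Sound("backchannel_clip.wav")

duration = snd.get_total_duration()                           # length in seconds
pitch = snd.to_pitch()
mean_pitch = call(pitch, "Get mean", 0, 0, "Hertz")           # mean F0 over the clip (Hz)
intensity = snd.to_intensity()
mean_intensity = call(intensity, "Get mean", 0, 0, "energy")  # mean intensity (dB)

print(f"{duration:.2f} s, {mean_pitch:.1f} Hz, {mean_intensity:.1f} dB")
```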

  • Slide 5/17

    Methodological Approach:

    Aim: to explore any potential relationships between the actual existence of non-verbal (i.e. head nods) and verbal backchannels, as well as the duration, pitch and intensity of these.

    We have focused specifically upon our training data

    We have explored whether there are (in)consistencies:

    Across all instances- with nods, without nods

    Between those occurring with nods and those occurring without head nods

    Within the groups of yeah and mmm (the most frequent backchannels) specifically
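
    As an illustration of the kind of group comparison described above (not the project's actual analysis), a short sketch that summarises the measurements by nod/no-nod group; the record format is assumed.

```python
from statistics import mean

# Hypothetical records: (form, co-occurring head nod?, duration s, pitch Hz, intensity dB)
measurements = [
    ("yeah", True, 0.31, 212.0, 64.2),
    ("yeah", False, 0.27, 198.5, 61.8),
    ("mmm", True, 0.45, 180.3, 58.9),
    # ... one tuple per sampled backchannel
]

def summarise(with_nod: bool) -> dict:
    group = [m for m in measurements if m[1] == with_nod]
    return {
        "n": len(group),
        "mean_duration_s": mean(m[2] for m in group),
        "mean_pitch_hz": mean(m[3] for m in group),
        "mean_intensity_db": mean(m[4] for m in group),
    }

print("with nod:", summarise(True))
print("no nod:  ", summarise(False))
```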

  • Slide 6/17

    Sample information:

    Sample size = 100 verbal backchannels, from 275 total

    50 co-occur with head nods (43 female, 7 male)

    50 occur without head nods (41 female, 9 male)

    84 are spoken by the female supervisor

    16 are spoken by the male supervisee

    We are aware that we need to take into account the fact that the results will obviously vary according to who actually speaks, as the phonetic characteristics of the male student will differ from those of the female supervisor.

  • Slide 7/17

    Sample Data- Backchannel Length (secs):

    [Chart: backchannel length in seconds (0-3) by backchannel number (1-49); series: with nod vs. no nod]
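
    As an illustration only, a chart like the one above could be reproduced from the extracted measurements along these lines (the data and tuple layout are assumptions).

```python
import matplotlib.pyplot as plt

# Hypothetical (backchannel number, length in seconds, co-occurring nod?) triples.
lengths = [(1, 0.31, True), (2, 0.27, False), (3, 0.45, True)]  # ... one per backchannel

with_nod = [(n, l) for n, l, nod in lengths if nod]
no_nod = [(n, l) for n, l, nod in lengths if not nod]

plt.scatter(*zip(*with_nod), label="with nod")
plt.scatter(*zip(*no_nod), label="no nod")
plt.xlabel("Backchannel Number")
plt.ylabel("Length (seconds)")
plt.legend()
plt.show()
```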

  • Slide 8/17

    Sample Data Results- Length in detail (secs):

    [Two charts, male data and female data: backchannel length in seconds by backchannel number; series: with nod vs. no nod]

  • Slide 9/17

    Sample Data Results- Pitch (Hz) & Intensity (dB)

    [Two charts: pitch in Hz (0-400) and intensity in dB (0-80) by backchannel number (1-49); series: with nod vs. no nod]

  • Slide 10/17

    Sample Data Results- Yeah in detail:

    - Focus on FUNCTION, linking back to the initial linguistic codes

    - 31 * yeah in the sample, 16 with nods, 15 with no nods

    - 6 male (3 + nods, 3 + no nods), 25 female (13, 12)

    - 101 in total, making it the second most frequent backchannel (mmm = 102 occurrences)

    - Similar investigations have been conducted across each of the backchannel forms

    - Extensions and future plans
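
    Purely as an illustration of how such breakdowns can be tallied from the annotated sample (the record format is an assumption).

```python
from collections import Counter

# Hypothetical annotations: (form, speaker, co-occurring head nod?)
annotations = [
    ("yeah", "female", True),
    ("yeah", "male", False),
    ("mmm", "female", True),
    # ... one tuple per backchannel in the sample
]

yeahs = [a for a in annotations if a[0] == "yeah"]
print("yeah total:", len(yeahs))
print("with / without nods:", Counter(nod for _, _, nod in yeahs))
print("by speaker:", Counter(speaker for _, speaker, _ in yeahs))
```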

  • Slide 11/17

    Sample Data Results- Yeah Focus:

    [Three charts for the yeah tokens (backchannel numbers 1-16): intensity in dB (0-80), length in seconds (0-1.4) and pitch in Hz (0-400); series: with nod vs. no nod]

  • Slide 12/17

    Sample Data Results- Yeah Function Results

    [Three charts for the yeah tokens (backchannel numbers 1-18): intensity in dB (0-80), length in seconds (0-1.4) and pitch in Hz (0-400); series: Continuer vs. Convergence Token]

  • Slide 13/17

    All Sample Data- Function Focus: Length (sec)

    [Chart: length in seconds (0-3) by backchannel number (1-61); series: Continuer, Convergence Token, Engaged Response, Information Receipt]

  • Slide 14/17

    All Sample Data- Function Focus: Pitch (Hz)

    [Chart: pitch in Hz (0-400) by backchannel number (1-61); series: Continuer, Convergence Token, Engaged Response, Information Receipt]

  • Slide 15/17

    All Sample Data- Function Focus: Intensity (dB)

    [Chart: intensity in dB (0-80) by backchannel number (1-61); series: Continuer, Convergence Token, Engaged Response, Information Receipt]

  • Slide 16/17

    Future Developments:

    - To look in more detail at:

    - The actual timings of the verbal and non-verbal backchannels

    - Where they occur in discourse

    - Between what words, and after what lengths of pauses, they occur

    This will allow us to examine whether there is a link between, for example, the time when the interlocutor makes a statement and when a response is made, and the value/function of the response.
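
    As a sketch of the kind of timing analysis described above, under assumed time-aligned annotations (not the project's implementation): pairing each backchannel with the most recent preceding statement and measuring the response lag.

```python
# Hypothetical time-aligned annotations, in seconds.
statements = [(12.0, 14.3), (20.1, 23.8)]           # (start, end) of interlocutor statements
backchannels = [(14.6, 14.9, "yeah", "continuer"),  # (start, end, form, function code)
                (24.0, 24.4, "mmm", "info_receipt")]

# Pair each backchannel with the most recent statement that ended before it started,
# and report the lag between statement end and backchannel onset.
for bc_start, bc_end, form, function in backchannels:
    prior = [s for s in statements if s[1] <= bc_start]
    if prior:
        stmt_end = max(s[1] for s in prior)
        lag = bc_start - stmt_end
        print(f"{form} ({function}): responds {lag:.2f} s after the statement ends")
```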

  • Slide 17/17

    Future Developments- Data:

    Although for the basic PRAAT explorations we have used the supervision data, we have also collected multiple other forms of data for future investigation.