
Melodic Complexity. Klaus Frieler, Universität Hamburg, Musikwissenschaftliches Institut. ESCOM 2009, Jyväskylä, 12.8.2009


DESCRIPTION

Talk held at ESCOM 2009 in Jyväskylä.

TRANSCRIPT

Page 1

Melodic Complexity

Klaus Frieler, Universität Hamburg, Musikwissenschaftliches Institut

ESCOM 2009, Jyväskylä

12.8.2009

Page 2

Melodic Complexity

• Perceived complexity is a complex process generated within a signal/receiver system

• Hypothesis: Perceived complexity is a function of (objective) signal complexity

Page 3

Melodic Complexity

• Idea: Test various algorithmic melodic complexity measures in psychological experiments

• If there are significant correlations, build a model

Page 4

Melodic Complexity

Complexity algorithms for n-gram sequences:

• Entropies
• Zipf complexity
• N-gram redundancy

Page 5

Algorithm Construction

• Given: Melody as onset/pitch sequences with metrical annotations

• Apply basic transformations
• Here: pitches, intervals, durations, metrical circle map (cf. later)

Page 6

Algorithm Construction

• Main transformation: n-gram sequences, i.e. sequences of subsequences of length n

• Calculate histograms of n-grams
• Here: n = 1, 2, 3, 4, variable
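A minimal sketch of this n-gram histogram step, assuming Python and an integer-coded sequence (e.g. intervals in semitones); the function name and example values are illustrative only:

```python
from collections import Counter

def ngram_histogram(seq, n):
    """Count all contiguous n-grams (subsequences of length n) of a sequence."""
    return Counter(tuple(seq[i:i + n]) for i in range(len(seq) - n + 1))

# Hypothetical interval sequence of a short melody (semitones)
intervals = [2, 2, -4, 2, 2, -4, 5, -1]
print(ngram_histogram(intervals, 2))
```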

Page 7

Entropies

• Entropies of the n-gram distributions

• Normalized by the maximum entropy
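A minimal sketch of this measure, assuming the maximum entropy is taken as the log of the number of distinct n-grams (the slides do not spell out the exact normalization); it reuses the hypothetical ngram_histogram helper from above:

```python
import math
from collections import Counter

def normalized_entropy(hist: Counter) -> float:
    """Shannon entropy of an n-gram histogram, divided by the maximum entropy
    (assumed here: log of the number of distinct n-grams)."""
    total = sum(hist.values())
    probs = [count / total for count in hist.values()]
    h = -sum(p * math.log(p) for p in probs)
    h_max = math.log(len(hist))
    return h / h_max if h_max > 0 else 0.0
```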

Page 8

Zipf complexity

Zipf's law: The ordered sequence of term frequencies obeys a power law (k = rank):

h(k) ~ k^(−s)

log h(k) ~ −s log k

Page 9

Zipf complexity

[Figure illustrating Zipf's law omitted. Source: Wikipedia]

Page 10

Zipf complexity

• Ordered n-gram frequencies
• Regression on log-log data with slope s
• Define c := 2^s as Zipf complexity
• s = 0 ⇒ c = 1, s = −1 ⇒ c = 0.5, s = −∞ ⇒ c = 0
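A minimal sketch of this computation, assuming Python and an ordinary least-squares fit on the log-log rank-frequency data (the slides leave the regression details open):

```python
import math
from collections import Counter

def zipf_complexity(hist: Counter) -> float:
    """Fit a line to log(frequency) vs. log(rank) of the ordered n-gram
    frequencies and map the slope s to c = 2**s (flat distribution -> c = 1)."""
    freqs = sorted(hist.values(), reverse=True)
    if len(freqs) < 2:
        return 0.0  # degenerate case: a single distinct n-gram (assumption)
    xs = [math.log(rank) for rank in range(1, len(freqs) + 1)]
    ys = [math.log(f) for f in freqs]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return 2.0 ** slope
```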

Page 11

N-gram redundancy

• The number of distinct elements in a sequence is a simple measure of redundancy: the more distinct elements, the more "complex" the sequence.

Page 12

N-gram redundancy

• Let |n(s)| be the count of distinct n-grams in a sequence s of length N. Then
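presumably something like (the slide's own formula is not preserved in the transcript; this reconstruction is offered only as an assumption):

c_n(s) = |n(s)| / (N − n + 1)

i.e. the number of distinct n-grams divided by the maximum possible number of n-grams, so that c_n(s) = 1 when no n-gram repeats.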

Page 13

N-gram redundancy

• Extensions: Weighted sum of n-gram redundancies up to a fixed or variable n_max
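A minimal sketch combining the per-n redundancy (under the normalization assumed above) with this weighted-sum extension; the function names, the default equal weights, and the requirement len(seq) >= n_max are assumptions:

```python
def ngram_redundancy(seq, n):
    """Distinct n-grams divided by the maximum possible number of n-grams
    (assumed normalization; requires len(seq) >= n)."""
    grams = [tuple(seq[i:i + n]) for i in range(len(seq) - n + 1)]
    return len(set(grams)) / len(grams)

def weighted_ngram_redundancy(seq, n_max, weights=None):
    """Weighted sum of n-gram redundancies for n = 1 .. n_max (equal weights by default)."""
    weights = weights if weights is not None else [1.0 / n_max] * n_max
    return sum(w * ngram_redundancy(seq, n) for w, n in zip(weights, range(1, n_max + 1)))
```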

Page 14

Metrical Circle Map

Page 15

Example: "Mandy" by Barry Manilow

Page 16

Experiments

– Two listening experiments with a total of 47 subjects

– Stimuli: 12 folk songs and 3 jazz saxophone choruses; 9 melodies identical in both experiments

– Task: Judgement of melodic complexity on a scale from 1 to 7

Page 17

Results

• Normally distributed, reliable judgements

⇒ Pooling of data from both experiments
⇒ Using subject means for further comparisons

Page 18

Results

42 complexity measures:
– Note count
– Metrical Markov entropies (0th, 1st order)
– Zipf complexities (int, pitch, dur)
– N-gram redundancies (int, pitch, dur)
– Entropies (int, pitch, dur)

Page 19

Correlations

Note count: r = .869**

Page 20

Results

• Note count explains the judgements nearly perfectly!

• Calculate partial correlations for other measures ⇒ Only metrical entropies left

Page 21

Correlations

0th order metrical entropy

r = .934**, r′ = .837**

Page 22

Correlations

1st order metrical entropy

r = .944**, r′ = .867**

Page 23

Correlations

Pitch entropy

r = .132, r′ = .092

Page 24

Linear Regression

• Stepwise regression of variables with highest correlation

• Corrected R² = .929

z_subjmean = .345 · z_notecount + .677 · z_meter1ent
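As a minimal illustration of applying this model, assuming both predictors and the outcome are z-standardized (as the z-notation suggests); the function name is hypothetical:

```python
def predicted_complexity(z_note_count: float, z_meter1_entropy: float) -> float:
    """Predicted (standardized) complexity judgement from the reported regression."""
    return 0.345 * z_note_count + 0.677 * z_meter1_entropy
```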

Page 25

Conclusion

• Good agreement with measured complexity

• 1st-order metrical Markov entropy shows the highest correlation

• But: note count explains most of these correlations

⇒ Rather simple complexity?!

Page 26

Conclusion

• No partial correlation with any pitch- or interval-based measure could be found

• Meter is the most important dimension…

Page 27

Outlook

• We plan experiments with note counts kept constant

• Pretests show that metrical entropies might be suited to predicting the "hit potential" of pop melodies

Page 28

Thank you!

Page 29

Metrical Intervals

Page 30

Metrical Markov chain, 0th order

Page 31

Metrical Markov chains, 1st order